Handoff Operator

Overview

The Handoff operator acts as an intelligent router for conversations. It uses a configured Language Model (LLM) to analyze the ongoing discussion (provided via the input_table) and decide which specialized Agent operator, defined in its Agents sequence, is best suited to handle the next step.

This allows for dynamic and sophisticated workflows where different AI assistants can contribute based on their specific expertise or assigned roles. When handoff is disabled, the operator simply passes the conversation to a manually selected agent.
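The routing idea can be pictured with a small simulation. This is an illustrative sketch, not the operator's internal implementation: the router hands the last message and the agent descriptions to an LLM (stubbed here with a keyword function) and expects one agent name back.

```python
# Sketch of LLM-driven handoff routing (a simulation, not the
# operator's actual code): pick one agent name from the sequence
# based on the last message and the agents' descriptions.

def route(last_message, agents, pick_agent):
    """Return the name of the agent best suited for last_message.

    `pick_agent` stands in for the LLM call: it receives the message
    plus the agent descriptions and returns one agent name.
    """
    choice = pick_agent(last_message, agents)
    if choice not in agents:
        # Fall back to the first agent if the model answers off-list.
        choice = next(iter(agents))
    return choice

# Hypothetical agent sequence, keyed by each block's Rename value.
agents = {
    "Creative Writer": "Writes stories, poems, and marketing copy.",
    "Support Bot": "Answers technical and troubleshooting questions.",
}

# Stand-in "LLM" that routes on a simple keyword.
def keyword_llm(message, agents):
    return "Support Bot" if "error" in message.lower() else "Creative Writer"

print(route("I get an error when I open the file", agents, keyword_llm))
# Support Bot
```

In the real operator the decision is made via the LLM's function-calling / tool-use mechanism rather than a keyword match, which is why the Technical Notes require a function-calling-capable model.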
Parameters

Parameters are organized by page.

Handoff Page

op('handoff').par.Call
Pulse - Default: None

op('handoff').par.Enablehandoff
Toggle - Default: On - Options: Off, On

op('handoff').par.Active
Toggle - Default: None - Options: Off, On

op('handoff').par.Status
Str - Default: None

op('handoff').par.Reason
Str - Default: None

op('handoff').par.Agents
Sequence - Default: None
Model Page

This page configures the LLM used by the Handoff operator itself to make routing decisions when Enable Handoff is active. These settings do not affect the models used by the individual Agent operators defined in the sequence.
Understanding Model Selection

Operators utilizing LLMs (LOPs) offer flexible ways to configure the AI model used:

- ChatTD Model (Default): By default, LOPs inherit model settings (API Server and Model) from the central ChatTD component. You can configure ChatTD via the "Controls" section in the Operator Create Dialog or its parameter page.
- Custom Model: Select this option in "Use Model From" to override the ChatTD settings and specify the API Server and AI Model directly within this operator.
- Controller Model: Choose this to have the LOP inherit its API Server and AI Model parameters from another operator (such as a different Agent or any LOP with model parameters) specified in the Controller [ Model ] parameter. This allows model control to be centralized.
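The three source modes above can be sketched as a simple resolution function. The mode token names and settings tuples here are assumptions for illustration, not the operator's actual internals:

```python
# Sketch of "Use Model From" resolution (illustrative; the mode
# names "chattd_model", "custom_model", "controller_model" are assumed).

def resolve_model(mode, chattd, custom, controller=None):
    """Return the (api_server, model) pair for the given source mode."""
    if mode == "custom_model":
        return custom        # settings local to this operator
    if mode == "controller_model" and controller is not None:
        return controller    # inherited from the operator in Controller [ Model ]
    return chattd            # default: central ChatTD settings

chattd = ("openai", "gpt-4o")
custom = ("anthropic", "claude-3-opus")

print(resolve_model("chattd_model", chattd, custom))
print(resolve_model("custom_model", chattd, custom))
```

Falling back to the ChatTD settings when no controller is given mirrors the default behavior described above.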
The Search toggle filters the AI Model dropdown based on keywords entered in Model Search. The Show Model Info toggle (if available) displays detailed information about the selected model directly in the operator's viewer, including cost and token limits.
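The Search / Model Search pair can be pictured as a case-insensitive keyword filter over the model list. This is a sketch of the idea only; the operator's actual matching rules are not documented here:

```python
# Sketch of filtering a model dropdown by the Model Search pattern
# (assumed behavior: every whitespace-separated term must match).

def filter_models(models, pattern):
    terms = pattern.lower().split()
    return [m for m in models if all(t in m.lower() for t in terms)]

models = ["gpt-4o", "gpt-4o-mini", "claude-3-opus", "gemini-1.5-pro"]
print(filter_models(models, "gpt"))       # ['gpt-4o', 'gpt-4o-mini']
print(filter_models(models, "claude 3"))  # ['claude-3-opus']
```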
Available LLM Models + Providers Resources
The following links point to API key pages or documentation for the supported providers. For a complete and up-to-date list, see the LiteLLM provider docs.
op('handoff').par.Maxtokens
Int - The maximum number of tokens the model should generate. Default: 4096

op('handoff').par.Temperature
Float - Controls randomness in the response; lower values are more deterministic. Default: 0.7

op('handoff').par.Modelcontroller
OP - Operator providing model settings when "Use Model From" is set to controller_model. Default: None

op('handoff').par.Search
Toggle - Enable dynamic model search based on a pattern. Default: off - Options: off, on

op('handoff').par.Modelsearch
Str - Pattern to filter models when Search is enabled. Default: "" (empty string)
About Page

op('handoff').par.Bypass
Toggle - Default: Off - Options: Off, On

op('handoff').par.Showbuiltin
Toggle - Default: Off - Options: Off, On

op('handoff').par.Version
Str - Default: None

op('handoff').par.Lastupdated
Str - Default: None

op('handoff').par.Creator
Str - Default: None

op('handoff').par.Website
Str - Default: None

op('handoff').par.Chattd
OP - Default: None
Usage Examples

Simple Routing

- Add two or more Agent OPs to your network, each configured with a different System Prompt reflecting their specialty (e.g., one for creative writing, one for technical support).
- In the Handoff OP’s Agents sequence, add blocks linking to each Agent OP.
- Provide clear names in the Rename parameter for each agent (e.g., “Creative Writer”, “Support Bot”). Ensure Include System is On.
- Connect your input conversation DAT (e.g., from a Chat OP) to the Handoff OP’s input.
- Ensure Enable Handoff is On and the Model page is configured with an LLM capable of function calling (such as GPT-4, Claude 3, or Gemini).
- Pulse Call Agents. The Handoff OP will use its LLM to analyze the last message and the agent descriptions/system prompts, then route the conversation by calling the appropriate Agent.
- The Reason parameter will show the LLM’s decision rationale.
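Part of this setup could also be scripted. Inside TouchDesigner, op() and parameter access are built in; the small stand-in classes below exist only so the sketch runs outside TD and are not part of the operator:

```python
# Sketch of scripting the Simple Routing setup. Only the documented
# parameter names (Enablehandoff, Maxtokens, Temperature) are used;
# everything in _Par/_OP/op() is a stand-in for TouchDesigner builtins.

class _Par(dict):
    """Stand-in for a TD parameter collection (par.Name style access)."""
    def __getattr__(self, name):
        return self.setdefault(name, None)
    def __setattr__(self, name, value):
        self[name] = value

class _OP:
    """Stand-in for a TD operator."""
    def __init__(self):
        self.par = _Par()

_network = {"handoff": _OP()}

def op(name):  # stand-in for TouchDesigner's global op()
    return _network[name]

handoff = op('handoff')
handoff.par.Enablehandoff = True   # let the LLM choose the agent
handoff.par.Maxtokens = 4096       # routing-decision model settings
handoff.par.Temperature = 0.7
# Inside TD, triggering the routing would be a pulse on the Call parameter.

print(handoff.par.Enablehandoff)   # True
```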
Manual Agent Selection

- Configure the Agents sequence as above.
- Turn Enable Handoff to Off.
- Select the desired target agent manually using the Current Agent menu.
- Pulse Call Agents. The conversation will be sent directly to the selected agent without LLM intervention.
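The difference between manual selection and LLM handoff comes down to one branch, sketched here in plain Python (illustrative only):

```python
# Sketch of the dispatch branch: with Enable Handoff off, the manually
# selected agent is used directly and no LLM call is made.

def dispatch(conversation, agents, enable_handoff, current_agent, llm_pick):
    if not enable_handoff:
        return current_agent             # direct routing, no LLM
    return llm_pick(conversation, agents)  # LLM chooses from the sequence

agents = ["Creative Writer", "Support Bot"]
print(dispatch("hi", agents, False, "Support Bot", None))  # Support Bot
```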
Technical Notes

- The quality of the handoff decision depends heavily on the capability of the LLM selected on the Model page and on the clarity of the Rename strings and included System Prompts for each agent in the sequence.
- Ensure the LLM used for handoff supports function calling / tool use, as the routing mechanism relies on this.
- The Handoff operator manages the conversation flow but relies on the individual Agent operators to generate the actual responses.
- The conversation_dat viewer shows the state before the final agent response is added; the agent’s response is handled asynchronously via callbacks.
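The asynchronous flow in the last note can be pictured as a callback pattern. This is a shape sketch only; the operator's actual callback names and signatures are not documented here:

```python
# Sketch of callback-style response delivery: the caller registers a
# handler, and the agent's reply arrives via that handler rather than
# as a return value (here invoked synchronously for illustration).

def call_agent(agent_name, conversation, on_response):
    reply = f"[{agent_name}] reply to: {conversation[-1]}"
    on_response(agent_name, reply)

log = []
call_agent("Support Bot", ["I get an error"],
           lambda name, text: log.append(text))
print(log[0])  # [Support Bot] reply to: I get an error
```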