Handoff Operator

The Handoff operator acts as an intelligent router for conversations. It uses a configured Large Language Model (LLM) to analyze the ongoing discussion (provided via the input_table) and decide which specialized Agent operator, defined in its Agents sequence, is best suited to handle the next step.

This allows for dynamic and sophisticated workflows where different AI assistants can contribute based on their specific expertise or assigned roles. When Enable Handoff is disabled, the operator simply passes the conversation to a manually selected agent.

Handoff Operator UI

  • Call Agents (Call) - Pulse - op('handoff').par.Call - Default: None
  • Enable Handoff (Enablehandoff) - Toggle - op('handoff').par.Enablehandoff - Default: false
  • Current Agent (Currentagent) - Menu - op('handoff').par.Currentagent - Default: None
  • Active (Active) - Toggle - op('handoff').par.Active - Default: None
  • Status (Status) - Str - op('handoff').par.Status - Default: None
  • Reason (Reason) - Str - op('handoff').par.Reason - Default: None
  • Agents (Agents) - Sequence - op('handoff').par.Agents - Default: None
  • Display (Display) - Menu - op('handoff').par.Display - Default: conversation_dat - Options: conversation_dat, handoff_history

This page configures the LLM used by the Handoff operator itself to make the routing decisions when Enable Handoff is active. These settings do not affect the models used by the individual Agent operators defined in the sequence.

  • Output Settings (Outputsettings) - op('handoff').par.Outputsettings - Default: None
  • Max Tokens (Maxtokens) - op('handoff').par.Maxtokens - Default: 2048
  • Temperature (Temperature) - op('handoff').par.Temperature - Default: 0.7
  • Model Selection (Modelselectionheader) - op('handoff').par.Modelselectionheader - Default: None
  • Use Model From (Modelselection) - op('handoff').par.Modelselection - Default: chattd_model
  • Controller [ Model ] (Modelcontroller) - op('handoff').par.Modelcontroller - Default: None
  • Select API Server (Apiserver) - op('handoff').par.Apiserver - Default: openrouter
  • AI Model (Model) - op('handoff').par.Model - Default: llama-3.2-11b-vision-preview
  • Search (Search) - op('handoff').par.Search - Default: false
  • Model Search (Modelsearch) - op('handoff').par.Modelsearch - Default: None
  • Show Model Info (Showmodelinfo) - op('handoff').par.Showmodelinfo - Default: false
  • Bypass (Bypass) - Toggle - op('handoff').par.Bypass - Default: false
  • Show Built-in Parameters (Showbuiltin) - Toggle - op('handoff').par.Showbuiltin - Default: false
  • Version (Version) - Str - op('handoff').par.Version - Default: None
  • Last Updated (Lastupdated) - Str - op('handoff').par.Lastupdated - Default: None
  • Creator (Creator) - Str - op('handoff').par.Creator - Default: None
  • Website (Website) - Str - op('handoff').par.Website - Default: None
  • ChatTD Operator (Chattd) - OP - op('handoff').par.Chattd - Default: None
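After a routing pass, the decision parameters listed above (Active, Status, Reason, Current Agent) can be read from a script using TouchDesigner's standard Python API (`op()`, `Par.eval()`). The sketch below is illustrative only; the stub classes at the top stand in for TouchDesigner's builtins so it runs outside TD, and the example values are invented.

```python
from types import SimpleNamespace

# Minimal stand-ins for TouchDesigner's op()/Par so this sketch runs
# outside TD. Inside TouchDesigner, delete these and use the builtins.
class _Par:
    def __init__(self, val):
        self.val = val
    def eval(self):
        return self.val

_handoff = SimpleNamespace(par=SimpleNamespace(
    Active=_Par(True),
    Status=_Par('routed'),                        # invented example value
    Reason=_Par('User asked a support question'), # invented example value
    Currentagent=_Par('Support Bot')))

def op(path):
    return _handoff

# --- What you would run inside TouchDesigner ---
handoff = op('handoff')
if handoff.par.Active.eval():
    # Status and Reason are read-only string parameters filled in
    # after the Handoff operator makes a routing decision.
    print(f"Routed to {handoff.par.Currentagent.eval()}: "
          f"{handoff.par.Reason.eval()}")
```

Inside TouchDesigner, parameter values are read with `Par.eval()` (or `Par.val`); the attribute names match the internal names shown in the parameter list.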
Automatic handoff (LLM routing):
  1. Add two or more Agent OPs to your network, each configured with a different System Prompt reflecting their specialty (e.g., one for creative writing, one for technical support).
  2. In the Handoff OP’s Agents sequence, add blocks linking to each Agent OP.
  3. Provide clear names in the Rename parameter for each agent (e.g., “Creative Writer”, “Support Bot”). Ensure Include System is On.
  4. Connect your input conversation DAT (e.g., from a Chat OP) to the Handoff OP’s input.
  5. Ensure Enable Handoff is On and the Model page is configured with an LLM capable of function calling (like GPT-4, Claude 3, Gemini).
  6. Pulse Call Agents. The Handoff OP will use its LLM to analyze the last message and the agent descriptions/system prompts, then route the conversation by calling the appropriate Agent.
  7. The Reason parameter will show the LLM’s decision rationale.
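The operator's internal prompt and schema are not documented here, but since the routing relies on function calling, the request it builds presumably has roughly the following shape: one "route" tool whose enum lists the agent names from the Agents sequence, with the agent descriptions placed in the system prompt. This is an illustrative sketch, not the operator's actual implementation; `route_to_agent` and the message wording are invented.

```python
import json

def build_routing_request(conversation, agents):
    """Sketch of an LLM routing request via function calling.

    conversation: list of chat messages ({"role": ..., "content": ...})
    agents: list of (name, system_prompt) pairs from the Agents sequence
    """
    tool = {
        "type": "function",
        "function": {
            "name": "route_to_agent",  # invented tool name
            "description": "Pick the agent best suited to handle the next step.",
            "parameters": {
                "type": "object",
                "properties": {
                    # Constrain the model to the configured agent names.
                    "agent": {"type": "string",
                              "enum": [name for name, _ in agents]},
                    # Free-text rationale, surfaced as the Reason parameter.
                    "reason": {"type": "string"},
                },
                "required": ["agent", "reason"],
            },
        },
    }
    roster = "\n".join(f"- {name}: {prompt}" for name, prompt in agents)
    messages = [
        {"role": "system",
         "content": "You are a conversation router. Available agents:\n" + roster},
        *conversation,
    ]
    # Force the model to answer with a tool call rather than plain text.
    return {"messages": messages, "tools": [tool],
            "tool_choice": {"type": "function",
                            "function": {"name": "route_to_agent"}}}

def parse_routing_choice(tool_call_arguments):
    """Decode the model's tool-call arguments into (agent, reason)."""
    args = json.loads(tool_call_arguments)
    return args["agent"], args["reason"]
```

This is also why clear Rename strings and System Prompts matter: they are the only information the routing model sees about each agent.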
Manual agent selection:
  1. Configure the Agents sequence as above.
  2. Turn Enable Handoff to Off.
  3. Select the desired target agent manually using the Current Agent menu.
  4. Pulse Call Agents. The conversation will be sent directly to the selected agent without LLM intervention.
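Steps 2 through 4 can also be driven from a script, assuming TouchDesigner's standard parameter API (assigning to `op(...).par.X`, pulsing with `.pulse()`). The stub at the top stands in for `op()` so the sketch runs outside TD; "Support Bot" is the example agent name from the steps above.

```python
from types import SimpleNamespace

# Stand-in for TouchDesigner's op() so this sketch runs outside TD.
# Inside TouchDesigner, delete the stub and use the builtin op().
class _PulsePar:
    def __init__(self):
        self.pulsed = False
    def pulse(self):
        self.pulsed = True

_handoff = SimpleNamespace(par=SimpleNamespace(
    Enablehandoff=True, Currentagent='', Call=_PulsePar()))

def op(path):
    return _handoff

# --- Manual routing, mirroring steps 2-4 above ---
handoff = op('handoff')
handoff.par.Enablehandoff = False         # step 2: turn off LLM routing
handoff.par.Currentagent = 'Support Bot'  # step 3: pick the agent by name
handoff.par.Call.pulse()                  # step 4: send the conversation
```

With Enable Handoff off, no routing LLM is invoked, so this path works even with models that lack function calling.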
  • The quality of the handoff decision heavily depends on the capability of the LLM selected on the Model page and the clarity of the Rename strings and included System Prompts for each agent in the sequence.
  • Ensure the LLM used for handoff supports function calling / tool use, as the routing mechanism relies on this.
  • The Handoff operator manages the conversation flow but relies on the individual Agent operators to generate the actual responses.
  • The conversation_dat viewer shows the state before the final agent response is added. The agent’s response is handled asynchronously via callbacks.