Chat Operator
Overview
The Chat LOP operator facilitates the creation, editing, and management of conversations with AI models within TouchDesigner. It allows users to define the roles (user, assistant, system) and content of messages, control the flow of the conversation, and integrate with other LOPs for advanced AI workflows. This operator is particularly useful for prototyping conversational AI agents, setting up example conversations, and enforcing specific patterns for the LLM to follow.
Requirements
- Requires the ChatTD LOP to be present in the network, as it handles the actual API calls to the AI model.
- No specific Python dependencies beyond those required by TouchDesigner and the ChatTD LOP.
Input/Output
Inputs
- Input Table (DAT, optional): A table DAT containing pre-existing conversation data with columns for role, message, id, and timestamp. Allows loading conversations from external sources.
Outputs
- Conversation DAT: A table DAT named conversation_dat that stores the current state of the conversation, including roles, messages, IDs, and timestamps.
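The layout of conversation_dat can be pictured with a plain-Python stand-in. The row contents below are illustrative only, but the column order follows the documented schema (role, message, id, timestamp):

```python
# Plain-Python stand-in for conversation_dat rows (header + data rows).
rows = [
    ['role', 'message', 'id', 'timestamp'],
    ['system', 'helpful TD assistant', '0', '2024-11-06 12:00:00'],
    ['user', 'Hello there!', '1', '2024-11-06 12:00:05'],
]

def messages_by_role(rows, role):
    # Look up columns by header name rather than fixed position.
    header, *data = rows
    role_col = header.index('role')
    msg_col = header.index('message')
    return [r[msg_col] for r in data if r[role_col] == role]

print(messages_by_role(rows, 'user'))  # ['Hello there!']
```

In TouchDesigner itself you would read the same columns from the DAT, e.g. via op('chat1').op('conversation_dat').rows().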
Parameters
Messages Page
op('chat').par.Active: Toggle - Default: Off
op('chat').par.Callassistant: Pulse - Default: None
op('chat').par.Calluser: Pulse - Default: None
op('chat').par.Insertindex: Integer - Default: 1
op('chat').par.Message: Sequence - Default: None
op('chat').par.Message0text: String - Default: None
op('chat').par.Message1text: String - Default: None
Model Page
Understanding Model Selection
Operators utilizing LLMs (LOPs) offer flexible ways to configure the AI model used:
- ChatTD Model (Default): By default, LOPs inherit model settings (API Server and Model) from the central ChatTD component. You can configure ChatTD via the "Controls" section in the Operator Create Dialog or its parameter page.
- Custom Model: Select this option in "Use Model From" to override the ChatTD settings and specify the API Server and AI Model directly within this operator.
- Controller Model: Choose this to have the LOP inherit its API Server and AI Model parameters from another operator (like a different Agent or any LOP with model parameters) specified in the Controller [ Model ] parameter. This allows centralizing model control.
The Search toggle filters the AI Model dropdown based on keywords entered in Model Search. The Show Model Info toggle (if available) displays detailed information about the selected model directly in the operator's viewer, including cost and token limits.
Available LLM Models + Providers Resources
The following links point to API key pages or documentation for the supported providers. For a complete and up-to-date list, see the LiteLLM provider docs.
op('chat').par.Maxtokens: Int - The maximum number of tokens the model should generate. Default: 256
op('chat').par.Temperature: Float - Controls randomness in the response. Lower values are more deterministic. Default: 0
op('chat').par.Modelcontroller: OP - Operator providing model settings when 'Use Model From' is set to controller_model. Default: None
op('chat').par.Search: Toggle - Enable dynamic model search based on a pattern. Default: off. Options: off, on
op('chat').par.Modelsearch: Str - Pattern to filter models when Search is enabled. Default: "" (Empty String)
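How Modelsearch narrows the AI Model dropdown can be sketched in plain Python. The model names and the substring-matching rule here are assumptions for illustration, not the operator's actual implementation:

```python
# Hypothetical model list; the real dropdown is populated via the API server.
models = ['gpt-4o', 'gpt-4o-mini', 'claude-3-5-sonnet', 'llama-3-70b']

def filter_models(models, pattern):
    # Keep only models whose name contains the (case-insensitive) pattern.
    pattern = pattern.lower()
    return [m for m in models if pattern in m.lower()]

print(filter_models(models, 'gpt'))  # ['gpt-4o', 'gpt-4o-mini']
```

With Search on and Modelsearch set to 'gpt', only matching models would remain selectable; an empty pattern leaves the full list.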
Conversation Page
op('chat').par.Usesystemmessage: Toggle - Default: On
op('chat').par.Systemmessage: String - Default: helpful TD assistant
op('chat').par.Useuserprompt: Toggle - Default: On
op('chat').par.Userprompt: String - Default: pretend to be the user
op('chat').par.Clearconversation: Pulse - Default: None
op('chat').par.Loadfrominput: Pulse - Default: None
op('chat').par.Conversationid: String - Default: chat1
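Why distinct Conversationid values keep chats separate can be illustrated with a minimal sketch. ChatTD's real persistence mechanism is not documented here, so this dictionary store is purely hypothetical:

```python
# Hypothetical store: one message list per Conversationid.
conversations = {}

def append_message(conv_id, role, text):
    conversations.setdefault(conv_id, []).append((role, text))

def clear_conversation(conv_id):
    # Mirrors the Clearconversation pulse: only the named conversation resets.
    conversations[conv_id] = []

append_message('chat1', 'user', 'Hello there!')
append_message('chat2', 'user', 'A separate chat')
clear_conversation('chat1')
print(len(conversations['chat1']), len(conversations['chat2']))  # 0 1
```

Giving two Chat operators different Conversationid values keeps their histories independent, even when both route through the same ChatTD.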
Callbacks Page
op('chat').par.Callbackdat: DAT - Default: None
op('chat').par.Editcallbacksscript: Pulse - Default: None
op('chat').par.Createpulse: Pulse - Default: None
op('chat').par.Ontaskstart: Toggle - Default: Off
op('chat').par.Ontaskcomplete: Toggle - Default: Off
About Page
op('chat').par.Showbuiltin: Toggle - Default: Off
op('chat').par.Bypass: Toggle - Default: Off
op('chat').par.Chattd: OP - Default: /dot_lops/ChatTD
op('chat').par.Version: String - Default: 1.0.0
op('chat').par.Lastupdated: String - Default: 2024-11-06
op('chat').par.Creator: String - Default: dotsimulate
op('chat').par.Website: String - Default: https://dotsimulate.com
Callbacks
The operator exposes two callbacks, onTaskStart and onTaskComplete:
def onTaskStart(info):
# Called when a call to the assistant or user begins
# info dictionary contains details like op, callType
pass
def onTaskComplete(info):
# Called when the AI response is received and processed
# info dictionary contains details like op, result, conversationID
pass
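A filled-in onTaskComplete might log each response to the textport. The result and conversationID keys follow the fields noted in the comments above; treat them as assumptions until verified in your build (the final call simulates an invocation outside TouchDesigner):

```python
def onTaskComplete(info):
    # 'result' and 'conversationID' are assumed keys, per the notes above.
    result = info.get('result', '')
    conv_id = info.get('conversationID', 'unknown')
    message = f'[{conv_id}] assistant replied with {len(result)} characters'
    print(message)
    return message

# Simulated invocation outside TouchDesigner:
onTaskComplete({'result': 'Ahoy, matey!', 'conversationID': 'chat1'})
```

Remember to enable the Ontaskcomplete toggle on the Callbacks page and point Callbackdat at the DAT containing this function.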
Performance Considerations
- Long conversations (many message blocks) can increase processing time.
- Higher temperature settings might slightly increase AI response time.
- Ensure the linked ChatTD operator is configured correctly.
Usage Examples
Section titled “Usage Examples”Basic Conversation
# Get the operator
chat_op = op('chat1')

# Configure messages (assuming default 5 blocks)
chat_op.par.Message0role = 'system'
chat_op.par.Message0text = 'You are a pirate.'
chat_op.par.Message1role = 'user'
chat_op.par.Message1text = 'Hello there!'
chat_op.par.Message2role = 'assistant'
chat_op.par.Message2text = ''  # Assistant will fill this

# Call the assistant
chat_op.par.Callassistant.pulse()

# Check the output DAT
conversation_log = chat_op.op('conversation_dat')
print(conversation_log.rows()[-1])  # Print the last message (assistant's response)
Loading from Input Table
# Assume 'conv_table' DAT exists with role, message columns

# Connect table
chat_op = op('chat1')
chat_op.inputConnectors[0].connect(op('conv_table'))

# Load data
chat_op.par.Inputhandling = 'none'  # Important if you want exact load
chat_op.par.Loadfrominput.pulse()
Common Use Cases
- Prototyping conversational AI agents.
- Building example conversations for training or demos.
- Enforcing specific response patterns or personas.
- Integrating chat interfaces into installations.
- Creating dynamic user experiences.