Chat Operator
The Chat LOP operator facilitates the creation, editing, and management of conversations with AI models within TouchDesigner. It allows users to define the roles (user, assistant, system) and content of messages, control the flow of the conversation, and integrate with other LOPs for advanced AI workflows. This operator is particularly useful for prototyping conversational AI agents, setting up example conversations, and enforcing specific patterns for the LLM to follow.
Requirements
- Requires the ChatTD LOP to be present in the network, as it handles the actual API calls to the AI model.
- No specific Python dependencies beyond those required by TouchDesigner and the ChatTD LOP.
Input/Output
Inputs
- Input Table (DAT, optional): A table DAT containing pre-existing conversation data with columns for role, message, id, and timestamp. Allows loading conversations from external sources.
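For example, a minimal Python sketch (run inside TouchDesigner) for populating such an input table; the Table DAT name conversation_in and the id/timestamp values are assumptions for illustration:
table = op('conversation_in')  # Table DAT wired into the Chat LOP's first input
table.clear()
table.appendRow(['role', 'message', 'id', 'timestamp'])
table.appendRow(['system', 'You are a concise assistant.', '1', '2024-11-06 12:00:00'])
table.appendRow(['user', 'Summarize the last render pass.', '2', '2024-11-06 12:00:05'])
op('chat').par.Loadfrominput.pulse()  # pull the rows into the message sequence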
Outputs
- Conversation DAT: A table DAT named conversation_dat that stores the current state of the conversation, including roles, messages, IDs, and timestamps.
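A quick way to inspect the conversation from Python; the path chat/conversation_dat and the column order are assumptions based on the description above:
convo = op('chat/conversation_dat')
for row in convo.rows()[1:]:  # skip the header row
    role, message = row[0].val, row[1].val
    print(f'{role}: {message}')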
Parameters
Messages Page
op('chat').par.Active
Toggle - Default: Off
op('chat').par.Callassistant
Pulse - Default: None
op('chat').par.Calluser
Pulse - Default: None
op('chat').par.Insertindex
Integer - Default: 1
op('chat').par.Message
Sequence - Default: None
op('chat').par.Message0text
String - Default: None
op('chat').par.Message1text
String - Default: None
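A minimal sketch of driving these parameters from Python, assuming the operator is named chat and using only the parameters listed above:
chat = op('chat')
chat.par.Active = True  # enable the operator
chat.par.Message0text = 'You are a helpful TouchDesigner assistant.'
chat.par.Message1text = 'How do I convert a CHOP channel to a DAT?'
chat.par.Insertindex = 1  # where new messages are inserted in the sequence
chat.par.Callassistant.pulse()  # request a completion for the current messages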
Model Page
Understanding Model Selection
Operators utilizing LLMs (LOPs) offer flexible ways to configure the AI model used:
- ChatTD Model (Default): By default, LOPs inherit model settings (API Server and Model) from the central ChatTD component. You can configure ChatTD via the "Controls" section in the Operator Create Dialog or its parameter page.
- Custom Model: Select this option in "Use Model From" to override the ChatTD settings and specify the API Server and AI Model directly within this operator.
- Controller Model: Choose this to have the LOP inherit its API Server and AI Model parameters from another operator (like a different Agent or any LOP with model parameters) specified in the Controller [ Model ] parameter. This allows centralizing model control.
The Search toggle filters the AI Model dropdown based on keywords entered in Model Search. The Show Model Info toggle (if available) displays detailed information about the selected model directly in the operator's viewer, including cost and token limits.
Available LLM Models + Providers Resources
The following links point to API key pages or documentation for the supported providers. For a complete and up-to-date list, see the LiteLLM provider docs.
op('chat').par.Maxtokens
Int - The maximum number of tokens the model should generate.
- Default: 256
op('chat').par.Temperature
Float - Controls randomness in the response. Lower values are more deterministic.
- Default: 0
op('chat').par.Modelcontroller
OP - Operator providing model settings when 'Use Model From' is set to controller_model.
- Default: None
op('chat').par.Search
Toggle - Enable dynamic model search based on a pattern.
- Default: off
- Options: off, on
op('chat').par.Modelsearch
Str - Pattern to filter models when Search is enabled.
- Default: "" (Empty String)
Conversation Page
op('chat').par.Usesystemmessage
Toggle - Default: Off
op('chat').par.Systemmessage
String - Default: "" (Empty String)
op('chat').par.Useuserprompt
Toggle - Default: Off
op('chat').par.Userprompt
String - Default: "" (Empty String)
op('chat').par.Clearconversation
Pulse - Default: None
op('chat').par.Loadfrominput
Pulse - Default: None
op('chat').par.Conversationid
String - Default: "" (Empty String)
Callbacks Page
op('chat').par.Callbackdat
DAT - Default: ChatTD_callbacks
op('chat').par.Editcallbacksscript
Pulse - Default: None
op('chat').par.Createpulse
Pulse - Default: None
op('chat').par.Ontaskstart
Toggle - Default: Off
op('chat').par.Ontaskcomplete
Toggle - Default: Off
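A sketch of wiring up callbacks, assuming the operator is named chat and that a Text DAT named my_callbacks defines the onTaskStart/onTaskComplete functions shown in the Callbacks section below:
chat = op('chat')
chat.par.Callbackdat = 'my_callbacks'  # DAT containing the callback functions
chat.par.Ontaskstart = True            # fire onTaskStart when a call begins
chat.par.Ontaskcomplete = True         # fire onTaskComplete when a response arrives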
About Page
op('chat').par.Showbuiltin
Toggle - Default: Off
op('chat').par.Bypass
Toggle - Default: Off
op('chat').par.Chattd
OP - Default: /dot_lops/ChatTD
op('chat').par.Version
String - Default: 1.0.0
op('chat').par.Lastupdated
String - Default: 2024-11-06
op('chat').par.Creator
String - Default: dotsimulate
op('chat').par.Website
String - Default: https://dotsimulate.com
Callbacks
onTaskStart
onTaskComplete
def onTaskStart(info):
# Called when a call to the assistant or user begins
# info dictionary contains details like op, callType
pass
def onTaskComplete(info):
# Called when the AI response is received and processed
# info dictionary contains details like op, result, conversationID
pass
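For example, a slightly fuller onTaskComplete body that logs the response; the 'result' key follows the description above, and the Text DAT last_response is an assumed logging target:
def onTaskComplete(info):
    # Store the AI response somewhere visible in the network
    result = info.get('result', '')
    op('last_response').text = str(result)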
Performance Considerations
- Long conversations (many message blocks) can increase processing time.
- Higher temperature settings might slightly increase AI response time.
- Ensure the linked ChatTD operator is configured correctly.
Usage Examples
Few-Shot Prompting for an Agent
The Chat LOP is ideal for creating few-shot prompts, which provide the AI with examples to guide its responses. This is a powerful way to enforce a specific output format or persona.
- Add Message Blocks: On the Messages page, use the + button on the Message sequence parameter to add pairs of messages.
- Create Examples: For each pair, set up a user message and a corresponding assistant response. This teaches the AI how you want it to behave.
  - Message 0 Role: user
  - Message 0 Text: Translate 'hello' to French.
  - Message 1 Role: assistant
  - Message 1 Text: {'translation': 'bonjour'}
  - Message 2 Role: user
  - Message 2 Text: Translate 'goodbye' to Spanish.
  - Message 3 Role: assistant
  - Message 3 Text: {'translation': 'adios'}
- Connect to Agent: Wire the output of this Chat LOP into the first input of an Agent LOP.
When the Agent receives a new prompt (e.g., Translate 'cat' to German.), it will use the few-shot examples as context and is more likely to respond in the desired JSON format: {'translation': 'katze'}.
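The same few-shot examples can also be built from Python through an input Table DAT instead of hand-editing the parameter sequence; the Table DAT name fewshot is an assumption:
fewshot = op('fewshot')  # Table DAT wired into the Chat LOP's input
fewshot.clear()
fewshot.appendRow(['role', 'message'])
fewshot.appendRow(['user', "Translate 'hello' to French."])
fewshot.appendRow(['assistant', "{'translation': 'bonjour'}"])
fewshot.appendRow(['user', "Translate 'goodbye' to Spanish."])
fewshot.appendRow(['assistant', "{'translation': 'adios'}"])
op('chat').par.Loadfrominput.pulse()  # populate the message sequence from the table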
Dynamic Conversation Control
You can manage the conversation flow using the parameters on the Conversation page without writing any code.
- Clearing the Conversation: Pulse the Clearconversation parameter to reset the message blocks to a single, empty user message.
- Loading from a DAT:
  - Create a Table DAT with role and message columns.
  - Connect it to the Chat LOP’s input.
  - Pulse Loadfrominput to populate the message sequence from the DAT’s contents.
- Input Handling: The Inputhandling menu controls how messages from an input DAT are combined with the messages in the parameter sequence. This is useful for combining a static set of examples with dynamic input.
Generating Content with Call Assistant
You can use the Chat LOP to generate content directly.
- Set up a sequence of messages that ends with a user role.
- Pulse the Callassistant parameter on the Messages page.
- The operator will call the AI model, and the response will be added as a new assistant message block, completing the conversation.
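A sketch of triggering a completion and reading the newest message back; the path chat/conversation_dat and the message column name follow the Outputs description above, and the call is asynchronous, so the table only updates once the response arrives:
chat = op('chat')
chat.par.Message0text = 'Write a one-line haiku about feedback loops.'
chat.par.Callassistant.pulse()
# Later, once onTaskComplete has fired, read the last row of the conversation:
convo = op('chat/conversation_dat')
if convo.numRows > 1:
    print(convo[convo.numRows - 1, 'message'])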