ChatTD

ChatTD serves as the primary engine within LOPs for facilitating interactions with various Large Language Models (LLMs) and managing conversational context. It integrates several key functionalities:

  • API Interaction: Handles connections and calls to different LLM APIs (OpenAI, Gemini, OpenRouter, Groq, Ollama, LM Studio, Custom) via the LiteLLM library.
  • Conversation Management: Stores and manages chat history, including user messages, assistant responses, and system prompts.
  • Core Utilities: Incorporates instances of TDAsyncIO for managing asynchronous operations and Python Manager for handling Python environments and package installations required by LOPs.
  • Customapicall Method: Provides a standardized Python method (ext.ChatExt.Customapicall(...)) used by many other LOPs (like Agent, Caption, Handoff, Image Gen) to execute LLM API calls through ChatTD. This centralizes API key management, model selection, and call execution.

Essentially, most LOPs that need to “talk” to an LLM will do so by calling the Customapicall method on a designated ChatTD operator. Users configure the primary API settings (keys, model choices) directly on ChatTD.
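A minimal one-off call from any Python script looks like the sketch below. The path /project1/ChatTD and the HandleReply method are placeholders for illustration, not fixed names:

# Minimal sketch: one request through ChatTD; the path and callback name are placeholders.
chattd = op('/project1/ChatTD')   # point this at your ChatTD operator
chattd.Customapicall(
    message="Give me one fun fact about TouchDesigner.",
    callback='HandleReply',       # capitalized method expected on callbackOP
    callbackOP=parent()           # a COMP whose extension defines HandleReply(response_content, callbackInfo=None)
)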

The Customapicall method is the primary way other operators or custom scripts interact with ChatTD to make LLM API calls. It handles parameter defaults, conversation formatting, image inclusion, and asynchronous execution.

Key Parameters:

  • message (str | List[Dict]): A single user message string, or the full conversation history as a list of role/content dictionaries.
  • model, temperature, max_tokens, etc.: Optional overrides for the defaults configured on the ChatTD operator.
  • jsonmode (bool): Instructs the model to return JSON (if supported).
  • tools (List[Dict]): Defines tools the model can use.
  • callback (str): Name of the function (must be capitalized) on the callbackOP to call upon completion or error.
  • callbackOP (OP): The operator containing the callback function.
  • streaming (bool): Whether to stream the response chunk by chunk (see the sketch after the condensed example).
  • image_path (str): Path to an image file on disk to include (multimodal models).
  • audio_path (str): Path to an audio file on disk to include (e.g., for Gemini).
  • additional_images (List[Dict]): List of pre-formatted image data to include.

Condensed Python Example:

# --- Example Extension Script ---
import json

class ChatTDExamplesEXT:
    def __init__(self, ownerComp):
        self.ownerComp = ownerComp
        # Assume ChatTD is referenced by a parameter 'Chattd'
        self.ChatTD = op(self.ownerComp.par.Chattd)
        if not self.ChatTD:
            print("ERROR: ChatTD operator not found!")

    # --- Basic Call with Callback ---
    def GetFact(self):
        if not self.ChatTD:
            return
        topic = self.ownerComp.par.Topic.eval()  # Example parameter
        self.ChatTD.Customapicall(
            message=f"Tell me a short fact about {topic}.",
            callback='ProcessFact',       # Capitalized callback name
            callbackOP=self.ownerComp     # Where ProcessFact lives
        )

    def ProcessFact(self, response_content, callbackInfo=None):
        # 'response_content' contains the LLM's text reply
        # 'callbackInfo' (optional) has details like response_time, call_id
        print(f"Fact Received: {response_content}")
        self.ownerComp.par.Result = response_content  # Update UI

    # --- JSON Mode Example ---
    def GetJokeJson(self):
        if not self.ChatTD:
            return
        subject = self.ownerComp.par.Subject.eval()
        self.ChatTD.Customapicall(
            message=f"Tell me a joke about {subject}. Format as JSON with 'setup' and 'punchline' keys.",
            callback='ProcessJokeJson',
            callbackOP=self.ownerComp,
            jsonmode=True  # Request JSON output
        )

    def ProcessJokeJson(self, response_content, callbackInfo=None):
        try:
            joke_data = json.loads(response_content)
            print(f"Setup: {joke_data.get('setup')}")
            print(f"Punchline: {joke_data.get('punchline')}")
        except json.JSONDecodeError:
            print(f"Error: Could not parse JSON response: {response_content}")

    # --- Conversation History Example ---
    def ContinueConversation(self):
        if not self.ChatTD:
            return
        # Example history (replace with actual retrieval logic)
        history = [
            {"role": "system", "content": "You are a helpful pirate."},
            {"role": "user", "content": "What's the weather like?"},
            {"role": "assistant", "content": "Arr, the seas be calm today, matey!"}
        ]
        # Add the new user message
        new_message = self.ownerComp.par.NewMessage.eval()
        history.append({"role": "user", "content": new_message})
        self.ChatTD.Customapicall(
            message=history,          # Pass the whole list
            callback='ProcessFact',   # Reuse callback
            callbackOP=self.ownerComp,
            temperature=0.5           # Override temperature
        )
# --- End Example ---

This example illustrates basic calls, JSON mode, callbacks, and passing conversation history. Refer to the full ChatExt.py extension within ChatTD for the complete parameter list and implementation details.
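The condensed example does not touch streaming or the multimodal parameters. The methods below are a hedged sketch of how they could be added to the same ChatTDExamplesEXT class; the Imagefile parameter is assumed purely for illustration, and the exact chunking behavior when streaming depends on the selected provider and model:

    # Additional methods for ChatTDExamplesEXT (sketch; these belong inside the class body)
    def DescribeImage(self):
        if not self.ChatTD:
            return
        self.ChatTD.Customapicall(
            message="Describe this image in one sentence.",
            image_path=self.ownerComp.par.Imagefile.eval(),  # 'Imagefile' is an assumed File parameter
            callback='OnReply',
            callbackOP=self.ownerComp
        )

    def StreamPoem(self):
        if not self.ChatTD:
            return
        self.ChatTD.Customapicall(
            message="Write a four-line poem about pixels.",
            streaming=True,           # ask for the reply chunk by chunk
            callback='OnReply',
            callbackOP=self.ownerComp
        )

    def OnReply(self, response_content, callbackInfo=None):
        # With streaming enabled this may be invoked repeatedly as chunks arrive;
        # without streaming it receives the complete reply once.
        print(response_content)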

The ChatTD operator parameters are organized across multiple pages for different aspects of configuration and operation. Each entry below lists the parameter label, internal name, Python access path, type, default value, and menu options where applicable.

  • Python Venv Base Folder (Pythonvenv): Folder. op('chattd').par.Pythonvenv. Default: "" (empty string).
  • API Key (Apikey): Str. op('chattd').par.Apikey. Default: "" (empty string).
  • API Server (Apiserver): Menu. op('chattd').par.Apiserver. Default: "" (empty string). Options: openrouter, openai, groq, ollama, gemini, lmstudio, custom.
  • AI Model (Model): Menu. op('chattd').par.Model. Default: "" (empty string).
  • Refresh Models (Refreshmodels): Pulse. op('chattd').par.Refreshmodels. Default: Off.
  • Search Models (Search): Toggle. op('chattd').par.Search. Default: Off.
  • Model Search (Modelsearch): Str. op('chattd').par.Modelsearch. Default: "" (empty string).
  • Custom URL (Customurl): Str. op('chattd').par.Customurl. Default: "" (empty string).
  • Install / Modify LOPs (Installlops): Pulse. op('chattd').par.Installlops. Default: Off.
  • Unlink Python Venv (Unlinkpythonvenv): Pulse. op('chattd').par.Unlinkpythonvenv. Default: Off.
  • Clear All Stored Keys (Clearkeys): Pulse. op('chattd').par.Clearkeys. Default: Off.
  • Verbose LiteLLM (Verboselitellm): Menu. op('chattd').par.Verboselitellm. Default: OFF. Options: OFF, INFO, DEBUG.
  • Use System Message (Usesystemmessage): Toggle. op('chattd').par.Usesystemmessage. Default: Off.
  • System Message (Systemmessage): Str. op('chattd').par.Systemmessage. Default: "" (empty string).
  • Temperature (Temperature): Float. op('chattd').par.Temperature. Default: 0.
  • Max Tokens (Maxtokens): Int. op('chattd').par.Maxtokens. Default: 0.
  • Seed (Seed): Int. op('chattd').par.Seed. Default: 0.
  • Stop Phrase (Stopphrase): Str. op('chattd').par.Stopphrase. Default: "" (empty string).
  • JSON Mode (Jsonmode): Toggle. op('chattd').par.Jsonmode. Default: Off.
  • Set Penalty (Setpenalty): Toggle. op('chattd').par.Setpenalty. Default: Off.
  • Frequency Penalty (Frequencypenalty): Float. op('chattd').par.Frequencypenalty. Default: 0.
  • Presence Penalty (Presencepenalty): Float. op('chattd').par.Presencepenalty. Default: 0.
  • Image Detail (Imagedetail): Menu. op('chattd').par.Imagedetail. Default: auto. Options: auto, low, high.
  • Callbacks: Header.
  • Callback DAT (Callbackdat): DAT. op('chattd').par.Callbackdat. Default: ChatTD_callbacks.
  • Edit Callbacks (Editcallbacksscript): Pulse. op('chattd').par.Editcallbacksscript. Default: Off.
  • Create Callbacks (Createpulse): Pulse. op('chattd').par.Createpulse. Default: Off.
  • onError (Onerror): Toggle. op('chattd').par.Onerror. Default: On.
  • onCustomGenerate (Oncustomgenerate): Toggle. op('chattd').par.Oncustomgenerate. Default: Off.
  • onCustomDone (Oncustomdone): Toggle. op('chattd').par.Oncustomdone. Default: Off.
  • Textport Debug Callbacks (Debugcallbacks): Menu. op('chattd').par.Debugcallbacks. Default: Full Details. Options: None, Errors Only, Basic Info, Full Details.
  • Bypass (Bypass): Toggle. op('chattd').par.Bypass. Default: Off.
  • Show Built-in Parameters (Showbuiltin): Toggle. op('chattd').par.Showbuiltin. Default: Off.
  • Version (Version): Str. op('chattd').par.Version. Default: "" (empty string).
  • Last Updated (Lastupdated): Str. op('chattd').par.Lastupdated. Default: "" (empty string).
  • Creator (Creator): Str. op('chattd').par.Creator. Default: "" (empty string).
  • Website (Website): Str. op('chattd').par.Website. Default: "" (empty string).
  • ChatTD Operator (Chattd): OP. op('chattd').par.Chattd. Default: "" (empty string).

ChatTD provides several callback functions that can be enabled to handle different stages of API calls:

  • onCustomGenerate: Called when an API call is initiated. Receives information about the call including call ID, model used, request start time, and conversation data.
  • onCustomDone: Called when an API call completes successfully. Receives the final response and metadata.
  • onCustomError: Called when an API call encounters an error. Receives error information and context.

Enable callbacks using the toggle parameters in the Callbacks page:

  • onError - Enable error handling callbacks (enabled by default)
  • onCustomGenerate - Enable call initiation callbacks
  • onCustomDone - Enable call completion callbacks
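
These toggles (and the debug level described next) can also be set from Python, assuming the same op('chattd') reference used above:

# Enable the optional callbacks from Python (assumes ChatTD is reachable at op('chattd'))
chattd = op('chattd')
chattd.par.Onerror = True                 # error callbacks, on by default
chattd.par.Oncustomgenerate = True        # fire onCustomGenerate when a call starts
chattd.par.Oncustomdone = True            # fire onCustomDone when a call completes
chattd.par.Debugcallbacks = 'Basic Info'  # textport verbosity; levels are described below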

Use the Textport Debug Callbacks parameter to control the level of debug information printed to the textport:

  • None: No debug output
  • Errors Only: Only print error information
  • Basic Info: Print basic call information (model, timing, errors)
  • Full Details: Print complete callback information

The callback functions are implemented in the Callback DAT (default: ChatTD_callbacks). You can edit these callbacks using the Edit Callbacks pulse parameter or create new ones with Create Callbacks.
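
The generated Callback DAT defines the actual function signatures; the stub below is only a hedged sketch assuming each callback receives a single info dictionary. Check the DAT created by the Create Callbacks pulse for the real argument names:

# Hedged sketch of a ChatTD Callback DAT; real signatures come from the generated ChatTD_callbacks DAT.

def onCustomGenerate(info):
    # Assumed to fire when a call starts, with call id, model, start time and conversation data.
    print('ChatTD call started:', info)

def onCustomDone(info):
    # Assumed to fire when a call completes, with the final response and metadata.
    print('ChatTD call finished:', info)

def onCustomError(info):
    # Assumed to fire when a call fails, with error information and context.
    print('ChatTD call failed:', info)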

Related components:

  • Agent - The primary LOP for complex, multi-turn interactions using models configured in ChatTD.
  • Python Manager - Manages Python environments and package installations for LOPs components. An instance is built into ChatTD.
  • TDAsyncIO - Handles asynchronous operations within TouchDesigner. An instance is built into ChatTD.
  • Key Manager - Securely stores and retrieves API keys used by ChatTD and other LOPs.