ChatTD

ChatTD serves as the primary engine within LOPs for facilitating interactions with various Large Language Models (LLMs) and managing conversational context. It integrates several key functionalities:

  • API Interaction: Handles connections and calls to different LLM APIs (OpenAI, Gemini, OpenRouter, Groq, Ollama, LM Studio, Custom) via the LiteLLM library.
  • Conversation Management: Stores and manages chat history, including user messages, assistant responses, and system prompts.
  • Core Utilities: Incorporates instances of TDAsyncIO for managing asynchronous operations and Python Manager for handling Python environments and package installations required by LOPs.
  • Customapicall Method: Provides a standardized Python method (ext.ChatExt.Customapicall(...)) used by many other LOPs (like Agent, Caption, Handoff, Image Gen) to execute LLM API calls through ChatTD. This centralizes API key management, model selection, and call execution.

Essentially, most LOPs that need to “talk” to an LLM will do so by calling the Customapicall method on a designated ChatTD operator. Users configure the primary API settings (keys, model choices) directly on ChatTD.

The Customapicall method is the primary way other operators or custom scripts interact with ChatTD to make LLM API calls. It handles parameter defaults, conversation formatting, image inclusion, and asynchronous execution.

Key Parameters:

  • message (str | List[Dict]): The user message or the full conversation history.
  • model, temperature, max_tokens, etc.: Optional overrides for model parameters.
  • jsonmode (bool): Instructs the model to return JSON (if supported).
  • tools (List[Dict]): Defines tools the model can use (see the tool-definition sketch after this list).
  • callback (str): Name of the function (must be capitalized) on the callbackOP to call upon completion or error.
  • callbackOP (OP): The operator containing the callback function.
  • streaming (bool): Whether to stream the response chunk by chunk.
  • image_path (str): Path to an image file on disk to include (multimodal models).
  • audio_path (str): Path to an audio file on disk to include (e.g., for Gemini).
  • additional_images (List[Dict]): List of pre-formatted image data to include.
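
ChatTD forwards calls through LiteLLM, which accepts tool definitions in the OpenAI function-calling schema, so a tools entry would typically look like the sketch below. This assumes ChatTD passes the list through unchanged; get_weather and all of its fields are purely illustrative.

# Hypothetical tool definition in the OpenAI/LiteLLM function-calling format
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"}
            },
            "required": ["city"]
        }
    }
}
# Passed as tools=[weather_tool] in a Customapicall, alongside a callback
# as shown in the condensed example below.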

Condensed Python Example:

# --- Example Extension Script ---
import json

class ChatTDExamplesEXT:
    def __init__(self, ownerComp):
        self.ownerComp = ownerComp
        # Assume ChatTD is referenced by a parameter 'Chattd'
        self.ChatTD = op(self.ownerComp.par.Chattd)
        if not self.ChatTD:
            print("ERROR: ChatTD operator not found!")

    # --- Basic Call with Callback ---
    def GetFact(self):
        if not self.ChatTD: return
        topic = self.ownerComp.par.Topic.eval()  # Example parameter
        self.ChatTD.Customapicall(
            message=f"Tell me a short fact about {topic}.",
            callback='ProcessFact',      # Capitalized callback name
            callbackOP=self.ownerComp    # Where ProcessFact lives
        )

    def ProcessFact(self, response_content, callbackInfo=None):
        # 'response_content' contains the LLM's text reply
        # 'callbackInfo' (optional) has details like response_time, call_id
        print(f"Fact Received: {response_content}")
        self.ownerComp.par.Result = response_content  # Update UI

    # --- JSON Mode Example ---
    def GetJokeJson(self):
        if not self.ChatTD: return
        subject = self.ownerComp.par.Subject.eval()
        self.ChatTD.Customapicall(
            message=f"Tell me a joke about {subject}. Format as JSON with 'setup' and 'punchline' keys.",
            callback='ProcessJokeJson',
            callbackOP=self.ownerComp,
            jsonmode=True  # Request JSON output
        )

    def ProcessJokeJson(self, response_content, callbackInfo=None):
        try:
            joke_data = json.loads(response_content)
            print(f"Setup: {joke_data.get('setup')}")
            print(f"Punchline: {joke_data.get('punchline')}")
        except json.JSONDecodeError:
            print(f"Error: Could not parse JSON response: {response_content}")

    # --- Conversation History Example ---
    def ContinueConversation(self):
        if not self.ChatTD: return
        # Example history (replace with actual retrieval logic)
        history = [
            {"role": "system", "content": "You are a helpful pirate."},
            {"role": "user", "content": "What's the weather like?"},
            {"role": "assistant", "content": "Arr, the seas be calm today, matey!"}
        ]
        # Add the new user message
        new_message = self.ownerComp.par.NewMessage.eval()
        history.append({"role": "user", "content": new_message})
        self.ChatTD.Customapicall(
            message=history,            # Pass the whole list
            callback='ProcessFact',     # Reuse callback
            callbackOP=self.ownerComp,
            temperature=0.5             # Override temperature
        )
# --- End Example ---

This example illustrates basic calls, JSON mode, callbacks, and passing conversation history. Refer to the full ChatExt.py extension within ChatTD for the complete parameter list and implementation details.
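
The condensed example covers text-only calls; the multimodal and streaming parameters are passed the same way. The sketch below assumes the same ChatTDExamplesEXT class, a multimodal model selected on ChatTD, and a hypothetical Imagefile parameter on the component. How streamed chunks are delivered to the callback depends on the ChatExt.py implementation, so verify it there before relying on this pattern.

# --- Image + Streaming Example ---
# Hypothetical method to add inside ChatTDExamplesEXT from the condensed example;
# 'Imagefile' is an assumed File parameter on the component.
def DescribeImage(self):
    if not self.ChatTD: return
    image_file = self.ownerComp.par.Imagefile.eval()
    self.ChatTD.Customapicall(
        message="Describe this image in one sentence.",
        image_path=image_file,      # image on disk, for multimodal models
        streaming=True,             # request the reply chunk by chunk
        callback='ProcessFact',     # reuse the simple text callback above
        callbackOP=self.ownerComp
    )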

Only parameters on the Config page are documented here, as they are the primary settings users need to adjust for basic operation.

API / Model Selection (Header)

  • Select API Server (Apiserver) op('chattd').par.Apiserver Menu
    Default: gemini
    Options: openrouter, openai, groq, gemini, ollama, lmstudio, custom
  • API Key (Apikey) op('chattd').par.Apikey Str
    Default: API KEY STORED
  • Find API Key [ via web ] (Openbrowserorai) op('chattd').par.Openbrowserorai Pulse
  • Custom url (Customurl) op('chattd').par.Customurl Str
    Default: http://127.0.0.1:1234
  • Connection (local) (Connected) op('chattd').par.Connected Momentary
  • Verbose Debug Logs (Verboselitellm) op('chattd').par.Verboselitellm Menu
    Default: OFF
    Options: DEBUG, INFO, WARNING, ERROR, CRITICAL, OFF

Select LLM Model + Additional Info (Header)

  • Model (Model) op('chattd').par.Model Menu
    Default: gemini-1.5-flash
    Options: gemini-1.0-pro-vision-latest, gemini-pro-vision, gemini-1.5-pro-latest, gemini-1.5-pro-001, gemini-1.5-pro-002, gemini-1.5-pro, gemini-1.5-flash-latest, gemini-1.5-flash-001, gemini-1.5-flash-001-tuning, gemini-1.5-flash, gemini-1.5-flash-002, gemini-1.5-flash-8b, gemini-1.5-flash-8b-001, gemini-1.5-flash-8b-latest, gemini-1.5-flash-8b-exp-0827, gemini-1.5-flash-8b-exp-0924, gemini-2.5-pro-exp-03-25, gemini-2.5-pro-preview-03-25, gemini-2.5-flash-preview-04-17, gemini-2.0-flash-exp, gemini-2.0-flash, gemini-2.0-flash-001, gemini-2.0-flash-exp-image-generation, gemini-2.0-flash-lite-001, gemini-2.0-flash-lite, gemini-2.0-flash-lite-preview-02-05, gemini-2.0-flash-lite-preview, gemini-2.0-pro-exp, gemini-2.0-pro-exp-02-05, gemini-exp-1206, gemini-2.0-flash-thinking-exp-01-21, gemini-2.0-flash-thinking-exp, gemini-2.0-flash-thinking-exp-1219, learnlm-1.5-pro-experimental, gemini-embedding-exp-03-07, gemini-embedding-exp, gemini-2.0-flash-live-001
  • Update Models (Updatemodels) op('chattd').par.Updatemodels Pulse
  • Model Info (Modelinfo) op('chattd').par.Modelinfo Str
    Default: 0k - Unknown, Free / Free
  • Open Model Info (Openmodelinfo) op('chattd').par.Openmodelinfo Pulse
  • Model Search (Modelsearch) op('chattd').par.Modelsearch Str
    Default: gemini
  • Ollama Download (Ollamadownload) op('chattd').par.Ollamadownload Pulse

Install Python Packages [ openai + tiktoken ] (Header)

  • Install LOPs Requirements [ LiteLLM ] (Installlops) op('chattd').par.Installlops Pulse
  • Python Venv Base Folder (Pythonvenv) op('chattd').par.Pythonvenv Folder
    Default: D:/TD-tox/LOPS_tox/INSTALLvenv
  • LOPs Lib Import RESET (Openailibreset) op('chattd').par.Openailibreset Pulse
  • Update LiteLLM (Pipupdate) op('chattd').par.Pipupdate Pulse

Current API Key Rate Limit [ openrouter.ai only ] (Header)

  • RL Usage (Rlusage) op('chattd').par.Rlusage Str
    Default: 0
  • RL Limit (Rllimit) op('chattd').par.Rllimit Str
    Default: 0
  • RL Requests Per Interval (Rlrequestsperinterval) op('chattd').par.Rlrequestsperinterval Str
    Default: 0
  • RL Interval (Rlinterval) op('chattd').par.Rlinterval Str
    Default: 0
  • Update Rate Info (Updaterateinfo) op('chattd').par.Updaterateinfo Pulse
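
These Config parameters can also be read or set from Python using the expressions listed above. A minimal sketch, assuming the ChatTD component is reachable as op('chattd'); adjust the path and values for your project:

chattd = op('chattd')                   # adjust the path to your ChatTD instance
chattd.par.Apiserver = 'gemini'         # choose the API backend by menu token
chattd.par.Model = 'gemini-1.5-flash'   # pick a model from the current menu
chattd.par.Verboselitellm = 'OFF'       # or 'DEBUG' while troubleshooting
chattd.par.Updatemodels.pulse()         # refresh the model menu for the selected server

# openrouter.ai only: refresh and read the current rate-limit info
chattd.par.Updaterateinfo.pulse()
print(chattd.par.Rlusage.eval(), '/', chattd.par.Rllimit.eval())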
Related components:

  • Agent - The primary LOP for complex, multi-turn interactions using models configured in ChatTD.
  • Python Manager - Manages Python environments and package installations for LOPs components. An instance is built into ChatTD.
  • TDAsyncIO - Handles asynchronous operations within TouchDesigner. An instance is built into ChatTD.
  • Key Manager - Securely stores and retrieves API keys used by ChatTD and other LOPs.