
Providers and Models

ChatTD is the central model router in LOPs. It handles provider selection, API keys, and model defaults. Most operators route through ChatTD rather than calling providers directly.

  1. Open the Settings UI (ChatTD tab)

     Use the ChatTD action in the OP Create Dialog, or open the Settings UI directly.

  2. Choose a provider (API Server menu)

     Pick a cloud provider, local server, or Custom endpoint.

  3. Add your key (cloud only)

     Paste the API key into ChatTD. Local providers usually skip this.

  4. Pick a model (model menu)

     Refresh the model list or type a model ID.

New Agents inherit ChatTD’s provider and model when you place them. You can always override a specific Agent on its own Model page.

At a glance

  • Cloud: OpenRouter, OpenAI, Anthropic, Groq, Gemini
  • Local: Ollama, LM Studio, or any localhost server
  • Custom endpoints: any OpenAI-compatible /v1 server
  • Routing: LiteLLM
  • Keys: stored locally by ChatTD
  • Per-Agent override: Agent Model page

Cloud providers require an API key. Click Get API Key in ChatTD to open that provider’s key page in your browser.

Local providers like Ollama and LM Studio run on localhost. See the Local Models guide for setup details.

Custom endpoints connect to any server that exposes OpenAI-compatible /v1 routes — proxies, local runtimes, or self-hosted models. Set a provider name, URL, and LiteLLM prefix in ChatTD’s Custom URL page, and the new entry appears in the API Server dropdown.
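"OpenAI-compatible" means the server accepts the standard chat-completions request shape. As a rough sketch of what any such endpoint must handle, here is the JSON body posted to `/v1/chat/completions` — the base URL and model name are placeholders, not LOPs defaults:

```python
import json

# Assumed custom endpoint; substitute your own server's address.
BASE_URL = "http://localhost:8000/v1"

def build_chat_request(model, messages):
    """Build the JSON body for POST {BASE_URL}/chat/completions."""
    return {
        "model": model,
        "messages": messages,
    }

payload = build_chat_request(
    "my-local-model",  # hypothetical model ID
    [{"role": "user", "content": "Hello"}],
)
body = json.dumps(payload)
```

Any server that answers this request shape (a proxy, a local runtime, a self-hosted model) can sit behind a Custom endpoint entry.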

Different workflows need different model capabilities. Before wiring tools into an Agent, check whether your model supports what the workflow needs:

  • Text chat
  • Tool / function calling
  • Vision input
  • Audio input
  • JSON or structured output
  • Streaming
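
One way to make this check explicit is a small capability table for the models you use. This is an illustrative sketch only — the entries below are assumptions, not values read from ChatTD or any provider:

```python
# Hypothetical capability table; fill in from your provider's model docs.
CAPABILITIES = {
    "gpt-4o": {"tools", "vision", "json", "streaming"},
    "my-local-model": {"streaming"},  # hypothetical local model
}

def model_supports(model: str, needed: set) -> bool:
    """Return True if every required capability is listed for the model."""
    return needed <= CAPABILITIES.get(model, set())

# An Agent that wires in tools needs tool calling at minimum:
model_supports("my-local-model", {"tools"})  # False: lacks tool calling
```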

If a model returns raw function-looking text instead of structured tool calls, the issue is usually the model or provider’s tool-call support, not the Agent wiring. The Local Models guide covers this in detail for local setups.
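The symptom is easy to spot programmatically. Assuming OpenAI-style response dicts (field names follow the chat-completions schema), a model with working tool-call support populates `tool_calls` on the assistant message; a model without it dumps function-looking text into `content`:

```python
def has_structured_tool_calls(response: dict) -> bool:
    """True if the assistant message carries parsed tool calls rather than
    function-looking text in `content`."""
    message = response["choices"][0]["message"]
    return bool(message.get("tool_calls"))

# Structured tool call: arguments arrive as a parsed field.
good = {"choices": [{"message": {"content": None,
        "tool_calls": [{"function": {"name": "get_weather",
                                     "arguments": "{\"city\": \"Paris\"}"}}]}}]}
# Raw text: the model wrote the call into content as plain text.
bad = {"choices": [{"message": {
        "content": "<function=get_weather>{\"city\": \"Paris\"}</function>"}}]}

has_structured_tool_calls(good)  # True
has_structured_tool_calls(bad)   # False
```

If you see the second shape, switch models or providers rather than rewiring the Agent.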

ChatTD stores keys and provider settings locally. LOPs can also write trace logs when debugging provider calls — see Local Data and Files for details on what gets stored and where.