Summarize Operator
Overview
The Summarize LOP leverages AI models to generate summaries of conversations, tables, or text. It supports various summary types (brief, detailed, bullet points, action items) and integrates with different AI API servers (OpenRouter, OpenAI, Groq, Ollama, LM Studio, Custom) for model selection. This operator is particularly useful for quickly condensing large amounts of information into digestible formats, aiding in decision-making and information retrieval. It uses a shared sidecar server for asynchronous tasks, improving performance and efficiency.
Requirements
- Sidecar Server: Must be running (shared with other LOPs like Florence).
- Python Packages: Required based on selected API server (e.g., openai).
- ChatTD Operator: Required and must be configured.
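If you are unsure whether the package for your chosen API server is available in TouchDesigner's Python environment, a quick check like this sketch can help (the openai package name comes from the example above; substitute whichever package your API server needs):

# Run from a Text DAT to confirm the package for the selected API server imports.
try:
    import openai  # package name from the example above; adjust for your server
    print('openai package available:', getattr(openai, '__version__', 'unknown'))
except ImportError:
    print('openai package not found - install it for the selected API server')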
Input/Output
Inputs
- Conversation (Table DAT): Requires role and message columns.
- Table (Table DAT): Generic table data.
- Text (Text DAT): Plain text content.
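For the Conversation input, a sketch like the following builds a Table DAT with the required role and message columns (the table name and sample rows are placeholders):

conv = op('conversation_log')  # an empty Table DAT
conv.clear()
conv.appendRow(['role', 'message'])  # header row expected by the operator
conv.appendRow(['user', 'Can we move the demo to Friday afternoon?'])
conv.appendRow(['assistant', 'Yes, Friday at 2 PM works for the whole team.'])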
Outputs
- Summary (Text DAT): Contains the generated summary.
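The usage examples below read the result from an internal DAT; a minimal way to fetch the finished summary from Python might look like this (the summary_dat path is taken from those examples and may differ in your build):

summary_dat = op('summarize1').op('summary_dat')
if summary_dat is not None and summary_dat.text:
    print(summary_dat.text)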
Parameters
Summary Page
Section titled “Summary Page”op('summarize').par.Customprompt
String - Default:
None
op('summarize').par.Autocall
Toggle - Default:
Off
op('summarize').par.Active
Toggle - Default:
Off
op('summarize').par.Call
Pulse - Default:
None
op('summarize').par.Jsonmode
Toggle - Default:
Off
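As a sketch, these parameters can be driven from Python like any other TouchDesigner parameters; the operator path and values here are placeholders, and Jsonmode is only shown being toggled, not interpreted:

s = op('summarize1')
s.par.Active = True                       # enable the operator
s.par.Customprompt = 'Focus on decisions and open questions.'
s.par.Jsonmode = False                    # toggle structured output on or off
s.par.Autocall = False                    # leave automatic calls off; trigger manually
s.par.Call.pulse()                        # run the summarization once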
Model Page
Understanding Model Selection
Operators utilizing LLMs (LOPs) offer flexible ways to configure the AI model used:
- ChatTD Model (Default): By default, LOPs inherit model settings (API Server and Model) from the central ChatTD component. You can configure ChatTD via the "Controls" section in the Operator Create Dialog or its parameter page.
- Custom Model: Select this option in "Use Model From" to override the ChatTD settings and specify the API Server and AI Model directly within this operator.
- Controller Model: Choose this to have the LOP inherit its API Server and AI Model parameters from another operator (like a different Agent or any LOP with model parameters) specified in the Controller [ Model ] parameter. This allows centralizing model control (see the configuration sketch below).
The Search toggle filters the AI Model dropdown based on keywords entered in Model Search. The Show Model Info toggle (if available) displays detailed information about the selected model directly in the operator's viewer, including cost and token limits.
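A configuration sketch for the controller and search options described above. Modelcontroller, Search, Modelsearch, Maxtokens, and Temperature are the parameters documented below; the Usemodelfrom name is an assumption for the 'Use Model From' menu and the controller path is a placeholder:

s = op('summarize1')
# s.par.Usemodelfrom = 'controller_model'  # assumed parameter name for 'Use Model From'
s.par.Modelcontroller = op('chat_agent1').path  # operator whose API Server / AI Model are inherited (placeholder path)
s.par.Search = True          # filter the AI Model dropdown...
s.par.Modelsearch = 'gpt'    # ...to models matching this pattern
s.par.Maxtokens = 256        # cap on generated tokens
s.par.Temperature = 0        # deterministic summaries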
Available LLM Models + Provider Resources
Each supported provider has its own API key page and documentation. For a complete and up-to-date list of providers and models, see the LiteLLM provider docs.
op('summarize').par.Maxtokens
Int - The maximum number of tokens the model should generate. Default: 256
op('summarize').par.Temperature
Float - Controls randomness in the response. Lower values are more deterministic. Default: 0
op('summarize').par.Modelcontroller
OP - Operator providing model settings when 'Use Model From' is set to controller_model. Default: None
op('summarize').par.Search
Toggle - Enable dynamic model search based on a pattern. Default: off. Options: off, on
op('summarize').par.Modelsearch
Str - Pattern to filter models when Search is enabled. Default: "" (Empty String)
Callbacks Page
op('summarize').par.Callbackdat
DAT - Default: None
op('summarize').par.Editcallbacksscript
Pulse - Default: None
op('summarize').par.Createpulse
Pulse - Default: None
op('summarize').par.Onsummarycomplete
Toggle - Default: On
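A sketch for wiring up callbacks: point Callbackdat at a Text DAT that defines onSummaryComplete (see the Callbacks section below). The DAT name here is a placeholder:

s = op('summarize1')
s.par.Callbackdat = op('summarize_callbacks').path  # Text DAT containing onSummaryComplete
s.par.Onsummarycomplete = True                      # enable the completion callback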
About Page
op('summarize').par.Showbuiltin
Toggle - Default: Off
op('summarize').par.Version
String - Default: 1.0.0
op('summarize').par.Lastupdated
String - Default: 2024-11-06
op('summarize').par.Chattd
OP - Default: /dot_lops/ChatTD
op('summarize').par.Creator
String - Default: dotsimulate
op('summarize').par.Website
String - Default: https://dotsimulate.com
op('summarize').par.Bypass
Toggle - Default: Off
Callbacks
onSummaryComplete
def onSummaryComplete(info):
    # Called when the summarization process finishes successfully.
    # The info dictionary contains details like:
    # - op: The Summarize operator
    # - summary: The generated summary text
    # - inputType: 'conversation', 'table', or 'text'
    print(f"Summary generated: {info.get('summary', '')[:50]}...")
    # Example: op('summary_display_text').text = info.get('summary')
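A slightly fuller variant, assuming the same info keys shown above; the summary_log table is a hypothetical Table DAT used to collect results:

def onSummaryComplete(info):
    summary = info.get('summary', '')
    source = info.get('inputType', 'unknown')
    log = op('summary_log')  # hypothetical Table DAT with source/summary columns
    if log is not None:
        log.appendRow([source, summary])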
Performance Considerations
- Relies on a shared sidecar server; ensure it’s running.
- Max Tokens affects summary length and processing time.
- Model choice impacts speed and quality; experiment with different providers/models.
- Monitor sidecar server resources, especially with large inputs or concurrent tasks.
Usage Examples
Summarizing a Conversation
summarize_op = op('summarize1')
conv_dat = op('conversation_log')  # Table DAT with role, message cols
summarize_op.inputConnectors[0].connect(conv_dat)
summarize_op.par.Inputtype = 'conversation'
summarize_op.par.Summarytype = 'brief'
summarize_op.par.Call.pulse()
# Summary result in summarize_op.op('summary_dat')
Using a Custom Prompt for Table Summary
summarize_op = op('summarize1')
table_dat = op('sales_data')
summarize_op.inputConnectors[1].connect(table_dat)  # Connect to the Table input
summarize_op.par.Inputtype = 'table'
summarize_op.par.Summarytype = 'bullet points'
summarize_op.par.Customprompt = "Summarize the key sales trends from this table."
summarize_op.par.Call.pulse()
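Along the same lines, a sketch for plain text input with JSON mode enabled; the assumption that the Text input is the third connector (index 2) follows the input order listed above and may need adjusting:

summarize_op = op('summarize1')
text_dat = op('article_text')  # Text DAT holding the content to condense
summarize_op.inputConnectors[2].connect(text_dat)  # index 2 assumed for the Text input
summarize_op.par.Inputtype = 'text'
summarize_op.par.Summarytype = 'action items'
summarize_op.par.Jsonmode = True  # request machine-readable output
summarize_op.par.Call.pulse()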
Common Use Cases
- Condensing meeting transcripts.
- Extracting key info from data tables.
- Summarizing customer feedback.
- Creating executive summaries of reports.
- Automating news article summarization.