agents.components.semantic_router¶
Module Contents¶
Classes¶
RouterMode – Enum representing the operational modes of the SemanticRouter.
SemanticRouter – A unified component that routes semantic information from input topics to output topics.
API¶
- class agents.components.semantic_router.RouterMode(*args, **kwds)¶
Bases: enum.Enum
Enum representing the operational modes of the SemanticRouter. Modes:
LLM: Agentic mode using LLM for intent analysis and routing
VECTOR: Vector mode using embeddings and vector database for routing
- name()¶
The name of the Enum member.
- value()¶
The value of the Enum member.
- class agents.components.semantic_router.SemanticRouter(*, inputs: List[agents.ros.Topic], routes: List[agents.ros.Route], config: Optional[Union[agents.config.SemanticRouterConfig, agents.config.LLMConfig]] = None, db_client: Optional[agents.clients.db_base.DBClient] = None, model_client: Optional[agents.clients.model_base.ModelClient] = None, default_route: Optional[agents.ros.Route] = None, component_name: str, **kwargs)¶
Bases: agents.components.llm.LLM
A unified component that routes semantic information from input topics to output topics.
This component can operate in two modes:
Vector Mode (Standard): Uses a vector database to route inputs based on embedding similarity to route samples.
LLM Mode (Agentic): Uses an LLM to intelligently analyze intent and route inputs via function calling.
The mode is determined automatically based on the client provided (db_client vs model_client).
- Parameters:
inputs (list[Topic]) – A list of input text topics that this component will subscribe to.
routes (list[Route]) – A list of pre-defined routes that publish incoming input to the routed output topics.
default_route (Optional[Route]) – An optional route that specifies the default behavior when no specific route matches. In Vector Mode, this is used based on distance threshold. In LLM Mode, this is used if the model fails to select a route.
config (Union[SemanticRouterConfig, LLMConfig]) – The configuration object. Accepts SemanticRouterConfig (for Vector Mode parameters) or LLMConfig (if specific LLM settings are needed). Defaults to SemanticRouterConfig.
db_client (Optional[DBClient]) – (Vector Mode) A database client used to store and retrieve routing information.
model_client (Optional[ModelClient]) – (LLM Mode) A model client used for intelligent intent analysis and tool calling.
component_name (str) – The name of this Semantic Router component (default: “router_component”).
kwargs – Additional keyword arguments.
Example usage (Vector Mode):
# ... define topics and routes ...
config = SemanticRouterConfig(router_name="my_vector_router")
db_client = HTTPDBClient(db=ChromaDB(host="localhost", port=8080))
router = SemanticRouter(
    inputs=[input_text],
    routes=[route1, route2],
    db_client=db_client,
    config=config,
    component_name="router",
)
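Conceptually, Vector Mode compares the embedding of an incoming input against the embeddings of each route's sample phrases and publishes to the route whose closest sample falls within a distance threshold, falling back to the default route otherwise. The sketch below illustrates that selection logic with a toy bag-of-words embedding; the function names and the embedding are illustrative assumptions, not the library's actual implementation (which uses a real embedding model and a vector database).

```python
import math
from typing import Dict, List, Optional

def embed(text: str) -> Dict[str, float]:
    """Toy bag-of-words embedding; a real router uses a sentence-embedding model."""
    vec: Dict[str, float] = {}
    for word in text.lower().split():
        vec[word] = vec.get(word, 0.0) + 1.0
    return vec

def cosine_distance(a: Dict[str, float], b: Dict[str, float]) -> float:
    """1 - cosine similarity; 0.0 means identical, 1.0 means no overlap."""
    dot = sum(a.get(k, 0.0) * v for k, v in b.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    if na == 0 or nb == 0:
        return 1.0
    return 1.0 - dot / (na * nb)

def route_input(
    text: str,
    routes: Dict[str, List[str]],  # route name -> sample phrases
    default_route: Optional[str] = None,
    max_distance: float = 0.5,
) -> Optional[str]:
    """Return the route whose closest sample is within max_distance, else the default."""
    query = embed(text)
    best_name, best_dist = None, float("inf")
    for name, samples in routes.items():
        for sample in samples:
            dist = cosine_distance(query, embed(sample))
            if dist < best_dist:
                best_name, best_dist = name, dist
    return best_name if best_dist <= max_distance else default_route

routes = {
    "navigation": ["go to the kitchen", "move to the door"],
    "chat": ["tell me a joke", "how are you"],
}
print(route_input("go to the door", routes, default_route="chat"))  # -> navigation
```

This also shows why default_route matters in Vector Mode: an input far from every sample (distance above the threshold) would otherwise be dropped.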
Example usage (LLM Mode):
# ... define topics and routes ...
model_client = OllamaClient(
    model_name="llama3",
    checkpoint="llama3.1:latest",
    init_params={"temperature": 0.0},
)
router = SemanticRouter(
    inputs=[input_text],
    routes=[route1, route2],
    model_client=model_client,
    component_name="smart_router",
)
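In LLM Mode, the routes are exposed to the model as automatically generated tools (see register_tool below), and the model's tool call selects the destination; if the model fails to pick a valid route, the default route is used. The sketch below captures that selection logic with a stubbed model in place of a real client; the tool schema and names are illustrative assumptions, not the library's internal format.

```python
from typing import Callable, List, Optional

def build_route_tools(route_names: List[str]) -> List[dict]:
    """Describe each route as a function-calling tool (OpenAI-style schema)."""
    return [
        {
            "type": "function",
            "function": {
                "name": f"route_to_{name}",
                "description": f"Forward the input to the '{name}' output topic.",
            },
        }
        for name in route_names
    ]

def select_route(
    text: str,
    route_names: List[str],
    call_model: Callable[[str, List[dict]], Optional[str]],
    default_route: Optional[str] = None,
) -> Optional[str]:
    """Ask the model to pick a route via a tool call; fall back to the default."""
    tools = build_route_tools(route_names)
    tool_name = call_model(text, tools)  # e.g. "route_to_navigation" or None
    if tool_name and tool_name.startswith("route_to_"):
        name = tool_name[len("route_to_"):]
        if name in route_names:
            return name
    return default_route

# Stubbed 'model': calls the navigation tool when the text mentions going somewhere.
def fake_model(text: str, tools: List[dict]) -> Optional[str]:
    return "route_to_navigation" if "go" in text.lower() else None

print(select_route("go to the door", ["navigation", "chat"], fake_model, "chat"))  # -> navigation
```

Unlike Vector Mode, no sample embeddings are needed here; the model reasons over the route descriptions directly, which is why this mode is described as agentic.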
- custom_on_configure()¶
Create model client if provided and initialize model.
- custom_on_deactivate()¶
Deactivate component.
- abstractmethod set_component_prompt(template: Union[str, pathlib.Path]) None¶
LOCKED: The SemanticRouter does not use component prompts.
- abstractmethod set_topic_prompt(input_topic: agents.ros.Topic, template: Union[str, pathlib.Path]) None¶
LOCKED: Input topics are routed as-is.
- abstractmethod register_tool(tool, tool_description, send_tool_response_to_model=False) None¶
LOCKED: Tools are automatically generated from ‘Route’ objects.
- abstractmethod add_documents(ids, metadatas, documents) None¶
LOCKED: Document storage is managed via Route samples.
- set_system_prompt(prompt: str) None¶
Set system prompt for the model, which defines the model's 'personality'.
- Parameters:
prompt – string or a path to a file containing the string.
- Return type:
None
Example usage:
llm_component = LLM(
    inputs=[text0],
    outputs=[text1],
    model_client=model_client,
    config=config,
    component_name="llama_component",
)
llm_component.set_system_prompt(
    prompt="You are an amazing and funny robot. You answer all questions with short and concise answers."
)
- property additional_model_clients: Optional[Dict[str, agents.clients.model_base.ModelClient]]¶
Get the dictionary of additional model clients registered to this component.
- Returns:
A dictionary mapping client names (str) to ModelClient instances, or None if not set.
- Return type:
Optional[Dict[str, ModelClient]]
- change_model_client(model_client_name: str) bool¶
Hot-swap the active model client at runtime.
This method replaces the component's current model_client with one from the registered additional_model_clients. It handles the safe de-initialization of the old client and initialization of the new one.
This is commonly used as a target for Actions in the Event system.
- Parameters:
model_client_name (str) – The key corresponding to the desired client in additional_model_clients.
- Returns:
True if the swap was successful, False otherwise (e.g., if the name was not found or initialization failed).
- Return type:
bool
- Example:
from agents.ros import Action

# Define an action to switch to the 'local_backup' client defined previously
switch_to_local = Action(
    method=brain.change_model_client,
    args=("local_backup",),
)

# Trigger this action if the component fails (e.g. internet loss)
brain.on_component_fail(action=switch_to_local, max_retries=3)
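The swap that change_model_client performs can be sketched as: look up the named client, de-initialize the old one, initialize the new one, and keep the old client active if initialization fails. The minimal sketch below uses stand-in classes (FakeClient, Component are illustrative names, not the library's types) to show that rollback-on-failure pattern.

```python
from typing import Dict

class FakeClient:
    """Stand-in for a ModelClient with init/deinit hooks."""
    def __init__(self, name: str, healthy: bool = True):
        self.name = name
        self.healthy = healthy
        self.initialized = False

    def initialize(self) -> bool:
        self.initialized = self.healthy
        return self.healthy

    def deinitialize(self) -> None:
        self.initialized = False

class Component:
    def __init__(self, model_client: FakeClient, additional: Dict[str, FakeClient]):
        self.model_client = model_client
        self.additional_model_clients = additional

    def change_model_client(self, model_client_name: str) -> bool:
        """Swap the active client; restore the old one if the new one fails to init."""
        new_client = self.additional_model_clients.get(model_client_name)
        if new_client is None:
            return False  # unknown name
        old_client = self.model_client
        old_client.deinitialize()
        if not new_client.initialize():
            old_client.initialize()  # roll back to the previous client
            return False
        self.model_client = new_client
        return True

comp = Component(FakeClient("remote"), {"local_backup": FakeClient("local")})
print(comp.change_model_client("local_backup"))  # -> True; active client is 'local'
```

Returning False on both an unknown name and a failed initialization matches the documented bool contract above, so an Action triggered by on_component_fail can be retried safely.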
- property warmup: bool¶
Enable warmup of the model.
- custom_on_activate()¶
Custom configuration for creating triggers.
- create_all_subscribers()¶
Override to handle trigger topics and fixed inputs. Called by the parent BaseComponent.
- activate_all_triggers() None¶
Activates component triggers by attaching execution step to callbacks
- destroy_all_subscribers() None¶
Destroys all node subscribers