EmbodiedAgents
Automatika Robotics

API Reference

This page contains auto-generated API reference documentation [1].

  • agents
    • agents.clients
    • agents.components
    • agents.models
    • agents.ros
    • agents.vectordbs
[1] Created with sphinx-autodoc2


© 2026, Automatika Robotics

Made with Sphinx and Shibuya theme.