Pre-Alpha · Open Source · Apache 2.0

AI that commands
real robots.

APYROBO is the open-source AI orchestration layer for robotics. Connect any LLM to any robot — safely, reliably, without rewriting your ROS 2 stack.

120+
Tests in suite
4
Hardware adapters
Any LLM, any robot
Apache 2.0
Open source license
Why APYROBO
The AI orchestration layer for physical robots

APYROBO gives AI agents the runtime to act in the physical world. Capability discovery, skill orchestration, swarm coordination, and safety enforcement — one layer, any hardware, any LLM.

🛡️
Safe by construction
Speed clamping, collision zones, watchdog, and battery checks wrap every command. Agents can plan — they can’t bypass safety constraints.
🔌
Hardware-agnostic
Capability adapters abstract any robot behind a semantic API. The same agent code runs on a TurtleBot, a UR arm, or a Boston Dynamics Spot.
🧩
Composable skill graphs
DAG-based task plans with precondition and postcondition verification. Skills chain together — and retry gracefully when the physical world doesn’t cooperate.
🤖
Any LLM, any provider
Model-agnostic agent layer works with OpenAI, Anthropic, Ollama, or any model via LiteLLM. Swap your backend without touching robot code.
🐝
Swarm coordination built in
Multi-robot task splitting, proximity safety, failure reassignment, and deadlock detection — handled natively, not bolted on as an afterthought.
📊
Full observability
Structured JSON logging, Prometheus metrics, OpenTelemetry traces, and execution replay. Know exactly what every agent and robot did, and why.
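Of these ideas, the skill graph is the most mechanical, so here is a minimal sketch of it in plain Python. The `Skill` class, the `run_graph` helper, and the condition callbacks are illustrative assumptions for this page, not APYROBO's actual API — the point is only to show what "DAG, pre/postconditions, retry" means concretely.

```python
# Illustrative skill-graph runner — class and function names are
# assumptions, not APYROBO's published interface.
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class Skill:
    name: str
    action: Callable[[dict], None]                 # mutates shared world state
    pre: Callable[[dict], bool] = lambda s: True   # must hold before acting
    post: Callable[[dict], bool] = lambda s: True  # verified after acting
    deps: List[str] = field(default_factory=list)  # DAG edges
    retries: int = 2


def run_graph(skills: Dict[str, Skill], state: dict) -> List[str]:
    """Execute skills in dependency order, verifying pre/postconditions."""
    done: set = set()
    order: List[str] = []
    while len(done) < len(skills):
        ready = [s for s in skills.values()
                 if s.name not in done and all(d in done for d in s.deps)]
        if not ready:
            raise RuntimeError("cycle or unsatisfiable dependency")
        for skill in ready:
            for _attempt in range(skill.retries + 1):
                if not skill.pre(state):
                    raise RuntimeError(f"{skill.name}: precondition failed")
                skill.action(state)
                if skill.post(state):      # the physical world cooperated
                    break
            else:
                raise RuntimeError(f"{skill.name}: postcondition never held")
            done.add(skill.name)
            order.append(skill.name)
    return order


# Two toy skills: navigate must succeed before pick may run.
state = {"at_goal": False, "holding": False}
plan = {
    "navigate": Skill("navigate", lambda s: s.update(at_goal=True),
                      post=lambda s: s["at_goal"]),
    "pick": Skill("pick", lambda s: s.update(holding=True),
                  pre=lambda s: s["at_goal"],
                  post=lambda s: s["holding"],
                  deps=["navigate"]),
}
```

Running `run_graph(plan, state)` executes `navigate` before `pick` because of the dependency edge, and retries a skill whose postcondition fails before giving up.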
Architecture
Everything your robot
needs to think and act.

APYROBO is the AI orchestration layer for robotics, built on ROS 2. It adds semantic capability discovery, LLM-driven planning, swarm coordination, and safety enforcement — while keeping the entire ROS 2 ecosystem intact underneath.

🧠
Foundation Models & Research
Vision-language models, policy networks, embodied AI datasets
Open X-Embodiment · HuggingFace Robotics · Multi-modal reasoning
🤖
AI Agents
LLM-powered agents that plan and decide. APYROBO is model-agnostic — use any LLM, cloud or local.
OpenAI · Anthropic · Ollama / Local · Custom agents
APYROBO — AI Orchestration Layer
The semantic layer above ROS 2: capability discovery, AI agent runtime, skill graph engine, swarm coordination, sensor pipelines, and safety policies
Capability API · Skill Graph · Swarm Comms · Sensor Pipelines · Safety Policies · LLM-Agnostic
Open Source
🔧
ROS 2 — Middleware Foundation
The industry standard for robot communication. APYROBO builds on ROS 2 natively — consuming topics, calling action servers, and publishing through standard interfaces.
DDS Messaging · Nav2 · MoveIt · TF Tree
🔬
Simulation Environments
Develop and test in sim, deploy to real hardware — the same APYROBO API works throughout
Gazebo · Isaac Sim · Webots · MuJoCo
🦾
Physical Hardware
One framework across any robot, any vendor, any form factor — no vendor lock-in
Boston Dynamics · Universal Robots · Unitree · Agility Robotics · Custom
Execution Model
Physical systems fail.
APYROBO handles it.

Software is deterministic. Robots aren’t. APYROBO’s built-in execution engine treats uncertainty as a first-class concern — surfacing risk before it becomes failure, and recovering gracefully when failure happens anyway.

01 / PRE-EXECUTION
Capability Contract
Before execution begins, the runtime surfaces a confidence estimate and known risk factors based on current environment state.
task: deliver_package
confidence: 92%
risk: crowded_hallway
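As data, the contract shown above might look like the following sketch. The field names echo the panel; the `CapabilityContract` shape and the `should_proceed` threshold are assumptions for illustration.

```python
# Hypothetical pre-execution capability contract — field names mirror
# the example above; this shape is an assumption, not APYROBO's API.
from dataclasses import dataclass, field
from typing import List


@dataclass(frozen=True)
class CapabilityContract:
    task: str
    confidence: float                 # 0.0–1.0, estimated from environment state
    risk_factors: List[str] = field(default_factory=list)

    def should_proceed(self, threshold: float = 0.8) -> bool:
        """Let the caller gate execution on the runtime's own estimate."""
        return self.confidence >= threshold


contract = CapabilityContract("deliver_package", 0.92, ["crowded_hallway"])
```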
02 / DURING
Streaming Verification
A live state machine tracks execution at every step. Deviations are surfaced immediately, not at task completion.
→ position_verified
→ object_detected
→ grasp_confirmed
→ delivered ✓
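A toy version of streaming verification: each step reports its state the moment it happens, so a deviation surfaces mid-task instead of at completion. The generator shape and step names are illustrative, mirroring the trace above.

```python
# Toy streaming verifier — the API is an illustration, not APYROBO's.
from typing import Iterable, Iterator, Tuple


def stream_verify(steps: Iterable[Tuple[str, bool]]) -> Iterator[str]:
    """Yield one status line per step; stop at the first deviation."""
    for name, ok in steps:
        if not ok:
            yield f"✗ {name}"   # surfaced immediately, not at task end
            return
        yield f"→ {name}"


trace = list(stream_verify([
    ("position_verified", True),
    ("object_detected", True),
    ("grasp_confirmed", True),
    ("delivered", True),
]))
```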
03 / ON FAILURE
Recovery Policy
Each failure mode maps to a defined recovery path — retry, reroute, escalate, or abort — determined by policy, not the calling agent.
grasp_fail → retry(2)
path_blocked → reroute
sensor_loss → abort
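The mappings above amount to a lookup table owned by policy, not by the agent. A minimal sketch, assuming a simple `(action, budget)` encoding that is not APYROBO's real representation:

```python
# Illustrative recovery-policy table — failure modes and handlers are
# examples from the panel above, not APYROBO's real identifiers.
from typing import Dict, Tuple

RECOVERY_POLICY: Dict[str, Tuple[str, int]] = {
    "grasp_fail":   ("retry", 2),    # re-attempt up to 2 times
    "path_blocked": ("reroute", 1),  # replan once around the obstacle
    "sensor_loss":  ("abort", 0),    # unsafe to continue
}


def recover(failure: str) -> Tuple[str, int]:
    """The policy, not the calling agent, decides the recovery path."""
    return RECOVERY_POLICY.get(failure, ("escalate", 0))  # unknown → human
```

Because the table lives in the infrastructure layer, an agent that hits `grasp_fail` gets the retry budget the policy defines, not whatever it would have chosen for itself.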
04 / ALWAYS
Safety Envelope
Hard constraints enforced at the infrastructure layer — independent of agent instructions. No agent can override them.
max_speed: 1.2m/s
collision_zone: OFF
human_proximity: 0.5m
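Enforcement at the infrastructure layer means the limits apply to every command, whatever the agent requested. A minimal sketch of that clamp, with the limit values taken from the panel above and the class shape assumed for illustration:

```python
# Hypothetical safety-envelope check — the hard limits live below the
# agent layer, so every command passes through the clamp.
from dataclasses import dataclass


@dataclass(frozen=True)
class SafetyEnvelope:
    max_speed: float = 1.2           # m/s, from the panel above
    min_human_distance: float = 0.5  # m

    def clamp_speed(self, requested: float) -> float:
        """An agent may request any speed; it receives at most max_speed."""
        return min(max(requested, 0.0), self.max_speed)

    def allows_motion(self, nearest_human_m: float) -> bool:
        return nearest_human_m >= self.min_human_distance


env = SafetyEnvelope()
```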
Hardware Adapters
Any robot.
One API.

APYROBO works with any robot through capability adapters. Bring your own hardware — or write a new adapter in minutes.

mock://
MockAdapter
Unit testing and development — no hardware or ROS 2 required.
gazebo://
GazeboAdapter
Full physics simulation. Develop in sim, deploy to real hardware unchanged.
mqtt://
MQTTAdapter
IoT and remote robots over MQTT brokers. Connect anything with a network interface.
http://
HTTPAdapter
REST-based robot APIs. Wrap any HTTP endpoint as a first-class APYROBO capability.
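Writing a new adapter might look roughly like this. The adapter interface sketched here — a `capabilities()` list plus an `execute()` entry point — is an assumption for illustration, not APYROBO's published base class.

```python
# Sketch of a custom capability adapter. The method shapes are
# assumptions for illustration, not APYROBO's published interface.
from typing import Any, Dict, List


class SerialArmAdapter:
    """Hypothetical adapter exposing a serial-port arm behind a semantic API."""

    scheme = "serial://"

    def capabilities(self) -> List[str]:
        return ["rotate_joint", "open_gripper", "close_gripper"]

    def execute(self, capability: str, params: Dict[str, Any]) -> Dict[str, Any]:
        if capability not in self.capabilities():
            return {"status": "rejected", "reason": "unknown capability"}
        # Real hardware I/O (e.g. a serial write) would go here.
        return {"status": "completed", "capability": capability}
```

Agent code stays the same whichever adapter answers: it discovers capabilities, issues semantic commands, and reads back a status.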
See It In Action
Five minutes.
No ROS 2 required.

Install APYROBO, connect to a mock robot, and execute your first natural language task — all without a physical robot or a running ROS 2 stack. Real hardware when you’re ready.

quickstart.py
# pip install -e ".[dev]"
from apyrobo import Robot, Agent

# No ROS 2 needed for mock adapter
robot = Robot.discover("mock://turtlebot4")
caps = robot.capabilities()
# → ['navigate_to', 'rotate', 'pick_object', 'place_object']

# No API key needed for rule-based agent
agent = Agent(provider="rule")
result = agent.execute(
    "go to 3, 4 and pick up the object",
    robot,
)
print(result.status) # → completed

Build smarter robots.
Open source.

APYROBO is free, open, and actively developed. Star the repo, join the community, and follow along.

© 2025 APYROBO
Apache 2.0
Built on ROS 2
LLM-Agnostic
Pre-Alpha