COMPLIANCE LAYER · ISO 13482 · EU AI ACT

Ship humanoid robots to regulated markets
In days, not months

Validate every robot decision before execution

Log every action with a cryptographic fingerprint

Export audit-ready reports your regulator will accept

THE PROBLEM

When something goes wrong, most teams can't answer one question

Robots are getting smarter — LLM agents, learned policies, autonomous decisions. But when something goes wrong, and it will, most teams can't answer: why did the robot do that?

Partenit was built to answer that question — before the incident, and after. Every decision signed, reproducible, explainable.

HOW IT WORKS

Three guarantees, one middleware

Drop Partenit between your robot's AI planner and its motors. Every action becomes validated, logged, and explainable — with zero changes to your existing code.

01 / VALIDATE

Check before execution

AgentGuard evaluates every action against your safety policies. Unsafe commands are clamped or blocked automatically, deterministically, in under 10ms.
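The clamp-or-block verdict can be sketched in a few lines. This is a hypothetical illustration with made-up thresholds, not the real AgentGuard engine (which is driven by your policy files):

```python
# Hypothetical sketch of the clamp-or-block pattern — field names and
# thresholds are illustrative, not the shipped AgentGuard API.

def evaluate(action: dict, context: dict) -> dict:
    """Return a verdict: block outright, clamp a parameter, or pass through."""
    dist = context.get("human_distance", float("inf"))
    if dist < 0.5:  # too close: block, never negotiate
        return {"allowed": False, "reason": "emergency_stop"}
    max_speed = 0.3 if dist < 2.0 else 2.0  # slow down near humans
    if action.get("speed", 0.0) > max_speed:
        return {"allowed": True, "modified_params": {"speed": max_speed}}
    return {"allowed": True, "modified_params": {}}
```

Because the check is plain threshold logic with no model in the loop, the same inputs always produce the same verdict, which is what makes a sub-10ms deterministic guarantee possible.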

02 / LOG

Sign every decision

Every action produces a signed DecisionPacket with risk score, applied policies, and SHA-256 fingerprint. Hash-chained and reproducible — forever.
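Hash-chaining is what makes the log tamper-evident: each fingerprint covers the previous one, so altering any historical packet invalidates everything after it. A minimal sketch of the idea (not the actual DecisionPacket schema):

```python
# Illustrative hash-chain sketch — packet fields are made up for the example.
import hashlib
import json

def fingerprint(packet: dict, prev_hash: str) -> str:
    """SHA-256 over the canonically serialized packet plus the previous
    fingerprint. sort_keys makes the serialization reproducible."""
    payload = json.dumps(packet, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

chain = ["0" * 64]  # genesis entry
for packet in ({"action": "navigate_to", "risk": 0.64},
               {"action": "pick", "risk": 0.12}):
    chain.append(fingerprint(packet, chain[-1]))
```

Verification is the same loop in reverse: recompute each fingerprint from the stored packets and compare against the recorded chain.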

03 / GRADE

Test before deploy

Run safety scenarios in simulation. Grade your controller A to F on collision rate, near-miss rate, and task completion. Find edge cases before they find your robot.
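To make the A-to-F idea concrete, here is one way three scenario metrics could collapse into a letter grade. The weights and cutoffs below are invented for illustration; the shipped safety bench defines its own rubric:

```python
# Hypothetical rubric — weights and cutoffs are assumptions, not the
# real safety-bench scoring.

def grade(collision_rate: float, near_miss_rate: float,
          completion_rate: float) -> str:
    """Collapse three scenario metrics into one letter grade.
    Collisions are penalized hardest, near misses less so."""
    score = completion_rate - 5 * collision_rate - 2 * near_miss_rate
    for letter, cutoff in (("A", 0.9), ("B", 0.75), ("C", 0.6), ("D", 0.4)):
        if score >= cutoff:
            return letter
    return "F"
```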

GET STARTED

One line of code, any platform

Same guard, same policies, same audit log — whether you run in pure Python, Isaac Sim, ROS 2, or intercept LLM tool calls. The only thing that changes is the adapter.

# No hardware needed — runs anywhere

from partenit.adapters import MockRobotAdapter
from partenit.agent_guard import GuardedRobot

adapter = MockRobotAdapter()
adapter.add_human("worker-1", x=1.2, y=0.0)

robot = GuardedRobot(adapter, policy_path="warehouse.yaml")
decision = robot.navigate_to(zone="shipping", speed=2.0)

# → speed clamped:  2.0 → 0.3
# → policy:         human_proximity_slowdown
# → risk score:     0.64
# → grade:          B

# Start h1_bridge.py in your Isaac Sim scene first
# examples/isaac_sim/run_bridge.sh

from partenit.adapters import IsaacSimAdapter
from partenit.agent_guard import GuardedRobot

adapter = IsaacSimAdapter(base_url="http://localhost:8000")

robot = GuardedRobot(adapter, policy_path="warehouse.yaml")
decision = robot.navigate_to(zone="shipping", speed=2.0)

# Same guard. Same policies. Same logs.
# → speed clamped:  2.0 → 0.3
# → risk score:     0.64
# → fingerprint:    a3f9c8...

# Humble / Iron / Jazzy — only the adapter changes

from partenit.adapters import ROS2Adapter
from partenit.agent_guard import GuardedRobot

adapter = ROS2Adapter(node_name="partenit_guard")

robot = GuardedRobot(adapter, policy_path="warehouse.yaml")
decision = robot.navigate_to(zone="shipping", speed=2.0)

# → allowed:         True
# → modified_params: {speed: 0.3}
# → policies fired:  ['human_proximity_slowdown']

# Intercept LLM tool calls before execution

from partenit.adapters import LLMToolCallGuard

guard = LLMToolCallGuard(policy_path="warehouse.yaml")

# LLM produces this tool call:
tool_call = {"action": "navigate_to", "speed": 2.5}
result = guard.check(tool_call, context={"human_distance": 1.1})

# → allowed:    False
# → reason:     emergency_stop (human at 1.1m)
# → suggested:  retry when distance > 1.5m

BUILT FOR

Three audiences, one platform

Partenit solves a different problem for each — with the same stack underneath.

ROBOT VENDOR

You ship hardware, your customer demands a platform

"Our robots work. But every enterprise customer asks for ISO certification, audit log, operator UI, HITL approval. That's 12 months of work we don't want to build."

Ship Partenit with every robot. Your customer deploys safely on day one — with a compliance bundle their regulator will accept.

Typical: Unitree, Figure, Noetix-class vendors

ENTERPRISE BUYER

Three robots, three vendors, nothing fits together

"Each robot has its own SDK, its own dashboard, its own compliance story. We can't scale this. Our safety team needs one source of truth."

One console for the whole fleet, whatever the brand. One audit report for your regulator. One trained operator covers all vendors.

Typical: logistics, automotive, hospitality, healthcare

FLEET OPERATOR

You rent robots by the hour, integration eats your margin

"Every client deployment is a custom project. Three months of engineering before the robot earns a dollar. We can't scale past a handful of customers."

Plug in the robot, pick a profession pack, hand over the console. A deployment takes a week, not a quarter — and every hour is audited.

Typical: RaaS operators, integrators, leasing fleets

WHAT THIS UNLOCKS

Three shifts, measurable ones

12mo → 1wk

Time to regulated deployment

Safety rules, audit trail, operator console, HITL approval — already built, already tested. Your team doesn't rebuild what the regulators already expect.

1 → Any

Vendor lock-in removed

Unitree, Figure, Agility, Noetix — one adapter per vendor, one console on top. Swap hardware without rewriting operations.

Human → Machine

Compliance made machine-checkable

Rules tied to ISO 13482, ISO 10218, EU AI Act, EU Machinery Directive. Every action checked in under 10ms, with the citation ready for audit.
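As an illustration of what "machine-checkable" means here, a rule could carry its regulatory citation next to its thresholds. The field names below are assumptions for the sketch, not the shipped policy DSL:

```yaml
# Hypothetical rule shape — field names are illustrative, not the real DSL.
- name: human_proximity_slowdown
  when: human_distance < 2.0    # metres
  then: clamp speed to 0.3      # m/s
  citation: "ISO 13482"         # surfaced verbatim in the audit report
```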

PRICING

Open core, enterprise scale

The safety engine, policy DSL, adapters, and decision log are Apache 2.0 — free forever. Pay only for what scales with your fleet.

OPEN SOURCE

Free forever

Apache 2.0 · on GitHub
  • Policy DSL + evaluation engine
  • Basic risk scoring (distance, velocity, trust)
  • All adapters: Mock, ROS 2, Isaac Sim, Unitree, Gazebo, HTTP, LLM
  • Safety bench + built-in scenarios
  • Decision log with SHA-256 fingerprint verification
  • Analyzer web UI
  • GitHub Action for CI integration
View on GitHub

Talk to the team

30 minutes. We walk you through your deployment, show the live platform, and leave you with a compliance bundle your regulator will recognize.

Or read the technical documentation