# Mecha10 Behavior Patterns
Common behavior patterns as reusable BehaviorNode implementations for robotics and AI systems.
## Overview
This package provides two fundamental behavior patterns that solve common robotics problems:
- Subsumption - Priority-based layered control for safety-critical systems
- Ensemble - Multi-model fusion with various voting strategies
Both patterns implement the BehaviorNode trait from mecha10-behavior-runtime, making them composable with other behaviors.
## Installation

```toml
[dependencies]
mecha10-behavior-patterns = "0.1.23"
```
## Patterns

### Subsumption Architecture
The subsumption architecture organizes behaviors into priority layers. Higher-priority behaviors can override lower-priority ones, which is essential for safety-critical systems.
#### How It Works
- Layers are executed in order of priority (highest first)
- If a layer returns `Success` or `Running`, that status is returned immediately
- If a layer returns `Failure`, the next lower-priority layer is tried
- If all layers fail, the node returns `Failure`
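The layer-selection logic above can be sketched in plain Rust. This is a simplified, synchronous illustration using stand-in types, not the crate's actual API:

```rust
#[derive(Debug, Clone, Copy, PartialEq)]
enum NodeStatus { Success, Failure, Running }

/// Tick layers from highest to lowest priority and return the first
/// non-Failure status; if every layer fails, return Failure.
fn tick_subsumption(layers: &mut [(i32, Box<dyn FnMut() -> NodeStatus>)]) -> NodeStatus {
    // Sort so the highest-priority layer runs first.
    layers.sort_by(|a, b| b.0.cmp(&a.0));
    for (_priority, layer) in layers.iter_mut() {
        match layer() {
            NodeStatus::Failure => continue, // try the next lower layer
            status => return status,         // Success or Running wins immediately
        }
    }
    NodeStatus::Failure
}

fn main() {
    let mut layers: Vec<(i32, Box<dyn FnMut() -> NodeStatus>)> = vec![
        (10, Box::new(|| NodeStatus::Failure)), // emergency stop: nothing wrong
        (5,  Box::new(|| NodeStatus::Failure)), // no obstacle in range
        (1,  Box::new(|| NodeStatus::Running)), // keep navigating
    ];
    println!("{:?}", tick_subsumption(&mut layers)); // Running
}
```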
#### Example: Robot Navigation with Safety
```rust
use mecha10_behavior_patterns::prelude::*;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let ctx = Context::new("robot").await?;

    // Create subsumption architecture
    let subsumption = SubsumptionNode::new()
        .add_named_layer(10, "emergency_stop", Box::new(EmergencyStopBehavior))
        .add_named_layer(5, "avoid_obstacle", Box::new(AvoidObstacleBehavior))
        .add_named_layer(1, "navigate", Box::new(NavigateBehavior));

    // Execute
    let mut executor = BehaviorExecutor::new(Box::new(subsumption), 30.0);
    executor.run_until_complete(&ctx).await?;
    Ok(())
}
```
#### Priority Levels
We recommend this priority scheme:
- 10: Emergency stop / safety critical
- 7-9: Reactive obstacle avoidance
- 4-6: Local navigation / path following
- 1-3: High-level planning / goal pursuit
#### Use Cases
- Safety systems: Emergency stop overrides all other behaviors
- Reactive control: Obstacle avoidance overrides navigation
- Hierarchical control: Multiple layers of abstraction
- Behavior arbitration: Multiple behaviors competing for control
### Ensemble Learning
The ensemble pattern combines multiple AI models or behaviors to make more robust decisions. This is useful for redundancy, fault tolerance, and multi-modal sensor fusion.
#### Strategies
- **Conservative**: All models must agree (high precision, low recall)

  ```rust
  let ensemble = EnsembleNode::new(EnsembleStrategy::Conservative)
      .add_model(Box::new(YoloV8), 1.0)
      .add_model(Box::new(YoloV10), 1.0);
  ```

- **Optimistic**: Any model can succeed (high recall, low precision)

  ```rust
  let ensemble = EnsembleNode::new(EnsembleStrategy::Optimistic)
      .add_model(Box::new(PreciseSensor), 1.0)
      .add_model(Box::new(FallbackSensor), 1.0);
  ```

- **Majority**: More than half must agree (balanced)

  ```rust
  let ensemble = EnsembleNode::new(EnsembleStrategy::Majority)
      .add_model(Box::new(Model1), 1.0)
      .add_model(Box::new(Model2), 1.0)
      .add_model(Box::new(Model3), 1.0);
  ```

- **WeightedVote**: Weighted voting with a configurable threshold

  ```rust
  let ensemble = EnsembleNode::new(EnsembleStrategy::WeightedVote)
      .with_threshold(0.6) // 60% threshold
      .add_model(Box::new(HighAccuracyModel), 0.5)
      .add_model(Box::new(FastModel), 0.3)
      .add_model(Box::new(BackupModel), 0.2);
  ```
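As a rough illustration of how a weighted vote against a threshold can be tallied, here is a self-contained sketch. The semantics (a vote passes when the agreeing models' share of total weight meets the threshold) are an assumption for illustration, not the crate's actual implementation:

```rust
/// Return true when the weight of agreeing models, as a fraction of
/// total weight, meets or exceeds the threshold.
fn weighted_vote(results: &[(bool, f64)], threshold: f64) -> bool {
    let total: f64 = results.iter().map(|(_, w)| w).sum();
    let agree: f64 = results.iter().filter(|(ok, _)| *ok).map(|(_, w)| w).sum();
    total > 0.0 && agree / total >= threshold
}

fn main() {
    // Models with weights 0.5, 0.3, 0.2; the first two agree: 0.8 >= 0.6
    let results = [(true, 0.5), (true, 0.3), (false, 0.2)];
    println!("{}", weighted_vote(&results, 0.6)); // true
}
```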
#### Example: Multi-Model Object Detection
```rust
use mecha10_behavior_patterns::prelude::*;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let ctx = Context::new("detector").await?;

    // Combine multiple YOLO models with weighted voting
    let ensemble = EnsembleNode::new(EnsembleStrategy::WeightedVote)
        .with_threshold(0.7)
        .add_named_model("yolov8", Box::new(YoloV8Detector), 0.4)
        .add_named_model("yolov10", Box::new(YoloV10Detector), 0.4)
        .add_named_model("custom", Box::new(CustomDetector), 0.2);

    // Execute
    let mut executor = BehaviorExecutor::new(Box::new(ensemble), 10.0);
    let (status, stats) = executor.run_until_complete(&ctx).await?;
    println!("Detection complete: {} in {} ticks", status, stats.tick_count);
    Ok(())
}
```
#### Use Cases
- Multi-model AI: Combine YOLO, custom models for robust detection
- Sensor fusion: Merge data from multiple sensors (LIDAR + camera)
- Fault tolerance: Redundant models for critical systems
- Confidence boosting: Require multiple models to agree
## Combining Patterns
Subsumption and Ensemble can be combined for powerful control architectures:
```rust
// Safety layer uses ensemble of multiple safety checks
let safety_ensemble = EnsembleNode::new(EnsembleStrategy::Conservative)
    .add_model(Box::new(BatteryCheck), 1.0)
    .add_model(Box::new(TemperatureCheck), 1.0)
    .add_model(Box::new(ObstacleCheck), 1.0);

// Navigation uses ensemble of path planners
let nav_ensemble = EnsembleNode::new(EnsembleStrategy::WeightedVote)
    .add_model(Box::new(AStarPlanner), 0.6)
    .add_model(Box::new(RRTPlanner), 0.4);

// Combine with subsumption
let control = SubsumptionNode::new()
    .add_layer(10, Box::new(safety_ensemble))
    .add_layer(1, Box::new(nav_ensemble));
```
## JSON Configuration
Both patterns support JSON configuration (when used with the node registry):
### Subsumption JSON
```json
{
  "type": "subsumption",
  "layers": [
    {
      "priority": 10,
      "name": "emergency_stop",
      "node": "emergency_stop_behavior",
      "config": { "max_speed": 0.0 }
    },
    {
      "priority": 5,
      "name": "avoid_obstacle",
      "node": "obstacle_avoidance",
      "config": { "safety_distance": 0.5 }
    }
  ]
}
```
### Ensemble JSON
```json
{
  "type": "ensemble",
  "strategy": "weighted_vote",
  "threshold": 0.6,
  "models": [
    {
      "name": "yolov8",
      "node": "yolo_v8_detector",
      "weight": 0.4,
      "config": { "confidence": 0.7 }
    },
    {
      "name": "yolov10",
      "node": "yolo_v10_detector",
      "weight": 0.6,
      "config": { "confidence": 0.7 }
    }
  ]
}
```
## Testing

```bash
# Run tests
cargo test -p mecha10-behavior-patterns

# Run clippy
cargo clippy -p mecha10-behavior-patterns -- -D warnings
```
## Architecture
Both patterns are built on top of mecha10-behavior-runtime and implement the BehaviorNode trait:
```rust
#[async_trait]
pub trait BehaviorNode: Send + Sync + Debug {
    async fn tick(&mut self, ctx: &Context) -> anyhow::Result<NodeStatus>;
    // ... lifecycle methods ...
}
```
This makes them fully composable with:
- Core composition primitives (Sequence, Selector, Parallel)
- Other pattern nodes
- Custom behavior implementations
- AI inference nodes (Vision, Language, etc.)
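To make the trait concrete, a custom behavior might look like the following simplified, synchronous sketch. The types here are stand-ins (no `async_trait`, no `Context`), so it shows only the tick-status shape of a node, not the crate's actual async API:

```rust
#[derive(Debug, Clone, Copy, PartialEq)]
enum NodeStatus { Success, Failure, Running }

/// Simplified synchronous stand-in for the async BehaviorNode trait.
trait SimpleNode {
    fn tick(&mut self) -> NodeStatus;
}

/// A toy behavior: reports Running until it has ticked `target` times.
struct CountTo { target: u32, count: u32 }

impl SimpleNode for CountTo {
    fn tick(&mut self) -> NodeStatus {
        self.count += 1;
        if self.count >= self.target { NodeStatus::Success } else { NodeStatus::Running }
    }
}

fn main() {
    let mut node = CountTo { target: 3, count: 0 };
    let mut last = node.tick();
    while last == NodeStatus::Running {
        last = node.tick(); // keep ticking until the node settles
    }
    println!("{:?}", last); // Success
}
```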
## Performance
- Subsumption: O(n) where n = number of layers
- Ensemble: O(m) where m = number of models (executed in parallel)
Both patterns are designed for real-time performance with minimal overhead.
## See Also
- mecha10-behavior-runtime - Core behavior system
- AI Features Documentation - AI integration guide
- TODOS.md - Priority 7: AI Native features
## License
MIT