
bevy_ort 🪨


a bevy plugin for the ort library

[image: person mask]

> modnet inference example

capabilities

  • load ONNX models as ORT session assets (see the sketch after this list)
  • initialize ORT with default execution providers
  • modnet bevy image <-> ort tensor IO (with feature modnet)
  • batched modnet preprocessing
  • compute task pool inference scheduling
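
the capabilities above revolve around a generic ONNX session asset; below is a minimal sketch of loading one directly (assumptions: the asset type is named `Onnx` and its loader is registered by `BevyOrtPlugin`; the `MyModel` resource and file path are hypothetical):

```rust
use bevy::prelude::*;
use bevy_ort::Onnx;

// hypothetical resource holding a handle to the loaded ORT session asset
#[derive(Resource, Default)]
struct MyModel {
    onnx: Handle<Onnx>,
}

// request the asset; the plugin's loader builds the ORT session once the file is read
fn load_model(
    asset_server: Res<AssetServer>,
    mut model: ResMut<MyModel>,
) {
    model.onnx = asset_server.load("models/my_model.onnx");
}
```

register the resource with `init_resource::<MyModel>()` and add `load_model` to `Startup`, mirroring the FLAME example under library usage.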

models

  • lightglue (feature matching)
  • modnet (photographic portrait matting)
  • yolo_v8 (object detection)
  • flame (parametric head model)
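
each model integration above is gated behind a cargo feature; `modnet` and `yolo_v8` are documented feature names, while `flame` and `lightglue` are assumed to follow the same pattern. a minimal Cargo.toml sketch (version numbers are illustrative):

```toml
[dependencies]
bevy = "0.13"
# enable only the model support you need
bevy_ort = { version = "0.12", features = ["modnet"] }
```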

library usage

use bevy::prelude::*;

use bevy_ort::{
    BevyOrtPlugin,
    models::flame::{
        FlameInput,
        FlameOutput,
        Flame,
        FlamePlugin,
    },
};


fn main() {
    App::new()
        .add_plugins((
            DefaultPlugins,
            BevyOrtPlugin,
            FlamePlugin,
        ))
        .add_systems(Startup, load_flame)
        .add_systems(Startup, setup)
        .add_systems(Update, on_flame_output)
        .run();
}


// point the Flame resource at the ONNX model asset
fn load_flame(
    asset_server: Res<AssetServer>,
    mut flame: ResMut<Flame>,
) {
    flame.onnx = asset_server.load("models/flame.onnx");
}


// spawn an entity carrying the inference input, plus a camera
fn setup(
    mut commands: Commands,
) {
    commands.spawn(FlameInput::default());
    commands.spawn(Camera3dBundle::default());
}


// marker so each FlameOutput is only processed once
#[derive(Debug, Component, Reflect)]
struct HandledFlameOutput;

// handle FlameOutput components as they are produced by the plugin
fn on_flame_output(
    mut commands: Commands,
    flame_outputs: Query<
        (
            Entity,
            &FlameOutput,
        ),
        Without<HandledFlameOutput>,
    >,
) {
    for (entity, flame_output) in flame_outputs.iter() {
        commands.entity(entity)
            .insert(HandledFlameOutput);

        println!("{:?}", flame_output);
    }
}

run the example person segmentation model (modnet)

cargo run

use an accelerated execution provider:

  • windows - cargo run --features ort/cuda or cargo run --features ort/openvino
  • macos - cargo run --features ort/coreml
  • linux - cargo run --features ort/tensorrt or cargo run --features ort/openvino

see complete list of ort features here: https://github.com/pykeio/ort/blob/0aec4030a5f3470e4ee6c6f4e7e52d4e495ec27a/Cargo.toml#L54
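
the `--features ort/...` flags above apply when running the examples from this repository checkout. in a downstream project, the equivalent (a sketch, not verified against a specific release; versions are illustrative) is to enable the execution-provider feature on your own `ort` dependency, which cargo's feature unification then applies to the copy bevy_ort links against:

```toml
[dependencies]
bevy_ort = "0.12"
# must resolve to the same ort version that bevy_ort depends on
ort = { version = "2.0.0-rc.2", features = ["cuda"] }
```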

note: if you use pip install onnxruntime, you may need to run ORT_STRATEGY=system cargo run, see: https://docs.rs/ort/latest/ort/#how-to-get-binaries

compatible bevy versions

| bevy_ort | bevy |
| :------- | :--- |
| 0.1.0    | 0.13 |

credits

license

This software is dual-licensed under the MIT License and the GNU General Public License version 3 (GPL-3.0).

You may choose to use this software under the terms of the MIT License OR the GNU General Public License version 3 (GPL-3.0), except as stipulated below:

The use of the yolo_v8 feature within this software is specifically governed by the GNU General Public License version 3 (GPL-3.0). By using the yolo_v8 feature, you agree to comply with the terms and conditions of the GPL-3.0.

For more details on the licenses, please refer to the LICENSE.MIT and LICENSE.GPL-3.0 files included with this software.
