rsoai

v0.1.0 · May 10, 2025 · Rust 2024 edition

rsoai is a Rust client library for the OpenAI API, offering simple, structured, and multimodal (text + image) input with robust error handling and JSON-schema-based output deserialization.
✨ Features
- 🔤 Push user/assistant messages (text + image)
- 📄 Deserialize OpenAI responses into Rust structs via schema
- 🔁 Switch between free-form and structured output modes
- 🧠 Maintain full conversation history
- 🛡️ Strong error typing and clear handling
🛠 Installation
Add `rsoai` to your `Cargo.toml`:

```sh
cargo add rsoai --path path/to/rsoai
cargo add dotenv
```

Set your OpenAI API key in a `.env` file:

```
OPENAI_API_KEY=your_openai_api_key_here
```
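Once `dotenv` has loaded the `.env` file into the process environment, the key is read like any other environment variable. A minimal std-only sketch (the `api_key` helper is illustrative, not part of rsoai's public API):

```rust
use std::env;

// Illustrative helper: read the key the way the client presumably does
// once dotenv has populated the environment.
fn api_key() -> Option<String> {
    env::var("OPENAI_API_KEY").ok()
}

fn main() {
    match api_key() {
        Some(_) => println!("key found"),
        None => println!("OPENAI_API_KEY is not set"),
    }
}
```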
🚀 Quick Start
🔹 Text-Only Example
```rust
use rsoai::openai::{OpenAIChatClient, ChatClient};

fn main() {
    let mut client = OpenAIChatClient::new("gpt-4o");
    client.push_user_message("What is the capital of Japan?");

    match client.run_text() {
        Ok(reply) => println!("Assistant: {}", reply),
        Err(e) => eprintln!("Error: {}", e),
    }
}
```
🔹 Image + Text Input
```rust
let mut client = OpenAIChatClient::new("gpt-4o");
client.put_text("user", "What do you see in this image?");
client.put_image("user", "https://example.com/image.jpg");

let result = client.run_text();
```
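Under the hood, a multimodal conversation is naturally modeled as messages whose parts are either text or image URLs. Here is a minimal, self-contained sketch of such a data model; the `Part` and `Message` types are illustrative assumptions, not rsoai's actual internals:

```rust
// Illustrative data model for multimodal messages (std-only sketch;
// rsoai's real types may differ).
#[derive(Debug, Clone, PartialEq)]
enum Part {
    Text(String),
    ImageUrl(String),
}

#[derive(Debug, Clone)]
struct Message {
    role: String,
    parts: Vec<Part>,
}

impl Message {
    fn new(role: &str) -> Self {
        Message { role: role.to_string(), parts: Vec::new() }
    }
}

fn main() {
    // Mirrors the put_text / put_image calls above.
    let mut msg = Message::new("user");
    msg.parts.push(Part::Text("What do you see in this image?".into()));
    msg.parts.push(Part::ImageUrl("https://example.com/image.jpg".into()));
    println!("{} message with {} parts", msg.role, msg.parts.len());
}
```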
🔹 Structured JSON Output
First, define a macro in your macro crate:
```rust
use proc_macro::TokenStream;
use quote::quote;
use syn::{parse_macro_input, ItemStruct};

/// Attribute macro to add common derives and serde settings.
#[proc_macro_attribute]
pub fn schema_struct(_attr: TokenStream, item: TokenStream) -> TokenStream {
    let input = parse_macro_input!(item as ItemStruct);
    let ident = &input.ident;
    let generics = &input.generics;
    let fields = &input.fields;
    let vis = &input.vis;
    let attrs = &input.attrs;

    let expanded = quote! {
        #[derive(Debug, serde::Serialize, serde::Deserialize, schemars::JsonSchema)]
        #[serde(deny_unknown_fields)]
        #(#attrs)*
        #vis struct #ident #generics #fields
    };

    TokenStream::from(expanded)
}
```
Use it to deserialize OpenAI output into typed Rust structs:
```rust
#[schema_struct]
struct Ingredient {
    name: String,
    quantity: String,
}

#[schema_struct]
struct Recipe {
    title: String,
    ingredients: Vec<Ingredient>,
}
```
Then run the structured query:
```rust
client.set_structured_mode::<Recipe>("Recipe");
client.push_user_message("Give me a French toast recipe.");

let recipe: Recipe = client.run_structured().unwrap();
println!("{:#?}", recipe);
```
⚙ Mode Switching
Switch between modes during the session:
```rust
client.set_plain_text();                       // Free-form text output
client.set_structured_mode::<T>("SchemaName"); // Structured JSON output
```
🕘 Conversation History
Track all interactions:
```rust
for msg in client.history() {
    println!("{}: {:?}", msg.role, msg.content);
}
```
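Keeping the full exchange amounts to appending every turn, user and assistant alike, to one ordered list. A std-only sketch of that bookkeeping (the `ChatHistory` and `HistoryMessage` names are illustrative, not rsoai's internals):

```rust
// Illustrative conversation-history bookkeeping (std-only sketch).
#[derive(Debug)]
struct HistoryMessage {
    role: String,
    content: String,
}

struct ChatHistory {
    messages: Vec<HistoryMessage>,
}

impl ChatHistory {
    fn new() -> Self {
        ChatHistory { messages: Vec::new() }
    }

    // Record a turn; both user and assistant turns go through here,
    // so the whole exchange can be replayed later in order.
    fn push(&mut self, role: &str, content: &str) {
        self.messages.push(HistoryMessage {
            role: role.to_string(),
            content: content.to_string(),
        });
    }

    fn history(&self) -> &[HistoryMessage] {
        &self.messages
    }
}

fn main() {
    let mut h = ChatHistory::new();
    h.push("user", "What is the capital of Japan?");
    h.push("assistant", "Tokyo.");
    for msg in h.history() {
        println!("{}: {}", msg.role, msg.content);
    }
}
```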
🧱 Error Handling
All errors are typed via `ChatError`:

```rust
enum ChatError {
    MissingApiKey,
    RequestFailed,
    ApiError,
    ReadBodyFailed,
    JsonParseFailed,
    NoOutputText,
}
```

Use the standard `?` operator or `match` to handle errors cleanly.
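Fleshing out the enum above with a `Display` impl shows what `?` propagation looks like in practice. The `Display` messages and the `load_api_key`/`run` helpers below are assumptions for illustration, not rsoai's actual code:

```rust
use std::fmt;

// The ChatError variants from the README; the Display text is an assumption.
#[derive(Debug, PartialEq)]
enum ChatError {
    MissingApiKey,
    RequestFailed,
    ApiError,
    ReadBodyFailed,
    JsonParseFailed,
    NoOutputText,
}

impl fmt::Display for ChatError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        let msg = match self {
            ChatError::MissingApiKey => "OPENAI_API_KEY is not set",
            ChatError::RequestFailed => "HTTP request failed",
            ChatError::ApiError => "API returned an error",
            ChatError::ReadBodyFailed => "could not read response body",
            ChatError::JsonParseFailed => "could not parse response JSON",
            ChatError::NoOutputText => "response contained no output text",
        };
        write!(f, "{}", msg)
    }
}

impl std::error::Error for ChatError {}

// Hypothetical helper: turn a missing key into a typed error.
fn load_api_key(env_value: Option<&str>) -> Result<String, ChatError> {
    env_value.map(str::to_string).ok_or(ChatError::MissingApiKey)
}

// `?` propagates the typed error up to the caller.
fn run(env_value: Option<&str>) -> Result<(), ChatError> {
    let _key = load_api_key(env_value)?;
    Ok(())
}

fn main() {
    match run(None) {
        Ok(()) => println!("ok"),
        Err(e) => eprintln!("Error: {}", e),
    }
}
```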
📂 Project Layout
```
rsoai/
├── src/
│   ├── ioschema.rs   # Schema, types, error handling
│   └── openai.rs     # Client logic
project/
└── src/main.rs       # Examples
```
✅ Requirements
- Rust 1.70+
- OpenAI API key
- `.env` file with `OPENAI_API_KEY`
📄 License
MIT License © 2025
See LICENSE for details.