#openai #ai #text-image #ai-api #chat-completion #api-bindings #api

rusty-openai

An unofficial OpenAI wrapper that supports image inputs

5 releases

0.1.10 Dec 8, 2024
0.1.9 Sep 22, 2024
0.1.8 Sep 21, 2024
0.1.5 Aug 8, 2024
0.1.0 Aug 7, 2024

#553 in Web programming


470 downloads per month

Custom license

79KB
1K SLoC

OpenAI Rust SDK


Welcome to the OpenAI Rust SDK, your all-in-one solution for integrating OpenAI's powerful capabilities into your Rust projects. This SDK provides a convenient abstraction over OpenAI's API, enabling you to easily perform tasks such as generating completions, creating and editing images, moderating text, fine-tuning models, and more.


Installation

To use this SDK, add the following dependencies to your Cargo.toml file:

[dependencies]
rusty-openai = "0.1.10"
serde_json = "1.0"
tokio = { version = "1", features = ["full"] }
reqwest = { version = "0.12.5", features = ["json", "multipart"] }

Getting Started

To get started with the OpenAI Rust SDK, follow these steps:

Initialize OpenAI Client

First, create an instance of the OpenAI struct with your API key.

use rusty_openai::openai::OpenAI;

#[tokio::main]
async fn main() {
    // Create a client with your API key and the base URL of the OpenAI API.
    let openai = OpenAI::new("YOUR_API_KEY", "https://api.openai.com/v1");
}

Generate Chat Completions

To generate chat completions, create a ChatCompletionRequest object and call the create method from the completions API. The SDK now supports structured outputs with JSON Schema validation:

use rusty_openai::openai::OpenAI;
use rusty_openai::openai_api::completion::ChatCompletionRequest;
use serde_json::json;
use std::env;

#[tokio::main]
async fn main() {
    let api_key = env::var("OPENAI_API_KEY").expect("API key not set");
    let openai = OpenAI::new(&api_key, "https://api.openai.com/v1");

    // Example with structured outputs using JSON Schema
    let schema = json!({
        "type": "object",
        "properties": {
            "steps": {
                "type": "array",
                "items": {
                    "type": "object",
                    "properties": {
                        "explanation": {"type": "string"},
                        "output": {"type": "string"}
                    },
                    "required": ["explanation", "output"]
                }
            },
            "final_answer": {"type": "string"}
        },
        "required": ["steps", "final_answer"]
    });

    let messages = vec![
        json!({
            "role": "user",
            "content": "Solve this equation: 2x + 5 = 13"
        })
    ];

    let request = ChatCompletionRequest::new_json_schema(
        "gpt-4o-2024-08-06".to_string(),
        messages,
        "math_reasoning".to_string(),
        schema
    )
    .temperature(0.7);

    let chat_response = openai.completions().create(request).await;

    match chat_response {
        Ok(chat) => println!("{}", json!(chat).to_string()),
        Err(err) => eprintln!("Error: {}", err),
    }
}

This simple example demonstrates how to generate chat completions using the SDK. For more detailed usage and additional endpoints, refer to the documentation.
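
Chat Completions with Image Inputs

Since image inputs are this crate's headline feature, here is a minimal sketch of a vision-style request. The message shape (a content array mixing text and image_url parts) follows the standard OpenAI chat format; the plain ChatCompletionRequest::new(model, messages) constructor and the example image URL are assumptions made for illustration, so check the docs.rs API for the exact constructor exposed by the crate.

use rusty_openai::openai::OpenAI;
use rusty_openai::openai_api::completion::ChatCompletionRequest;
use serde_json::json;
use std::env;

#[tokio::main]
async fn main() {
    let api_key = env::var("OPENAI_API_KEY").expect("API key not set");
    let openai = OpenAI::new(&api_key, "https://api.openai.com/v1");

    // Image inputs use the standard OpenAI chat format: the message content is
    // an array that mixes text parts and image_url parts.
    let messages = vec![
        json!({
            "role": "user",
            "content": [
                { "type": "text", "text": "What is in this image?" },
                { "type": "image_url", "image_url": { "url": "https://example.com/photo.png" } }
            ]
        })
    ];

    // Assumed plain constructor; see the crate docs for the exact signature.
    let request = ChatCompletionRequest::new("gpt-4o".to_string(), messages)
        .temperature(0.2);

    match openai.completions().create(request).await {
        Ok(chat) => println!("{}", json!(chat)),
        Err(err) => eprintln!("Error: {}", err),
    }
}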

Documentation

For detailed information on all the available endpoints and their respective methods, please refer to the full SDK Documentation.

License

This SDK is licensed under the MIT License. For more details, see the LICENSE file.

NOTE

Please note that this library still lacks detailed documentation and is missing some endpoints found in the official API. You may be asking why I am creating this library for Rust when there is already a repository and a library for it in Rust.

Why?

The existing one does not support image inputs, lacks several functions, and is not actively maintained.


Happy coding with OpenAI and Rust! If you encounter any issues or have questions, feel free to open an issue on the GitHub repository. Contributions and improvements are always welcome.


Dependencies

~7–18MB
~241K SLoC