
etke_openai_api_rust

(Temporary etke.cc fork) A very simple Rust library for the OpenAI API, free of complex async operations and redundant dependencies.



OpenAI API for Rust


NOTE: this is a temporary fork of openai_api_rust (https://github.com/openai-rs/openai-api) which improves the functionality of the image-generation API.

A community-maintained library that provides a simple and convenient way to interact with the OpenAI API, with no complex async operations and no redundant dependencies.

API

Check the official API reference: https://platform.openai.com/docs/api-reference

API           Support
Models        ✔️
Completions   ✔️
Chat          ✔️
Images        ✔️
Embeddings    ✔️
Audio         ✔️
Files         ❌
Fine-tunes    ❌
Moderations   ❌
Engines       ❌

Usage

Add the following to your Cargo.toml file:

openai_api_rust = "0.1.9"
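This fork is published on crates.io as etke_openai_api_rust, while the code examples below keep the openai_api_rust import paths. One way to pull in the fork without changing those paths is Cargo's dependency renaming; this is only a sketch, so adjust the version to the release you actually need:

[dependencies]
openai_api_rust = { package = "etke_openai_api_rust", version = "0.1.9" }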

Export your API key as an environment variable:

export OPENAI_API_KEY=<your_api_key>

Then use the crate in your Rust code:

use openai_api_rust::*;
use openai_api_rust::chat::*;
use openai_api_rust::completions::*;

fn main() {
    // Load the API key from the OPENAI_API_KEY environment variable.
    // You can also hardcode it via `Auth::new(<your_api_key>)`, but that is not recommended.
    let auth = Auth::from_env().unwrap();
    let openai = OpenAI::new(auth, "https://api.openai.com/v1/");
    let body = ChatBody {
        model: "gpt-3.5-turbo".to_string(),
        max_tokens: Some(7),
        temperature: Some(0_f32),
        top_p: Some(0_f32),
        n: Some(2),
        stream: Some(false),
        stop: None,
        presence_penalty: None,
        frequency_penalty: None,
        logit_bias: None,
        user: None,
        messages: vec![Message { role: Role::User, content: "Hello!".to_string() }],
    };
    let rs = openai.chat_completion_create(&body);
    let choice = rs.unwrap().choices;
    let message = &choice[0].message.as_ref().unwrap();
    assert!(message.content.contains("Hello"));
}
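
The example above unwraps the response for brevity. Here is a more defensive sketch of the same call, using only the fields already shown above (the error is printed via its Debug form, and the `openai` client and `body` are reused from the previous example):

// Reuses `openai` and `body` from the example above.
match openai.chat_completion_create(&body) {
    Ok(completion) => {
        // Take the first choice, if any, and its optional message.
        if let Some(message) = completion.choices.first().and_then(|c| c.message.as_ref()) {
            println!("assistant: {}", message.content);
        }
    }
    Err(err) => eprintln!("request failed: {:?}", err),
}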

Use proxy

Load proxy from env

let openai = OpenAI::new(auth, "https://api.openai.com/v1/")
        .use_env_proxy();

Set the proxy manually

let openai = OpenAI::new(auth, "https://api.openai.com/v1/")
        .set_proxy("http://127.0.0.1:1080");
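
With use_env_proxy, the proxy is read from the environment; the conventional lowercase proxy variables are assumed here (the exact variable names are an assumption, not confirmed by this README):

export http_proxy=http://127.0.0.1:1080
export https_proxy=http://127.0.0.1:1080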

License

This library is distributed under the terms of the MIT license. See LICENSE for details.
