
async-openai-wasm

Async Rust library for OpenAI on WASM

Overview

async-openai-wasm is a FORK of async-openai that supports WASM targets by targeting wasm32-unknown-unknown. That means >99% of the codebase should be attributed to the original project. Synchronization with the original project is done manually whenever async-openai releases a new version, and version numbers are kept in sync: when async-openai releases x.y.z, async-openai-wasm also releases version x.y.z.

async-openai-wasm is an unofficial Rust library for OpenAI.

  • It's based on the OpenAI OpenAPI spec
  • Current features:
    • Assistants (Beta)
    • Audio
    • Chat
    • Completions (Legacy)
    • Embeddings
    • Files
    • Fine-Tuning
    • Images
    • Microsoft Azure OpenAI Service
    • Models
    • Moderations
    • WASM support
  • Supports SSE streaming on available APIs (see the streaming sketch after this list)
  • Ergonomic builder pattern for all request objects.
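
Where an API supports streaming, responses can be consumed as a server-sent-event stream. Below is a minimal streaming sketch, assuming this fork keeps async-openai's chat API (the request builder types, chat().create_stream(), and a stream consumed via futures::StreamExt):

use async_openai_wasm::{
    types::{ChatCompletionRequestUserMessageArgs, CreateChatCompletionRequestArgs},
    Client,
};
use futures::StreamExt;
use std::error::Error;

async fn stream_chat() -> Result<(), Box<dyn Error>> {
    let client = Client::new();

    let request = CreateChatCompletionRequestArgs::default()
        .model("gpt-3.5-turbo")
        .messages([ChatCompletionRequestUserMessageArgs::default()
            .content("Write a haiku about WebAssembly")
            .build()?
            .into()])
        .build()?;

    // create_stream returns an SSE-backed stream of partial responses.
    let mut stream = client.chat().create_stream(request).await?;
    while let Some(result) = stream.next().await {
        let response = result?;
        for choice in response.choices {
            // Each chunk carries an optional content delta; print it as it arrives.
            if let Some(content) = choice.delta.content {
                print!("{content}");
            }
        }
    }
    Ok(())
}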

Note on Azure OpenAI Service (AOS): like async-openai, async-openai-wasm primarily implements the OpenAI spec and doesn't try to maintain parity with the AOS spec.

Differences from async-openai

++ WASM support

++ WASM examples

-- Tokio

-- Non-wasm examples: please refer to the original project async-openai.

-- Backoff retries: removed due to this issue. HELP WANTED

-- File saving: wasm32-unknown-unknown on browsers doesn't have access to the filesystem.

Usage

The library reads the API key from the environment variable OPENAI_API_KEY.

# On macOS/Linux
export OPENAI_API_KEY='sk-...'
# On Windows PowerShell
$Env:OPENAI_API_KEY='sk-...'
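
In a browser, wasm32-unknown-unknown has no environment variables, so on WASM targets the key is usually supplied explicitly. A minimal sketch, assuming this fork keeps async-openai's OpenAIConfig and Client::with_config API:

use async_openai_wasm::{config::OpenAIConfig, Client};

// Build a client with an explicit API key instead of reading OPENAI_API_KEY.
// The key shown here is a placeholder; shipping a real key to the browser
// exposes it to end users, so a proxying backend is the usual workaround.
let config = OpenAIConfig::new()
    .with_api_key("sk-...")
    .with_api_base("https://api.openai.com/v1");
let client = Client::with_config(config);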

Image Generation Example

use async_openai_wasm::{
    types::{CreateImageRequestArgs, ImageSize, ResponseFormat},
    Client,
};
use std::error::Error;

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    // create client, reads OPENAI_API_KEY environment variable for API key.
    let client = Client::new();

    let request = CreateImageRequestArgs::default()
        .prompt("cats on sofa and carpet in living room")
        .n(2)
        .response_format(ResponseFormat::Url)
        .size(ImageSize::S256x256)
        .user("async-openai-wasm")
        .build()?;

    let response = client.images().create(request).await?;

    // Download and save images to the ./data directory.
    // Each url is downloaded and saved in a dedicated Tokio task.
    // The directory is created if it doesn't exist.
    let paths = response.save("./data").await?;

    paths
        .iter()
        .for_each(|path| println!("Image file path: {}", path.display()));

    Ok(())
}

(Example output image omitted; scaled up for README, actual size 256x256.)
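
Note that response.save(..) in the example above comes from upstream async-openai and needs filesystem access, which wasm32-unknown-unknown in the browser doesn't have (see the differences above). On WASM you would typically use the returned URLs directly, e.g. as <img> sources. A rough sketch, assuming the fork keeps async-openai's ImagesResponse and Image types unchanged:

use async_openai_wasm::types::Image;

// Instead of saving to disk, iterate the returned image URLs.
for image in response.data.iter() {
    if let Image::Url { url, .. } = image.as_ref() {
        println!("Image URL: {url}");
    }
}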

Contributing

This repo will only accept issues and PRs related to WASM support. For other issues and PRs, please visit the original project async-openai.

This project adheres to the Rust Code of Conduct.

Complimentary Crates

  • openai-func-enums provides procedural macros that make it easier to use this library with the OpenAI API's tool-calling feature. It also provides derive macros you can add to existing clap application subcommands for natural-language use of command-line tools, supports OpenAI's parallel tool calls, and lets you choose between running multiple tool calls concurrently or on their own OS threads.

Why async-openai-wasm

Because I wanted to develop and release a crate that depends on the wasm feature in the experiments branch of async-openai, but the wasm feature was being stabilized at a different pace than I expected.

License

The additional modifications are licensed under the MIT license. The original project is also licensed under the MIT license.
