Babel Bridge

LLM API Adapter SDK for Rust

This is a Rust SDK for interacting with various Large Language Model (LLM) APIs, starting with the Anthropic API. It allows you to send messages and engage in conversations with language models.

Features

  • Send single messages to the Anthropic API and retrieve responses
  • Engage in multi-turn conversations with Anthropic's language models
  • Customize the model, max tokens, and temperature for each request
  • Handle API errors and parse responses using Rust structs

Supported APIs

Currently, this SDK supports only the Anthropic API. Support for additional LLM APIs is planned, so stay tuned for updates!

Installation

Add the following to your Cargo.toml file, replacing "x.x.x" with the version you want (the latest release at the time of writing is 0.2.2):

[dependencies]
babel-bridge = "x.x.x"
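
The usage examples below are async and run under Tokio (note the #[tokio::main] attribute), so your project also needs a Tokio dependency. A minimal sketch of the extra line for the same [dependencies] table; the "full" feature set is a convenient assumption, trim it to what you actually need:

tokio = { version = "1", features = ["full"] }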

Usage

First, make sure you have an API key from Anthropic. Set the API key as an environment variable named ANTHROPIC_API_KEY.

Sending a Single Message

To send a single message to the Anthropic API and retrieve the response:

use babel_bridge::client::AnthropicClient;
use babel_bridge::models::Message;

#[tokio::main]
async fn main() {
    let api_key = std::env::var("ANTHROPIC_API_KEY").expect("ANTHROPIC_API_KEY must be set.");
    let client = AnthropicClient::new(api_key);

    let messages = vec![Message {
        role: "user".to_string(),
        content: "Hello, Claude!".to_string(),
    }];

    let response = client
        .request()
        .messages(messages)
        // optional, defaults to DEFAULT_MODEL (client.rs)
        .model("claude-3-haiku-20240307")
        // optional, defaults to DEFAULT_MAX_TOKENS (client.rs)
        .max_tokens(100)
        // optional, defaults to DEFAULT_TEMP (client.rs)
        .temperature(1.0)
        // optional, the system prompt is not used if not provided
        .system_prompt("You are a haiku assistant.")
        .send()
        .await
        .expect("Failed to send message");

    print!("Response: {}", response.first_message());
    println!("Response:\n{:?}", response);
}
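
The example above panics on failure via expect. Because send() returns a Result, you can also handle API errors explicitly, as the Features list mentions. A minimal sketch using only the calls shown above; the error is printed with its Debug formatting, which expect already relies on:

use babel_bridge::client::AnthropicClient;
use babel_bridge::models::Message;

#[tokio::main]
async fn main() {
    let api_key = std::env::var("ANTHROPIC_API_KEY").expect("ANTHROPIC_API_KEY must be set.");
    let client = AnthropicClient::new(api_key);

    let messages = vec![Message {
        role: "user".to_string(),
        content: "Hello, Claude!".to_string(),
    }];

    // Match on the Result instead of unwrapping it, so an API error
    // is reported without crashing the program.
    match client.request().messages(messages).send().await {
        Ok(response) => println!("Response: {}", response.first_message()),
        Err(err) => eprintln!("Request failed: {:?}", err),
    }
}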

Engaging in a Conversation

To start a conversation with Anthropic's language models:

use babel_bridge::client::AnthropicClient;

#[tokio::main]
async fn main() {
    let api_key = std::env::var("ANTHROPIC_API_KEY").expect("ANTHROPIC_API_KEY must be set.");
    let client = AnthropicClient::new(api_key);

    let conversation = client
        // model, max tokens, temperature, and an optional system prompt (None here)
        .chat("claude-3-haiku-20240307", 100, 1.0, None)
        .send("Hello, Claude!")
        .await
        .expect("Failed to send message");

    println!("Last response: {}", conversation.last_response());
    println!("Dialog:\n{:?}", conversation.dialog());

    let conversation = conversation
        .add("How are you doing?")
        .await
        .expect("Failed to send message");

    println!("Last response: {}", conversation.last_response());
    println!("Dialog:\n{:?}", conversation.dialog());
}
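
Since each add call returns the updated conversation, longer exchanges thread naturally through a loop. A minimal sketch with hypothetical follow-up prompts, assuming add consumes and returns the conversation as the rebinding above suggests:

use babel_bridge::client::AnthropicClient;

#[tokio::main]
async fn main() {
    let api_key = std::env::var("ANTHROPIC_API_KEY").expect("ANTHROPIC_API_KEY must be set.");
    let client = AnthropicClient::new(api_key);

    // Start the conversation, then keep threading it through `add`.
    let mut conversation = client
        .chat("claude-3-haiku-20240307", 100, 1.0, None)
        .send("Hello, Claude!")
        .await
        .expect("Failed to send message");

    // Placeholder prompts, purely for illustration.
    for prompt in ["What can you do?", "Write a haiku about Rust."] {
        conversation = conversation
            .add(prompt)
            .await
            .expect("Failed to send message");
        println!("> {}\n{}", prompt, conversation.last_response());
    }
}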

Documentation

For detailed documentation and more examples, please refer to the API documentation on docs.rs.

Contributing

Contributions are welcome! If you find any issues or have suggestions for improvements, please open an issue or submit a pull request.
