AI Assist
A Rust library for interacting with multiple AI providers, including Google Gemini, OpenAI GPT, and Anthropic Claude. It provides a simple, unified way to integrate AI capabilities into your Rust applications.
Features
- Support for multiple AI providers:
  - Google Gemini
  - OpenAI GPT
  - Anthropic Claude
- Simple and unified API
- Async support
- Error handling with anyhow
- Type-safe request and response handling
- Configurable timeouts
Installation
Add this to your Cargo.toml:
[dependencies]
aiassist = "0.1.3"
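The usage example below runs on the Tokio runtime and returns an anyhow::Result, so you will likely want both as dependencies as well. A sketch of the extra entries (version numbers are illustrative, not prescribed by this crate):
tokio = { version = "1", features = ["macros", "rt-multi-thread"] }
anyhow = "1"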
Usage
First, you'll need to get API keys from the providers you want to use. Then you can use the library like this:
use aiassist::{GeminiClient, OpenAIClient, ClaudeClient, ClientConfig, AIClient};

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Configure clients
    let gemini_config = ClientConfig {
        api_key: std::env::var("GEMINI_API_KEY")?,
        timeout_secs: 30,
    };

    let openai_config = ClientConfig {
        api_key: std::env::var("OPENAI_API_KEY")?,
        timeout_secs: 30,
    };

    let claude_config = ClientConfig {
        api_key: std::env::var("ANTHROPIC_API_KEY")?,
        timeout_secs: 30,
    };

    // Initialize clients
    let gemini = GeminiClient::new(gemini_config);
    let openai = OpenAIClient::new(openai_config);
    let claude = ClaudeClient::new(claude_config);

    // Generate content using different providers
    let prompt = "Tell me about Rust programming language";

    let gemini_response = gemini.generate_content(prompt).await?;
    println!("Gemini: {}", gemini_response);

    let openai_response = openai.generate_content(prompt).await?;
    println!("OpenAI: {}", openai_response);

    let claude_response = claude.generate_content(prompt).await?;
    println!("Claude: {}", claude_response);

    Ok(())
}
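Because every client is used through the same generate_content call, you can also write provider-agnostic helpers. This is a minimal sketch assuming AIClient is the shared trait and that generate_content takes a &str prompt and returns anyhow::Result<String>, as the example above suggests; adjust the bound and signature to the crate's actual API:

use aiassist::AIClient;

// Hypothetical helper: works with any provider that implements the assumed AIClient trait.
async fn summarize<C: AIClient>(client: &C, topic: &str) -> anyhow::Result<String> {
    let prompt = format!("Summarize the following topic in two sentences: {}", topic);
    client.generate_content(&prompt).await
}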
Environment Variables
- GEMINI_API_KEY: Your Google Gemini API key
- OPENAI_API_KEY: Your OpenAI API key
- ANTHROPIC_API_KEY: Your Anthropic API key
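Missing variables surface as errors from std::env::var, as in the usage example above. If you want the failure to say which key is absent, one option is a small wrapper using anyhow's Context; require_env below is a hypothetical helper, not part of this crate:

use anyhow::Context;

// Read an environment variable and attach its name to any error.
fn require_env(name: &str) -> anyhow::Result<String> {
    std::env::var(name).with_context(|| format!("environment variable {} is not set", name))
}

You could then build each ClientConfig with, for example, require_env("GEMINI_API_KEY")?.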
Provider-Specific Details
Google Gemini
- Uses the Gemini Pro model
- Requires a Google AI Studio API key
OpenAI
- Uses the GPT-3.5-turbo model by default
- Requires an OpenAI API key
- Supports chat completion format
Claude
- Uses the Claude-2 model
- Requires an Anthropic API key
- Supports completion format
Error Handling
The library uses anyhow for error handling. All errors are wrapped in anyhow::Result, making it easy to handle the different kinds of errors that can occur during API calls.
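For example, a caller can attach context to a request and print the full error chain on failure. This is a sketch using only the anyhow API; the client and method names follow the usage example above, and the exact return type of generate_content is an assumption:

use aiassist::{AIClient, GeminiClient};
use anyhow::Context;

// Call a client, add context to any failure, and report the whole error chain.
async fn ask(client: &GeminiClient, prompt: &str) {
    let result = client
        .generate_content(prompt)
        .await
        .context("generate_content request failed");

    match result {
        Ok(text) => println!("{}", text),
        // {:#} prints the error together with its context chain.
        Err(err) => eprintln!("error: {:#}", err),
    }
}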
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
This project is licensed under the MIT License - see the LICENSE file for details.