5 releases (uses Rust edition 2024)

| Version | Date |
|---|---|
| 0.1.5 | May 14, 2025 |
| 0.1.3 | May 8, 2025 |
| 0.1.2 | May 8, 2025 |
| 0.1.1 | May 7, 2025 |
| 0.1.0 | May 7, 2025 |
My ChatGPT API Rust Client
A Rust library for interacting with OpenAI's ChatGPT API with streaming support. This library provides a simple and efficient way to communicate with OpenAI's API while handling streaming responses and token usage tracking.
Features
- Stream or non-stream mode for API responses
- Conversation memory to maintain chat history
- Comprehensive error handling
- Token usage tracking
- Flexible output handling via callback functions
- Type-safe API interactions
- Async/await support
Version
Current version: 0.1.3
Requirements
- Rust edition 2024
- Dependencies:
  - reqwest 0.12.15 (with json, stream, rustls-tls features)
  - serde 1.0 (with derive feature)
  - serde_json 1.0
  - tokio 1.44.2 (with full features)
  - futures-util 0.3
  - dotenv 0.15
Installation
Add this to your Cargo.toml:

```toml
[dependencies]
my-chatgpt = { git = "https://github.com/bongkow/chatgpt-api", version = "0.1.3" }
```
Usage

```rust
use my_chatgpt::chat::{send_chat, ChatError, UsageInfo, ChatMessage};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let api_key = "your-api-key";
    let model = "gpt-4"; // or any other supported model
    let instructions = "You are a helpful assistant.";

    // Define a handler function for the API responses
    let handler = |usage: Option<&UsageInfo>, error: Option<&ChatError>, raw_chunk: Option<&serde_json::Value>| {
        if let Some(e) = error {
            eprintln!("Error: {:?}", e);
        }
        if let Some(u) = usage {
            println!("Input tokens: {}", u.input_tokens.unwrap_or(0));
            println!("Output tokens: {}", u.output_tokens.unwrap_or(0));
            println!("Total tokens: {}", u.total_tokens.unwrap_or(0));
        }
        // Process raw chunks if needed
        if let Some(_chunk) = raw_chunk {
            // Do something with the raw chunk
        }
    };

    // Initialize an empty chat history
    let mut chat_history: Vec<ChatMessage> = Vec::new();

    // First message
    let input1 = "Tell me about Rust programming language.";
    let response1 = send_chat(instructions, input1, api_key, model, true, handler, &mut chat_history).await?;
    println!("First response: {}", response1);

    // Follow-up question using chat history
    let input2 = "What are its main advantages over C++?";
    let response2 = send_chat(instructions, input2, api_key, model, true, handler, &mut chat_history).await?;
    println!("Second response: {}", response2);

    Ok(())
}
```
Error Handling
The library provides a `ChatError` enum for different error cases:

```rust
pub enum ChatError {
    RequestError(String), // Errors related to API requests
    ParseError(String),   // Errors in parsing responses
    NetworkError(String), // Network-related errors
    Unknown(String),      // Other unexpected errors
}
```
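In a handler, you might branch on the error variant, for example to retry only on transient network failures. A minimal sketch, mirroring the enum definition above; the `is_retryable` helper is hypothetical and not part of the library:

```rust
// Mirror of the library's ChatError enum (copied from the definition above).
#[allow(dead_code)]
#[derive(Debug)]
enum ChatError {
    RequestError(String),
    ParseError(String),
    NetworkError(String),
    Unknown(String),
}

// Hypothetical helper (not part of the library): treat only network
// failures as worth retrying; parse and request errors will not improve
// on a blind retry.
fn is_retryable(e: &ChatError) -> bool {
    matches!(e, ChatError::NetworkError(_))
}

fn main() {
    let err = ChatError::NetworkError("connection reset".to_string());
    println!("retryable: {}", is_retryable(&err));
    println!("retryable: {}", is_retryable(&ChatError::ParseError("bad JSON".into())));
}
```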
Token Usage
Token usage information is provided via the `UsageInfo` struct:

```rust
pub struct UsageInfo {
    pub input_tokens: Option<u32>,  // Number of tokens in the input
    pub output_tokens: Option<u32>, // Number of tokens in the output
    pub total_tokens: Option<u32>,  // Total tokens used
}
```
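Because every field is an `Option`, a handler that tracks spend across calls should treat a missing value as zero, as the usage example above does with `unwrap_or(0)`. A small sketch of that bookkeeping, mirroring the struct definition above; the `add_usage` accumulator is an assumption for illustration, not a library API:

```rust
// Mirror of the library's UsageInfo struct (copied from the definition above).
#[allow(dead_code)]
#[derive(Default)]
struct UsageInfo {
    input_tokens: Option<u32>,
    output_tokens: Option<u32>,
    total_tokens: Option<u32>,
}

// Hypothetical accumulator (not part of the library): fields are Option,
// so a chunk that reports no usage simply contributes zero.
fn add_usage(running_total: &mut u32, u: &UsageInfo) {
    *running_total += u.total_tokens.unwrap_or(0);
}

fn main() {
    let mut total = 0u32;
    add_usage(&mut total, &UsageInfo {
        input_tokens: Some(12),
        output_tokens: Some(30),
        total_tokens: Some(42),
    });
    add_usage(&mut total, &UsageInfo::default()); // no usage reported yet
    println!("total tokens so far: {total}");
}
```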
Chat History
The library maintains conversation context through the `ChatMessage` struct:

```rust
pub struct ChatMessage {
    pub role: String,    // The role of the message sender (e.g., "user", "assistant", "system")
    pub content: String, // The content of the message
}
```
When you pass a chat history to `send_chat`, the function automatically:
- Includes previous messages in the API request
- Updates the history with new messages
- Maintains context for more coherent multi-turn conversations
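The bookkeeping described above can be sketched as follows, mirroring the `ChatMessage` definition. This is an assumption about the internal behavior based on the description, not the library's actual code; `record_turn` is a hypothetical stand-in for what `send_chat` does to the history:

```rust
// Mirror of the library's ChatMessage struct (copied from the definition above).
#[allow(dead_code)]
struct ChatMessage {
    role: String,
    content: String,
}

// Sketch (an assumption based on the description above): each call appends
// the user's turn and the assistant's reply, so the next call sees the
// full conversation so far.
fn record_turn(history: &mut Vec<ChatMessage>, user_input: &str, assistant_reply: &str) {
    history.push(ChatMessage { role: "user".to_string(), content: user_input.to_string() });
    history.push(ChatMessage { role: "assistant".to_string(), content: assistant_reply.to_string() });
}

fn main() {
    let mut history: Vec<ChatMessage> = Vec::new();
    record_turn(&mut history, "Tell me about Rust.", "Rust is a systems language...");
    record_turn(&mut history, "What are its advantages over C++?", "Memory safety without a GC...");
    println!("history holds {} messages", history.len());
}
```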
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
MIT