2 releases

| Version | Date |
|---|---|
| 0.1.1 | Feb 9, 2025 |
| 0.1.0 | Feb 8, 2025 |
About The Project
Rusty Ollama is a Rust client library for interacting with the Ollama API, providing both synchronous and streaming interfaces for working with large language models.
Features:
- Simple API for text generation
- Streaming responses for real-time processing
- Context management for conversation history
- Configurable request options
- Error handling for API interactions
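The context-management feature mirrors how the Ollama API threads a conversation: each generate response returns an opaque `context` token vector, and the client sends it back with the next request so the model remembers earlier turns. A minimal stdlib-only sketch of that bookkeeping (the type and method names here are illustrative, not the crate's actual API):

```rust
// Ollama-style conversation threading: the server returns a `context`
// token vector with each response; storing it and attaching it to the
// next request gives the model conversation memory.
#[derive(Default)]
struct ConversationContext {
    tokens: Vec<u64>,
}

impl ConversationContext {
    // Replace the stored context with the one from the latest response.
    fn update(&mut self, new_tokens: Vec<u64>) {
        self.tokens = new_tokens;
    }

    // Context to attach to the next request (empty on the first turn).
    fn current(&self) -> &[u64] {
        &self.tokens
    }
}

fn main() {
    let mut ctx = ConversationContext::default();
    assert!(ctx.current().is_empty()); // first request carries no context
    ctx.update(vec![1, 2, 3]); // tokens returned by the server
    assert_eq!(ctx.current(), &[1, 2, 3]);
}
```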
Getting Started
Prerequisites
- Rust 1.60+
- Cargo
- Ollama server running locally (default: http://localhost:11434)
Installation
Add to your `Cargo.toml`:

[dependencies]
rusty_ollama = "0.1.1"
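Alternatively, Cargo can track the repository directly via a git dependency (repository URL from the Contact section below):

```toml
[dependencies]
rusty_ollama = { git = "https://github.com/lowpolycat1/rusty_ollama" }
```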
Usage
Basic Generation
use rusty_ollama::{Ollama, OllamaError};

#[tokio::main]
async fn main() -> Result<(), OllamaError> {
    let mut ollama = Ollama::create_default()?;
    let response = ollama.generate("Why is the sky blue?").await?;
    println!("Response: {}", response.response);
    Ok(())
}
Streaming Responses
use rusty_ollama::{Ollama, OllamaError};
use futures::StreamExt;

#[tokio::main]
async fn main() -> Result<(), OllamaError> {
    let mut ollama = Ollama::create_default()?;
    let mut stream = ollama.stream_generate("Tell me a story about").await?;
    while let Some(response) = stream.next().await {
        match response {
            Ok(chunk) => print!("{}", chunk.response),
            Err(e) => eprintln!("Error: {}", e),
        }
    }
    Ok(())
}
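Under the hood, a streaming generate call receives newline-delimited JSON: the server emits one object per line until a final object with `"done": true`. A stdlib-only sketch of splitting such a payload into chunks (the crate does this parsing for you; the payload here is illustrative):

```rust
// Streaming Ollama responses arrive as NDJSON: one JSON object per line.
// Minimal sketch of splitting a buffered payload into chunk strings.
fn split_chunks(payload: &str) -> Vec<&str> {
    payload.lines().filter(|l| !l.trim().is_empty()).collect()
}

fn main() {
    let payload =
        "{\"response\":\"Once\",\"done\":false}\n{\"response\":\" upon a time\",\"done\":true}\n";
    let chunks = split_chunks(payload);
    assert_eq!(chunks.len(), 2);
    assert!(chunks[1].contains("\"done\":true")); // final chunk ends the stream
}
```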
For more examples, see the examples directory.
Roadmap
- Basic text generation
- Streaming responses
- Context management
- Async trait implementations
- Model management
- Advanced configuration options (Modelfile)
- Image processing
- Comprehensive documentation
- More error handling variants
See the open issues for a full list of proposed features.
Contributing
Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
License
Distributed under the MIT License. See LICENSE.txt for more information.
Contact
lowpolycat1 - @acrylic_spark (discord)
Project Link: https://github.com/lowpolycat1/rusty_ollama
Acknowledgments