# FerricLink Core
Core abstractions for the FerricLink ecosystem, inspired by LangChain Core. This crate provides the fundamental building blocks for building AI applications with language models, tools, vector stores, and more.
## Features
- Runnable System: Composable units of work that can be chained together
- Message Types: Human, AI, System, and Tool messages for conversation handling
- Language Models: Abstractions for LLMs and chat models
- Vector Stores: Embedding storage and similarity search
- Tools: Function calling and tool integration
- Callbacks: Monitoring and tracing system
- Documents: Text processing and retrieval
- Async-First: Built on Tokio for high-performance async operations
## Quick Start
Add FerricLink Core to your `Cargo.toml`:

```toml
[dependencies]
ferriclink-core = { version = "0.1", features = ["all"] }
```
### Example
```rust
use ferriclink_core::{
    messages::AnyMessage,
    language_models::{mock_chat_model, GenerationConfig},
    runnables::Runnable,
};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create a mock chat model
    let chat_model = mock_chat_model("gpt-4o-mini");

    // Create a conversation
    let messages = vec![
        AnyMessage::human("Hello, how are you?"),
    ];

    // Generate a response
    let response = chat_model.generate_chat(
        messages,
        Some(GenerationConfig::new().with_temperature(0.7)),
        None,
    ).await?;

    println!("Response: {}", response.text());

    Ok(())
}
```
## Core Modules
### Messages (`messages`)
Conversation handling with different message types:
- `HumanMessage` - User input
- `AIMessage` - AI responses
- `SystemMessage` - System instructions
- `ToolMessage` - Tool outputs
- `AnyMessage` - Union type for all messages
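As a mental model, the union type can be pictured as a plain Rust enum over the variants listed above. This is only an illustrative sketch — the crate's actual `AnyMessage` definition and fields may differ:

```rust
// Illustrative sketch of a message union type; NOT the crate's actual definition.
#[derive(Debug, Clone, PartialEq)]
pub enum AnyMessage {
    Human(String),
    Ai(String),
    System(String),
    Tool { name: String, output: String },
}

impl AnyMessage {
    // Convenience constructor mirroring the `AnyMessage::human(...)` call
    // used in the Quick Start example.
    pub fn human(text: &str) -> Self {
        AnyMessage::Human(text.to_string())
    }

    // Text content regardless of variant.
    pub fn text(&self) -> &str {
        match self {
            AnyMessage::Human(t) | AnyMessage::Ai(t) | AnyMessage::System(t) => t,
            AnyMessage::Tool { output, .. } => output,
        }
    }
}

fn main() {
    let msg = AnyMessage::human("Hello!");
    println!("{}", msg.text()); // prints "Hello!"
}
```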
### Language Models (`language_models`)
Abstractions for language models:
- `BaseLanguageModel` - Core language model trait
- `BaseLLM` - Text generation models
- `BaseChatModel` - Chat/conversation models
- `GenerationConfig` - Configuration for text generation
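To show the shape of the trait/config split, here is a minimal synchronous toy. The real traits are async and richer; the names below are modeled on the list above, not the crate's actual signatures:

```rust
// Toy generation config; the real GenerationConfig has more knobs.
struct GenerationConfig {
    temperature: f32,
}

// Simplified, synchronous stand-in for a chat-model trait.
trait ChatModel {
    fn generate(&self, prompt: &str, config: &GenerationConfig) -> String;
}

// Echo model standing in for a real backend, useful for tests.
struct MockChatModel {
    name: String,
}

impl ChatModel for MockChatModel {
    fn generate(&self, prompt: &str, _config: &GenerationConfig) -> String {
        format!("[{}] you said: {}", self.name, prompt)
    }
}

fn main() {
    let model = MockChatModel { name: "mock".into() };
    let reply = model.generate("hi", &GenerationConfig { temperature: 0.7 });
    println!("{}", reply);
}
```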
### Runnables (`runnables`)
Composable execution system:
- `Runnable<Input, Output>` - Core runnable trait
- `RunnableConfig` - Configuration for runs
- `RunnableSequence` - Chain multiple runnables
- `RunnableParallel` - Run multiple runnables in parallel
### Vector Stores (`vectorstores`)
Embedding storage and search:
- `VectorStore` - Core vector store trait
- `InMemoryVectorStore` - In-memory implementation
- `VectorSearchResult` - Search results with similarity scores
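Similarity search over stored embeddings boils down to comparing a query vector against each stored vector. The following toy illustrates the idea behind an in-memory store using cosine similarity; it is not the crate's `InMemoryVectorStore` API:

```rust
// Cosine similarity between two vectors; 0.0 when either has zero norm.
fn cosine(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if na == 0.0 || nb == 0.0 { 0.0 } else { dot / (na * nb) }
}

// Toy in-memory store: each entry pairs a text with its embedding.
struct Store {
    entries: Vec<(String, Vec<f32>)>,
}

impl Store {
    // Returns the stored text most similar to the query, with its score.
    fn search(&self, query: &[f32]) -> Option<(&str, f32)> {
        self.entries
            .iter()
            .map(|(text, v)| (text.as_str(), cosine(query, v)))
            .max_by(|a, b| a.1.total_cmp(&b.1))
    }
}

fn main() {
    let store = Store {
        entries: vec![
            ("cat".into(), vec![1.0, 0.0]),
            ("dog".into(), vec![0.0, 1.0]),
        ],
    };
    let (text, score) = store.search(&[0.9, 0.1]).unwrap();
    println!("best match: {} (score {:.2})", text, score); // best match: cat
}
```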
### Tools (`tools`)
Function calling system:
- `BaseTool` - Core tool trait
- `Tool` - Executable tools
- `ToolCall` - Tool invocation
- `ToolResult` - Tool execution results
- `ToolCollection` - Manage multiple tools
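The call/result split above can be sketched as a trait plus two plain structs. Names mirror the list, but the crate's actual signatures may differ:

```rust
// A tool invocation: which tool, and with what input.
struct ToolCall {
    name: String,
    input: String,
}

// What a tool hands back after running.
struct ToolResult {
    output: String,
}

// Simplified, synchronous tool trait; hypothetical shape.
trait Tool {
    fn name(&self) -> &str;
    fn run(&self, call: &ToolCall) -> ToolResult;
}

// Example tool: counts whitespace-separated words in its input.
struct WordCount;

impl Tool for WordCount {
    fn name(&self) -> &str {
        "word_count"
    }
    fn run(&self, call: &ToolCall) -> ToolResult {
        ToolResult {
            output: call.input.split_whitespace().count().to_string(),
        }
    }
}

fn main() {
    let tool = WordCount;
    let call = ToolCall {
        name: tool.name().into(),
        input: "count these words".into(),
    };
    let result = tool.run(&call);
    println!("{} -> {}", call.name, result.output); // word_count -> 3
}
```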
### Callbacks (`callbacks`)
Monitoring and tracing:
- `CallbackHandler` - Event handling trait
- `ConsoleCallbackHandler` - Console output
- `MemoryCallbackHandler` - In-memory storage
- `CallbackManager` - Manage multiple handlers
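The handler/manager pattern fans each event out to every registered handler. A minimal sketch, with hypothetical names and a much simpler event type than the crate likely uses:

```rust
// Simplified event handler; real handlers likely receive typed events.
trait CallbackHandler {
    fn on_event(&mut self, event: &str);
}

// Collects events in memory, in the spirit of a MemoryCallbackHandler.
struct MemoryHandler {
    events: Vec<String>,
}

impl CallbackHandler for MemoryHandler {
    fn on_event(&mut self, event: &str) {
        self.events.push(event.to_string());
    }
}

// Fans each event out to every registered handler.
struct Manager {
    handlers: Vec<Box<dyn CallbackHandler>>,
}

impl Manager {
    fn emit(&mut self, event: &str) {
        for handler in &mut self.handlers {
            handler.on_event(event);
        }
    }
}

fn main() {
    let memory = MemoryHandler { events: Vec::new() };
    let mut manager = Manager { handlers: vec![Box::new(memory)] };
    manager.emit("llm_start");
    manager.emit("llm_end");
}
```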
### Documents (`documents`)
Text processing:
- `Document` - Text with metadata
- `DocumentCollection` - Multiple documents
- `ToDocument` - Convert to documents
- `FromDocument` - Convert from documents
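"Text with metadata" can be pictured as a content string plus a key/value map. The field and method names below are assumptions for illustration, not the crate's actual `Document` API:

```rust
use std::collections::HashMap;

// Illustrative document type: text plus arbitrary string metadata.
#[derive(Debug, Clone)]
struct Document {
    page_content: String,
    metadata: HashMap<String, String>,
}

impl Document {
    fn new(content: &str) -> Self {
        Document {
            page_content: content.to_string(),
            metadata: HashMap::new(),
        }
    }

    // Builder-style metadata attachment.
    fn with_metadata(mut self, key: &str, value: &str) -> Self {
        self.metadata.insert(key.to_string(), value.to_string());
        self
    }
}

fn main() {
    let doc = Document::new("FerricLink notes").with_metadata("source", "readme.md");
    println!("{} (source: {:?})", doc.page_content, doc.metadata.get("source"));
}
```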
### Embeddings (`embeddings`)
Text embedding abstractions:
- `Embeddings` - Core embedding trait
- `Embedding` - Vector representation
- `MockEmbeddings` - Testing implementation
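A mock embedder is useful in tests because it is deterministic: the same input always yields the same vector. The sketch below is synchronous and uses made-up features (string length and vowel count); the crate's real trait is presumably async and its `MockEmbeddings` may work differently:

```rust
// Simplified, synchronous embedding trait; hypothetical shape.
trait Embeddings {
    fn embed(&self, text: &str) -> Vec<f32>;
}

// Deterministic toy embedder: [length, vowel count] of the input.
struct MockEmbeddings;

impl Embeddings for MockEmbeddings {
    fn embed(&self, text: &str) -> Vec<f32> {
        let len = text.len() as f32;
        let vowels = text.chars().filter(|c| "aeiou".contains(*c)).count() as f32;
        vec![len, vowels]
    }
}

fn main() {
    let emb = MockEmbeddings;
    println!("{:?}", emb.embed("hello")); // [5.0, 2.0]
}
```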
### Retrievers (`retrievers`)
Document retrieval:
- `BaseRetriever` - Core retriever trait
- `VectorStoreRetriever` - Vector-based retrieval
- `MultiRetriever` - Combine multiple retrievers
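Combining retrievers means querying each source and merging the results. A toy sketch with hypothetical signatures — keyword matching stands in for vector search here:

```rust
// Simplified retriever trait; the crate's BaseRetriever is likely async
// and returns Documents rather than plain strings.
trait Retriever {
    fn retrieve(&self, query: &str) -> Vec<String>;
}

// Returns stored lines containing the query; a stand-in for vector search.
struct KeywordRetriever {
    docs: Vec<String>,
}

impl Retriever for KeywordRetriever {
    fn retrieve(&self, query: &str) -> Vec<String> {
        self.docs.iter().filter(|d| d.contains(query)).cloned().collect()
    }
}

// Concatenates results from several sources, like a MultiRetriever might.
struct MultiRetriever {
    sources: Vec<Box<dyn Retriever>>,
}

impl Retriever for MultiRetriever {
    fn retrieve(&self, query: &str) -> Vec<String> {
        self.sources.iter().flat_map(|r| r.retrieve(query)).collect()
    }
}

fn main() {
    let multi = MultiRetriever {
        sources: vec![
            Box::new(KeywordRetriever { docs: vec!["rust is fast".into()] }),
            Box::new(KeywordRetriever { docs: vec!["rust is safe".into()] }),
        ],
    };
    println!("{:?}", multi.retrieve("rust")); // hits from both sources
}
```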
## Development
This crate is part of the FerricLink workspace. See the main README for development instructions.
### Prerequisites
- Rust 1.85.0+
- Tokio async runtime
### Building

```bash
# Build the crate
cargo build

# Build in release mode
cargo build --release

# Run tests
cargo test

# Check code
cargo check

# Run clippy
cargo clippy
```
## API Stability
This crate is currently in version 0.1.0 and is under active development. The API may change between minor versions until we reach 1.0.0.
## License
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
## Contributing
Contributions are welcome! Please see the main README for contribution guidelines.