
llm-brain

A memory layer with ConceptNet integration and LLM-friendly interfaces, rewritten in Rust using SurrealDB

0.0.0 (unstable) · released Apr 27, 2025 · Rust 2024 edition · Unlicense

LLMBrain: Rust Memory Layer

LLMBrain is a Rust library providing a foundational memory layer inspired by neuroscience concepts, particularly suited for building Retrieval-Augmented Generation (RAG) systems and other AI applications requiring structured memory.

It provides core functionality for storing, managing, and recalling information fragments, and uses vector embeddings for semantic search.

Core Concepts

  • LLMBrain Struct: The main entry point for interacting with the memory system. It manages configuration, storage, embedding generation, and recall operations.
  • MemoryFragment: The basic unit of information stored, containing content (text), metadata (a flexible JSON value for properties, relationships, types, etc.), and an embedding vector.
  • Storage: Currently utilizes SurrealDB as the backend for persistent storage (enabled via feature flags).
  • Embeddings: Integrates with OpenAI (via OpenAiClient and feature flags) to generate text embeddings for semantic recall.
  • Recall: Provides the recall() method for retrieving relevant MemoryFragments based on semantic similarity to a query string.
  • Integrations: Includes a ConceptNetClient for optional knowledge enrichment (requires separate setup).
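To make the recall idea concrete, here is a minimal, self-contained sketch of the cosine-similarity scoring that typically underlies semantic recall over embedding vectors. This is illustrative only: the crate delegates embedding and similarity search to its configured backends (e.g. SurrealDB and OpenAI), and `cosine_similarity` is not part of its public API.

```rust
// Illustrative sketch: cosine similarity between two embedding vectors,
// the usual scoring behind semantic recall. Not the crate's actual API.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let norm_b: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if norm_a == 0.0 || norm_b == 0.0 {
        0.0 // avoid division by zero for empty/zero vectors
    } else {
        dot / (norm_a * norm_b)
    }
}

fn main() {
    // Identical vectors score 1.0; orthogonal vectors score 0.0.
    let query = [1.0_f32, 0.0, 1.0];
    let fragment = [1.0_f32, 0.0, 1.0];
    println!("score = {:.4}", cosine_similarity(&query, &fragment));
}
```

A query embedding is compared against each stored fragment's embedding this way, and `recall()` returns the top-scoring fragments.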

Installation

Add llm-brain as a dependency in your Cargo.toml:

[dependencies]
llm-brain = { path = "/path/to/llm-brain/crate" } # Or use version = "x.y.z" if published
# Enable features as needed, e.g., for SurrealDB storage and OpenAI embeddings:
# llm-brain = { path = "...", features = ["storage-surrealdb", "llm-openai"] }
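The Quick Start below expects a configuration file (e.g. ./config/default.toml). The exact schema is defined by the crate, so the field names here are purely illustrative; consult tests/common/mod.rs in the repository for the real setup.

```toml
# Hypothetical config/default.toml — field names are illustrative only.
[database]
url = "ws://localhost:8000"   # SurrealDB endpoint
namespace = "llm_brain"
database = "memory"

[llm]
# The API key itself is usually supplied via the OPENAI_API_KEY
# environment variable rather than committed to the config file.
embedding_model = "text-embedding-3-small"
```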

Quick Start

use llm_brain::{LLMBrain, Result};
use serde_json::json;

#[tokio::main]
async fn main() -> Result<()> {
    // Ensure configuration is set up (e.g., ./config/default.toml)
    // pointing to a database and potentially LLM API keys.
    // See tests/common/mod.rs for setup examples.

    println!("Launching LLMBrain...");
    let llm_brain = LLMBrain::launch().await?;
    println!("LLMBrain launched successfully.");

    // Add a memory fragment
    let content = "Rust is a systems programming language focused on safety and performance.".to_string();
    let metadata = json!({
        "entity_name": "Rust_Language",
        "memory_type": "Semantic",
        "properties": {
            "type": "Programming Language",
            "paradigm": ["Systems", "Functional", "Concurrent"],
            "key_features": ["Ownership", "Borrowing", "Concurrency"]
        }
    });

    println!("Adding memory...");
    let memory_id = llm_brain.add_memory(content.clone(), metadata).await?;
    println!("Memory added with ID: {}", memory_id);

    // Recall relevant information
    let query = "Tell me about Rust safety features";
    println!("Recalling memories for query: '{}'...", query);
    let results = llm_brain.recall(query, 1).await?;

    if let Some((fragment, score)) = results.first() {
        println!("\nTop result (Score: {:.4}):", score);
        println!("Content: {}", fragment.content);
        println!("Metadata: {:#}", fragment.metadata);
    } else {
        println!("No relevant memories found.");
    }

    Ok(())
}
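In a RAG pipeline, the (fragment, score) pairs returned by recall() are typically assembled into a context block for an LLM prompt. The helper below is a hypothetical downstream step, not part of the crate; it operates on plain (content, score) tuples for simplicity.

```rust
// Illustrative only: turn recalled (content, score) pairs into an LLM
// prompt context, a typical RAG step after recall(). `build_context`
// is a hypothetical helper, not an llm-brain API.
fn build_context(results: &[(String, f32)], query: &str) -> String {
    let mut prompt = String::from("Answer using the context below.\n\nContext:\n");
    for (content, score) in results {
        // Include the similarity score so low-confidence context is visible.
        prompt.push_str(&format!("- ({score:.2}) {content}\n"));
    }
    prompt.push_str(&format!("\nQuestion: {query}\n"));
    prompt
}

fn main() {
    let results = vec![("Rust focuses on safety.".to_string(), 0.91)];
    println!("{}", build_context(&results, "What is Rust known for?"));
}
```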

Development

  • Build: cargo build
  • Check: cargo check (or cargo check --tests)
  • Test: cargo test (run cargo test -- --ignored --nocapture to include ignored tests, which require setup such as an OPENAI_API_KEY)
  • Lint/Format: cargo fmt and cargo clippy

License

This library is released under the Unlicense. See the root LICENSE file for details.

Dependencies

~108 MB, ~2M SLoC across the full dependency tree