
ic-llm

A library for making requests to the LLM canister on the Internet Computer

6 releases (2 stable)

1.0.1 Apr 22, 2025
0.4.0 Feb 24, 2025
0.3.0 Feb 20, 2025
0.2.0 Feb 19, 2025
0.1.0 Feb 17, 2025



765 downloads per month

Apache-2.0

30KB
573 lines


Usage

Basic Usage

Prompting (Single Message)

The simplest way to interact with a model is by sending a single prompt:

use ic_llm::Model;

async fn example() -> String {
    ic_llm::prompt(Model::Llama3_1_8B, "What's the speed of light?").await
}
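
In a canister, this call would typically be exposed as an update method. A minimal sketch, assuming the standard `ic-cdk` macros; the endpoint name `ask` is illustrative, not part of this library:

```rust
use ic_llm::Model;

// Hypothetical canister endpoint wrapping the prompt call.
// `#[ic_cdk::update]` marks this as an update method so it can make
// inter-canister calls; the method name is an illustrative choice.
#[ic_cdk::update]
async fn ask(question: String) -> String {
    ic_llm::prompt(Model::Llama3_1_8B, question).await
}
```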

Chatting (Multiple Messages)

For more complex interactions, you can send multiple messages in a conversation:

use ic_llm::{Model, ChatMessage};

async fn example() {
    ic_llm::chat(Model::Llama3_1_8B)
        .with_messages(vec![
            ChatMessage::System {
                content: "You are a helpful assistant".to_string(),
            },
            ChatMessage::User {
                content: "How big is the sun?".to_string(),
            },
        ])
        .send()
        .await;
}
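
The reply text can then be read from the response. A sketch, assuming `send()` resolves to a `Response` whose `message.content` is an `Option<String>` (consistent with the tool-call handling example later in this README; check the crate docs for the exact shape):

```rust
use ic_llm::{Model, ChatMessage};

// Sketch: extract the assistant's reply text from a chat response.
// Assumes `response.message.content` is an Option<String>.
async fn ask_about_the_sun() -> String {
    let response = ic_llm::chat(Model::Llama3_1_8B)
        .with_messages(vec![
            ChatMessage::User {
                content: "How big is the sun?".to_string(),
            },
        ])
        .send()
        .await;

    // Fall back to an empty string if the model returned no text.
    response.message.content.unwrap_or_default()
}
```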

Advanced Usage with Tools

Defining and Using a Tool

You can define tools that the LLM can use to perform actions:

use ic_llm::{Model, ChatMessage, ParameterType};

async fn example() {
    ic_llm::chat(Model::Llama3_1_8B)
        .with_messages(vec![
            ChatMessage::System {
                content: "You are a helpful assistant".to_string(),
            },
            ChatMessage::User {
                content: "What's the balance of account abc123?".to_string(),
            },
        ])
        .with_tools(vec![
            ic_llm::tool("icp_account_balance")
                .with_description("Lookup the balance of an ICP account")
                .with_parameter(
                    ic_llm::parameter("account", ParameterType::String)
                        .with_description("The ICP account to look up")
                        .is_required()
                )
                .build()
        ])
        .send()
        .await;
}

Handling Tool Calls from the LLM

When the LLM decides to use one of your tools, you can handle the call:

use ic_llm::{Model, ChatMessage, ParameterType, Response};

async fn example() -> Response {
    let response = ic_llm::chat(Model::Llama3_1_8B)
        .with_messages(vec![
            ChatMessage::System {
                content: "You are a helpful assistant".to_string(),
            },
            ChatMessage::User {
                content: "What's the weather in San Francisco?".to_string(),
            },
        ])
        .with_tools(vec![
            ic_llm::tool("get_weather")
                .with_description("Get current weather for a location")
                .with_parameter(
                    ic_llm::parameter("location", ParameterType::String)
                        .with_description("The location to get weather for")
                        .is_required()
                )
                .build()
        ])
        .send()
        .await;
    
    // Process tool calls if any
    for tool_call in &response.message.tool_calls {
        match tool_call.function.name.as_str() {
            "get_weather" => {
                // Extract the location parameter
                let location = tool_call.function.get("location").unwrap();
                // Call your weather API or service
                let weather = get_weather(&location).await;
                // Typically you would send `weather` back to the LLM in a
                // follow-up message so it can produce a final answer.
            }
            _ => {} // Handle other tool calls
        }
    }
    
    response
}

// Mock function for getting weather
async fn get_weather(location: &str) -> String {
    format!("Weather in {}: Sunny, 72°F", location)
}
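
To complete the loop, the tool's result is usually sent back to the model in a follow-up message so it can produce a final answer. A sketch of that round trip, assuming the crate exposes a `ChatMessage::Tool` variant carrying the result and the originating `tool_call_id` (verify the variant's exact fields against the crate docs):

```rust
use ic_llm::{Model, ChatMessage};

// Hypothetical follow-up turn: feed the tool result back to the model.
// Assumes a `ChatMessage::Tool { content, tool_call_id }` variant exists.
async fn reply_with_weather(tool_call_id: String) -> Option<String> {
    let weather = get_weather("San Francisco").await;
    let response = ic_llm::chat(Model::Llama3_1_8B)
        .with_messages(vec![
            ChatMessage::User {
                content: "What's the weather in San Francisco?".to_string(),
            },
            // Tool result, linked to the model's earlier tool call by id.
            ChatMessage::Tool {
                content: weather,
                tool_call_id,
            },
        ])
        .send()
        .await;
    // The model's final, tool-informed answer (if it returned text).
    response.message.content
}
```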

Dependencies

~1.4–8.5MB
~74K SLoC