Rust SDK for CONVA AI Copilots
This is the Rust crate for using CONVA AI Copilots.
Usage
Add this to your `Cargo.toml`:

```toml
[dependencies]
conva_ai = "0.0.3"
futures-util = "0.3"
tokio = { version = "1.37.0", features = ["full"] }
```
NOTE: These are the package versions at the time this crate was published; they are not necessarily the latest supported versions. Please check the `futures-util` and `tokio` documentation for their latest releases.
Examples
1. A simple example of generating a response using a CONVA Copilot
```rust
use conva_ai::base::{AsyncConvaAI, BaseClient};
use futures_util::stream::StreamExt;

#[tokio::main]
async fn main() {
    const COPILOT_ID: &str = "your-copilot-id";
    const COPILOT_VERSION: &str = "your-copilot-version";
    const API_KEY: &str = "your-copilot-apikey";

    let mut client: BaseClient = AsyncConvaAI::init(
        &String::from(COPILOT_ID),
        &String::from(COPILOT_VERSION),
        &String::from(API_KEY),
    );

    let result = client
        .invoke_capability("how are you?".to_string(), false, "default".to_string())
        .await;

    match result {
        Ok(mut out) => {
            // Stream the response chunks as they arrive
            while let Some(val) = out.next().await {
                match val {
                    Ok(val) => println!("Response {:?}", val),
                    Err(e) => println!("{:?}", e),
                }
            }
        }
        Err(e) => println!("{:?}", e),
    }
}
```
2. How to clear history
By default, the CONVA AI client keeps track of your conversation history and uses it as context for responding intelligently.
You can clear the conversation history by running the code below:
```rust
use conva_ai::base::{AsyncConvaAI, BaseClient};

#[tokio::main]
async fn main() {
    const COPILOT_ID: &str = "your-copilot-id";
    const COPILOT_VERSION: &str = "your-copilot-version";
    const API_KEY: &str = "your-copilot-apikey";

    let mut client: BaseClient = AsyncConvaAI::init(
        &String::from(COPILOT_ID),
        &String::from(COPILOT_VERSION),
        &String::from(API_KEY),
    );

    // Drop any accumulated conversation context
    client.clear_history();
}
```
If you are building an application where you don't want to track conversation history, you can disable history tracking:

```rust
client.use_history(false);
```

You can re-enable it with:

```rust
client.use_history(true);
```
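As a sketch of how this fits together (client setup mirrors the earlier examples; only the `init` and `use_history` calls shown above are used):

```rust
use conva_ai::base::{AsyncConvaAI, BaseClient};

#[tokio::main]
async fn main() {
    let mut client: BaseClient = AsyncConvaAI::init(
        &String::from("your-copilot-id"),
        &String::from("your-copilot-version"),
        &String::from("your-copilot-apikey"),
    );

    // Disable history tracking for a stateless application
    client.use_history(false);

    // ... invoke capabilities without conversation context ...

    // Re-enable history tracking when context is desired again
    client.use_history(true);
}
```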
3. Debugging responses
Conva AI uses generative AI to answer your query. To help you understand how a response was produced, we also provide the AI's reasoning:
```rust
use conva_ai::base::{AsyncConvaAI, BaseClient};
use futures_util::stream::StreamExt;

#[tokio::main]
async fn main() {
    const COPILOT_ID: &str = "your-copilot-id";
    const COPILOT_VERSION: &str = "your-copilot-version";
    const API_KEY: &str = "your-copilot-apikey";

    let mut client: BaseClient = AsyncConvaAI::init(
        &String::from(COPILOT_ID),
        &String::from(COPILOT_VERSION),
        &String::from(API_KEY),
    );

    let result = client
        .invoke_capability("how are you?".to_string(), false, "default".to_string())
        .await;

    match result {
        Ok(mut out) => {
            while let Some(val) = out.next().await {
                match val {
                    // Print the reasoning behind each response chunk
                    Ok(val) => println!("{:?}", val.reason),
                    Err(e) => println!("{:?}", e),
                }
            }
        }
        Err(e) => println!("{:?}", e),
    }
}
```
4. How to use capability groups
Capability groups control the set of capabilities a Copilot has access to. You can specify a capability group when calling the `invoke_capability` method:
```rust
let result = client
    .invoke_capability("how are you?".to_string(), false, "<CAPABILITY_GROUP>".to_string())
    .await;

match result {
    Ok(mut out) => {
        while let Some(val) = out.next().await {
            match val {
                Ok(val) => println!("{:?}", val.reason),
                Err(e) => println!("{:?}", e),
            }
        }
    }
    Err(e) => println!("{:?}", e),
}
```