
cai - The fastest CLI tool for prompting LLMs

Features

  • Built with Rust 🦀 for supreme performance and speed! 🏎️
  • Support for models by Groq, OpenAI, Anthropic, and local LLMs. 📚
  • Prompt several models at once. 🤼 (see the demo of cai's `all` command)
  • Syntax highlighting for better readability of code snippets. 🌈

Demo

cai demo

Installation

cargo install cai

Usage

Before using Cai, an API key must be set up. Simply execute cai in your terminal and follow the instructions.

Cai supports the following APIs: Groq, OpenAI, Anthropic, and local models served via Llamafile or Ollama.

Afterwards, you can use cai to run prompts directly from the terminal:

cai List 10 fast CLI tools

Or a specific model, like Anthropic's Claude Opus:

cai op List 10 fast CLI tools

Full help output:

$ cai help
Cai 0.5.0

The fastest CLI tool for prompting LLMs

Usage: cai [PROMPT]...
       cai <COMMAND>

Commands:
  groq       [aliases: gr]
  mi         - Mixtral shortcut
  ll         - Llama 3 shortcut (🏆 Default)
  openai     OpenAI [aliases: op]
  gp         - GPT 4 shortcut
  gt         - GPT 4 Turbo shortcut
  anthropic  Anthropic [aliases: an]
  cl         - Claude Opus
  so         - Claude Sonnet
  ha         - Claude Haiku
  llamafile  Llamafile server hosted at http://localhost:8080 [aliases: lf]
  ollama     Ollama server hosted at http://localhost:11434 [aliases: ol]
  all        Send prompt to each provider's default model simultaneously
                 - Groq Llama3
                 - Anthropic Claude Haiku
                 - OpenAI GPT 4 Turbo
                 - Ollama Phi3
                 - Llamafile
  help       Print this message or the help of the given subcommand(s)

Arguments:
  [PROMPT]...  The prompt to send to the AI model

Options:
  -h, --help  Print help


Examples:
  # Send a prompt to the default model
  cai Which year did the Titanic sink

  # Send a prompt to each provider's default model
  cai all Which year did the Titanic sink

  # Send a prompt to Anthropic's Claude Opus
  cai anthropic claude-opus Which year did the Titanic sink
  cai an claude-opus Which year did the Titanic sink
  cai cl Which year did the Titanic sink
  cai anthropic claude-3-opus-20240229 Which year did the Titanic sink

  # Send a prompt to locally running Ollama server
  cai ollama llama3 Which year did the Titanic sink
  cai ol ll Which year did the Titanic sink

  # Add data via stdin
  cat main.rs | cai Explain this code
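The stdin feature combines with the model shortcuts listed above; a couple of illustrative one-liners (file names are placeholders), using the `cl` and `ol` aliases from the command list:

```shell
# Pipe a source file to Claude Opus via the `cl` shortcut
cat main.rs | cai cl Explain this code

# Pipe a diff to a local Llama 3 model served by Ollama
git diff | cai ol llama3 Summarize these changes
```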

Related

  • AI CLI - Get answers for CLI commands from ChatGPT. (TypeScript)
  • AIChat - All-in-one chat and copilot CLI for 10+ AI platforms. (Rust)
  • ja - CLI / TUI app to work with AI tools. (Rust)
  • llm - Access large language models from the command-line. (Python)
  • smartcat - Integrate LLMs in the Unix command ecosystem. (Rust)
  • tgpt - AI chatbots for the terminal without needing API keys. (Go)
