cogni

Unix-minded interface for interacting with LLMs.

Focus

cogni brings language model scripting (prompting) into the familiar Unix environment by focusing on:

  • Ergonomics and accessibility in Unix shell
  • Composability and interop with other programs - including cogni itself
  • Ease of language model programming in both ad-hoc and repeatable ways

For example, designing for IO redirection (stdin, stdout) allows cogni to work with files, editor buffers, clipboards, syslogs, sockets, and many external tools without bespoke integrations.
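
As a rough sketch of that idea (assuming cogni reads its prompt from stdin and writes the model's reply to stdout; the file names and helper tools below are placeholders, not part of cogni):

# Summarize the interesting lines of a log and save the reply to a file
$ grep ERROR /var/log/app.log | cogni > error-summary.txt

# Feed clipboard contents through cogni and page the result
# (xclip is Linux-specific; use pbpaste on macOS)
$ xclip -o | cogni | less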

Features

  • Unix-minded Design (IO redirection, composability, interop)
  • Ad-hoc Language Model Scripting
  • Flexible input and output formats (Text, JSON, NDJSON) - see the sketch after this list
  • Standalone binary - No Python required
  • 🚧 Repeatable Scripts via Templates
  • 🚧 Integration with external tools (Emacs, Raycast)
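
Because NDJSON is one JSON object per line, it composes cleanly with line-oriented tools such as jq. The flag name below is an assumption for illustration, not a documented option - consult cogni --help for the real interface:

# Hypothetical: request NDJSON output and pretty-print each record with jq
$ cogni --output ndjson < prompt.txt | jq '.'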

Non-Features

  • Interactive use - instead, invoke cogni from within interactive environments (REPLs, Emacs, etc.)

Installation

# Install from crates.io
$ cargo install cogni

# From source
$ cargo install --path .

Setup

cogni expects an OpenAI API key to be supplied via the --apikey option or, more conveniently, the OPENAI_API_KEY environment variable:

# in shell configuration
export OPENAI_API_KEY=sk-DEADBEEF
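
With the key in place, a quick smoke test (again assuming cogni reads the prompt from stdin) might look like:

# Send a one-line prompt and print the model's reply
$ echo 'Reply with a single word: hello' | cogni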

Tour of cogni

🚧 WIP
