bin+lib coze

An egui app for playing with a local open-source LLM

7 releases

0.1.7 Mar 19, 2024
0.1.6 Mar 16, 2024
0.1.3 Feb 22, 2024

Apache-2.0

595KB
2.5K SLoC

An egui app for prompting local offline LLMs.

[Screenshot: example prompt]

Description

coze is an egui application for prompting local offline LLMs using the Hugging Face candle crate.
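
As a rough sketch of the architecture, the UI side is a standard eframe/egui application: an update loop that draws a prompt box and a reply area, with token generation handled off the UI thread. The snippet below is illustrative only; the struct, field, and window names are invented for the example, it is not coze's code, and the exact run_native signature depends on the eframe version.

use eframe::egui;

#[derive(Default)]
struct PromptApp {
    prompt: String,
    reply: String,
}

impl eframe::App for PromptApp {
    fn update(&mut self, ctx: &egui::Context, _frame: &mut eframe::Frame) {
        egui::CentralPanel::default().show(ctx, |ui| {
            // Prompt input plus a button to send it to the model.
            ui.text_edit_singleline(&mut self.prompt);
            if ui.button("Send").clicked() {
                // A real app would hand the prompt to a candle model on a
                // worker thread and stream tokens back into `reply`.
                self.reply = format!("(reply to: {})", self.prompt);
            }
            ui.separator();
            ui.label(self.reply.as_str());
        });
    }
}

fn main() -> Result<(), eframe::Error> {
    eframe::run_native(
        "prompt-demo",
        eframe::NativeOptions::default(),
        Box::new(|_cc| Box::new(PromptApp::default())),
    )
}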

It currently supports a selection of quantized models; see the project repository for the current list.

The first time a model is used, its weights are downloaded from Hugging Face and cached in the ~/.cache/coze folder for later use.
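
Candle-based apps commonly do this download-and-cache step with the hf-hub crate. The sketch below shows that flow under stated assumptions: the model id, file name, and HOME-based cache path are placeholders for illustration, not necessarily what coze itself uses.

use hf_hub::api::sync::ApiBuilder;
use std::path::PathBuf;

fn fetch_weights() -> Result<PathBuf, Box<dyn std::error::Error>> {
    // Unix-style home lookup, kept simple for the example.
    let cache = PathBuf::from(std::env::var("HOME")?).join(".cache/coze");
    let api = ApiBuilder::new().with_cache_dir(cache).build()?;
    // The first call downloads the file; later calls return the cached path.
    let path = api
        .model("TheBloke/Mistral-7B-Instruct-v0.2-GGUF".to_string())
        .get("mistral-7b-instruct-v0.2.Q4_K_M.gguf")?;
    Ok(path)
}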

The current version supports:

  • Prompt history navigation with fuzzy matching (see the sketch after this list).
  • History persistence across runs.
  • Token generation modes.
  • Copy prompts and replies to the clipboard.
  • Light/Dark mode.
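
The fuzzy matching mentioned in the first item can be as simple as an in-order subsequence check over the stored history. The sketch below only illustrates the idea; it is not coze's actual matcher, and the function names are made up for the example.

// A query matches an entry if its characters appear in order, not necessarily
// adjacent, ignoring case.
fn fuzzy_match(query: &str, candidate: &str) -> bool {
    let mut chars = candidate.chars().flat_map(char::to_lowercase);
    query
        .chars()
        .flat_map(char::to_lowercase)
        .all(|q| chars.by_ref().any(|c| c == q))
}

fn filter_history<'a>(history: &'a [String], query: &str) -> Vec<&'a str> {
    history
        .iter()
        .map(String::as_str)
        .filter(|entry| fuzzy_match(query, entry))
        .collect()
}

fn main() {
    let history = vec![
        "Explain quantization".to_string(),
        "Write a haiku about Rust".to_string(),
    ];
    // Matches the second entry: the letters of "wrt rust" appear in order.
    println!("{:?}", filter_history(&history, "wrt rust"));
}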

See the app Help menu for usage details.

Installation

The latest version can be installed by downloading the binaries for Linux, macOS, and Windows from the releases page, or with cargo:

cargo install --locked coze

To build locally (a debug build may be very slow):

git clone https://github.com/vincev/coze
cd coze
cargo run --release

Dependencies

~32–55MB
~1M SLoC