#model #egui #local #llm #huggingface #offline #prompt

bin+lib coze

An egui app for playing with a local open source LLM

7 releases

0.1.7 Mar 19, 2024
0.1.6 Mar 16, 2024
0.1.3 Feb 22, 2024


104 downloads per month

Apache-2.0

595KB
2.5K SLoC

An egui app for prompting local offline LLMs.

Example prompt

Description

coze is an egui application for prompting local offline LLMs using the Hugging Face candle crate.

Currently it supports the following quantized models:

The first time a model is used, its weights are downloaded from Hugging Face and cached in the ~/.cache/coze folder for later use.
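The cache location above can be inspected or cleared from the command line. A minimal sketch (assuming the ~/.cache/coze path from the README; the exact directory layout inside it is not documented here):

```shell
# Directory where coze caches downloaded model weights (per the README).
CACHE_DIR="${HOME}/.cache/coze"

# Show how much disk space the cached weights use, if any are present.
du -sh "$CACHE_DIR" 2>/dev/null || echo "no cached weights yet"

# To force a fresh download of all models, remove the cache:
# rm -rf "$CACHE_DIR"
```

Removing the cache is safe: the next time a model is selected, its weights are simply downloaded again.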

The current version supports:

  • Prompt history navigation with fuzzy matching.
  • History persistence across runs.
  • Token generation modes.
  • Copy prompts and replies to clipboard.
  • Light/Dark mode.

See the app Help menu for usage details.

Installation

The latest version can be installed by downloading the prebuilt binaries for Linux, macOS, and Windows from the releases page, or by using cargo:

cargo install --locked coze

To build and run locally (note that debug builds may be very slow):

git clone https://github.com/vincev/coze
cd coze
cargo run --release

Dependencies

~31–54MB
~1M SLoC