
2 releases: 0.1.1 (Jan 23, 2024), 0.1.0 (Aug 9, 2023)



llmvm-chat


An llmvm frontend that acts as a CLI chat interface.

Demo

[asciicast demo recording]

Installation

Install this application using cargo:

cargo install llmvm-chat

The llmvm core must also be installed. If you have not done so already, install it via:

cargo install llmvm-core

A backend must be installed and configured. The llmvm-outsource backend is recommended for OpenAI requests. Currently, the default model preset is gpt-3.5-chat, which uses this backend.

Usage

Run llmvm-chat to start the interface. Press CTRL-C when you are finished with your chat. The chat thread is persisted, and its thread ID is printed on exit.

Use the -h flag to see all options.

Use the -l flag to load the last chat thread.

Configuration

Run the chat executable to generate a configuration file at:

  • Linux: ~/.config/llmvm/chat.toml
  • macOS: ~/Library/Application Support/com.djandries.llmvm/chat.toml
  • Windows: AppData\Roaming\djandries\llmvm\config\chat.toml
Key         Required?   Description
stdio_core  No          Stdio client configuration for communicating with llmvm core. See llmvm-protocol for details.
http_core   No          HTTP client configuration for communicating with llmvm core. See llmvm-protocol for details.
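As a rough illustration, a chat.toml that points the frontend at an llmvm core over HTTP might look like the sketch below. The keys nested under http_core are assumptions here: their exact names come from llmvm-protocol's HTTP client configuration, so consult that crate's documentation before copying this.

```toml
# Hypothetical example only. The stdio_core/http_core tables are the keys
# documented above; the fields inside http_core (e.g. base_url) are
# assumptions based on llmvm-protocol and may differ in your version.
[http_core]
base_url = "http://127.0.0.1:8080"
```

If neither table is set, the frontend falls back to its defaults for reaching the core; set only one of stdio_core or http_core depending on how your core process is run.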

License

Mozilla Public License, version 2.0

Dependencies

~21–32MB
~588K SLoC