#llm #chat-bot #framework #api-bindings

nightly llm-weaver

Manage long conversations with any LLM

21 releases

1.0.4 Nov 3, 2023
0.2.1 Sep 26, 2024
0.1.96 Jul 28, 2024
0.1.9 Mar 25, 2024
0.1.1 Nov 6, 2023

#1803 in Web programming

1,475 downloads per month

MIT license

77KB
1.5K SLoC

github · crates.io · docs.rs · build status

LLM Weaver

LLM Weaver is a flexible library designed to interact with any LLM, with an emphasis on managing long conversations that exceed a model's maximum token limit, ensuring a continuous and coherent user experience.

Implementation

This library is a Rust implementation of OpenAI's tactic for handling long conversations with a token-context-bound LLM.

Once a certain threshold of context tokens is reached, the library will summarize the entire conversation and begin a new conversation with the summarized context appended to the system instructions.
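As a rough illustration of that tactic (and not llm-weaver's actual API), the sketch below keeps a running message history, estimates its token count, and, once a configurable threshold is exceeded, replaces the history with a single system message that carries a summary of the prior conversation. `Message`, `count_tokens`, and `summarize` are hypothetical placeholders.

```rust
// Minimal sketch of the threshold-and-summarize approach, assuming hypothetical
// helpers; none of these types or functions are part of llm-weaver's public API.

struct Message {
    role: String,
    content: String,
}

/// Very rough token estimate; a real implementation would use the model's tokenizer.
fn count_tokens(messages: &[Message]) -> usize {
    messages
        .iter()
        .map(|m| m.content.split_whitespace().count())
        .sum()
}

/// Placeholder for an LLM call that condenses the conversation so far.
fn summarize(messages: &[Message]) -> String {
    format!("Summary of {} prior messages", messages.len())
}

fn append_with_summarization(
    mut history: Vec<Message>,
    system_prompt: &str,
    new_message: Message,
    max_context_tokens: usize,
) -> Vec<Message> {
    history.push(new_message);

    // Once the context grows past the threshold, condense everything so far
    // and start a fresh conversation seeded with the summary.
    if count_tokens(&history) > max_context_tokens {
        let summary = summarize(&history);
        history = vec![Message {
            role: "system".into(),
            content: format!("{system_prompt}\n\nPrevious conversation summary: {summary}"),
        }];
    }

    history
}

fn main() {
    let history = append_with_summarization(
        Vec::new(),
        "You are a helpful assistant.",
        Message {
            role: "user".into(),
            content: "Hello!".into(),
        },
        3_000, // hypothetical token threshold
    );
    println!("context now holds {} message(s)", history.len());
}
```

In practice the threshold, the summarization prompt, and the token counting are all model-specific; llm-weaver's own handling of these details is documented in the crate-level docs.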

Usage

Follow the crate-level documentation for a detailed explanation of how to use the library.

Contribution

If you are passionate about this project, please feel free to fork the repository and submit pull requests for enhancements, bug fixes, or additional features.

License

LLM Weaver is distributed under the MIT License, ensuring maximum freedom for using and sharing it in your projects.

Dependencies

~25–42MB
~531K SLoC