
LLM Weaver

LLM Weaver is a flexible library for interacting with any LLM, with an emphasis on managing long conversations that exceed a model's maximum token limit, ensuring a continuous and coherent user experience.

Implementation

This library is a Rust implementation of OpenAI's recommended tactic for handling long conversations with a token-context-bound LLM.

Once the conversation reaches a configured threshold of context tokens, the library summarizes the entire conversation and begins a new one, with the summary appended to the system instructions.
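The strategy can be sketched in plain Rust. This is an illustrative outline only, not LLM Weaver's actual API: the type and function names are hypothetical, the tokenizer is a crude word-count stand-in, and `summarize` is a stub where a real implementation would call an LLM.

```rust
// Hypothetical sketch of the summarize-and-restart strategy.
// Names and constants are assumptions for illustration, not the crate's API.

struct Conversation {
    system: String,        // system instructions, grows with appended summaries
    messages: Vec<String>, // the current conversation window
}

const MAX_CONTEXT_TOKENS: usize = 100; // assumed model limit for this example

fn token_count(text: &str) -> usize {
    // Crude stand-in for a real tokenizer: one token per whitespace-separated word.
    text.split_whitespace().count()
}

fn summarize(messages: &[String]) -> String {
    // Placeholder: a real implementation would ask an LLM to summarize.
    format!("[summary of {} prior messages]", messages.len())
}

impl Conversation {
    fn push(&mut self, message: String) {
        self.messages.push(message);
        let total: usize = token_count(&self.system)
            + self.messages.iter().map(|m| token_count(m)).sum::<usize>();
        if total > MAX_CONTEXT_TOKENS {
            // Threshold exceeded: summarize the whole conversation, append the
            // summary to the system instructions, and start a fresh window.
            let summary = summarize(&self.messages);
            self.system = format!("{}\n{}", self.system, summary);
            self.messages.clear();
        }
    }
}
```

The key design point is that the summary moves into the system instructions rather than remaining a regular message, so every subsequent turn is grounded in the condensed history without it counting as conversational back-and-forth.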

Usage

Follow the crate level documentation for a detailed explanation of how to use the library.

Use Cases

  • Text-Based RPGs: Crafting coherent and persistently evolving narratives and interactions in text-based role-playing games.

  • Customer Support Chatbots: Developing chatbots that remember past user interactions and provide personalized support.

  • Educational Virtual Tutors: Implementing AI tutors that remember student interactions and tailor assistance accordingly.

  • Healthcare Virtual Assistants: Creating healthcare assistants that provide follow-up advice and reminders based on past user health queries.

  • AI-Driven MMO NPC Interactions: Enhancing MMO experiences by enabling NPCs to have contextually relevant interactions with players based on past encounters.

Contribution

If you are passionate about this project, please feel free to fork the repository and submit pull requests for enhancements, bug fixes, or additional features.

License

LLM Weaver is distributed under the MIT License, ensuring maximum freedom for using and sharing it in your projects.
