Instructed Language Model At Coordinate
The `instructed-language-model-at-coordinate` crate is a Rust library for directing precise instructions to Language Models (LMs) based on spatial agent coordinates in computational contexts. It facilitates deterministic output from LMs by embedding structured guidance and query instruction sets.
Features
- Agent Coordinate System: Manage and associate coordinates for instructing language models.
- Instructional Clarity: Avoid vague instructions with `LanguageModelOutputFormatInstruction::AvoidVagueness`.
- JSON Output: Enforce valid JSON outputs, ensuring consistency and syntactic correctness, with `LanguageModelOutputFormatInstruction::ProvideOutputAsValidJson` (see the sketch after this list).
- Detailed JSON Objects: Create detailed, coordinate-specific queries via the `emit_detailed_json_objects` method.
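The two output-format instructions above can be pictured as variants of an enum that each render to a line of system-prompt text. The sketch below is purely illustrative: the variant names come from this README, but the enum name, method, and wording are assumptions, not the crate's actual API.

```rust
// Hypothetical mirror of `LanguageModelOutputFormatInstruction`, for illustration only.
enum OutputFormatInstruction {
    AvoidVagueness,
    ProvideOutputAsValidJson,
}

impl OutputFormatInstruction {
    /// Render the instruction as a line of system-prompt text.
    fn as_prompt_text(&self) -> &'static str {
        match self {
            OutputFormatInstruction::AvoidVagueness =>
                "Be specific and concrete; avoid vague or ambiguous wording.",
            OutputFormatInstruction::ProvideOutputAsValidJson =>
                "Respond only with syntactically valid JSON.",
        }
    }
}

fn main() {
    // Compose the instructions into a single system-message preamble.
    let instructions = [
        OutputFormatInstruction::AvoidVagueness,
        OutputFormatInstruction::ProvideOutputAsValidJson,
    ];
    for instruction in &instructions {
        println!("{}", instruction.as_prompt_text());
    }
}
```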
Core Trait Implementations
- `GetSystemMessageAtAgentCoordinate`: Generates system messages based on agent coordinates, conveying essential metadata and structured instructions.
- `CreateLanguageModelQueryAtAgentCoordinate`: Formulates language model queries by integrating agent-specific instructions, producing queries ready for model consumption and interaction (one possible shape of these traits is sketched below).
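For a rough mental model of these two traits, the following sketch shows one plausible shape they could take. Only the trait names come from this README; the method signatures (which omit the model parameter for brevity), the `AgentCoordinate` fields, and the `SystemMessage` wrapper are assumptions made for illustration.

```rust
// Illustrative placeholder types; the real crate's types will differ.
struct AgentCoordinate { x: i64, y: i64 }
struct SystemMessage(String);

/// Hypothetical shape of `GetSystemMessageAtAgentCoordinate`.
trait GetSystemMessageAtAgentCoordinate {
    fn get_system_message_at_agent_coordinate(&self, coord: &AgentCoordinate) -> SystemMessage;
}

/// Hypothetical shape of `CreateLanguageModelQueryAtAgentCoordinate`.
trait CreateLanguageModelQueryAtAgentCoordinate {
    fn create_language_model_query_at_agent_coordinate(
        &self,
        coord: &AgentCoordinate,
        user_input: &str,
    ) -> String;
}

struct Demo;

impl GetSystemMessageAtAgentCoordinate for Demo {
    fn get_system_message_at_agent_coordinate(&self, coord: &AgentCoordinate) -> SystemMessage {
        // Convey the coordinate as structured context for the model.
        SystemMessage(format!("You are the agent at coordinate ({}, {}).", coord.x, coord.y))
    }
}

impl CreateLanguageModelQueryAtAgentCoordinate for Demo {
    fn create_language_model_query_at_agent_coordinate(
        &self,
        coord: &AgentCoordinate,
        user_input: &str,
    ) -> String {
        // Combine the coordinate-specific system message with the user's query.
        let system = self.get_system_message_at_agent_coordinate(coord);
        format!("{}\n\nUser query: {}", system.0, user_input)
    }
}

fn main() {
    let demo = Demo;
    let coord = AgentCoordinate { x: 3, y: 7 };
    println!("{}", demo.create_language_model_query_at_agent_coordinate(&coord, "Summarize the scene."));
}
```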
Example Usage
```rust
use instructed_language_model_at_coordinate::{InstructedLanguageModelAtCoordinate, LanguageModelOutputFormatInstruction};

// `AgentCoordinate` and `LanguageModelType` are assumed to come from companion crates in the same workspace.
let coord = AgentCoordinate::new();
let model = LanguageModelType::new();

// Build an instructed model that emits detailed, coordinate-specific JSON objects, then formulate a query.
let llm = InstructedLanguageModelAtCoordinate::emit_detailed_json_objects(&coord);
let request = llm.create_language_model_query_at_agent_coordinate(&model, &coord, &"Your query");
println!("{}", request);
```
Design Philosophy
The crate emphasizes precision in language modeling tasks, providing direct and comprehensive instructions to language models to ensure output reliability and accuracy. By converting complex instructions into structured JSON and allowing coordinate-specific modeling, it supports robust communication between computational systems and language interfaces.
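As a concrete illustration of "converting complex instructions into structured JSON", a payload along these lines could be assembled with the widely used `serde_json` crate. The field names and values below are assumptions for illustration; the crate's actual JSON schema is not documented here.

```rust
use serde_json::json;

fn main() {
    // Illustrative structured-instruction payload; field names are assumed, not the crate's schema.
    let payload = json!({
        "agent_coordinate": { "x": 3, "y": 7 },
        "output_format": ["AvoidVagueness", "ProvideOutputAsValidJson"],
        "query": "Summarize the agent's current task."
    });

    // The structured payload can then be embedded in a system or user message.
    println!("{}", serde_json::to_string_pretty(&payload).unwrap());
}
```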
This README.md file was generated by an AI model and may not be 100% accurate; however, it should be pretty good.