# valve

OpenAI-compatible proxy with adaptive token budgeting, per-session dashboards, and optional metrics exporters.
## Quick start

```sh
cargo install --path .
```

Create `config.toml` in `~/.config/valve/` (Linux/macOS) or `%APPDATA%\valve\` (Windows):

```toml
[server]
host = "127.0.0.1"
port = 7532

[proxy]
default_provider = "openai"

[proxy.providers.openai]
kind = "open_ai"
base_url = "https://api.openai.com/v1"
api_key = "sk-your-key"

[adapters]
observers = ["telemetry"]
```
Run the proxy:

```sh
valve
```

The terminal UI starts automatically. Key bindings:

- `Tab`, arrow keys, or `h`/`l`: move between tabs
- `1`–`4`: pick a history window; `+`/`-`: cycle windows
- `Up`/`Down` or `j`/`k`: navigate sessions
- `r`: reset history
- `q`: quit

Append `--headless` for log-only mode.
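Because valve speaks the OpenAI wire format, any OpenAI-compatible client can be pointed at it. A minimal standard-library sketch, assuming valve forwards the usual `/v1/chat/completions` route on the host/port from the sample config (the model name is illustrative):

```python
import json
from urllib import request

# Host/port from the [server] section of the sample config; the path is the
# standard OpenAI chat-completions route, assumed to be what valve exposes.
VALVE_URL = "http://127.0.0.1:7532/v1/chat/completions"

def build_payload(prompt: str, model: str = "gpt-4o-mini") -> dict:
    # Standard OpenAI chat request body; valve proxies it to the upstream provider.
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}

def chat(prompt: str) -> str:
    req = request.Request(
        VALVE_URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

# Example (requires valve running locally):
# print(chat("Say hello through valve"))
```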
## Telemetry exporters

Add an optional Prometheus scrape endpoint or OTLP exporter in `config.toml`:

```toml
[telemetry.prometheus]
enabled = true
listen = "127.0.0.1:9898"

[telemetry.otlp]
enabled = true
endpoint = "http://localhost:4317"
protocol = "grpc"
interval_secs = 60
```
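With the Prometheus endpoint enabled, metrics are served in plain-text exposition format at the configured `listen` address and can be scraped with any HTTP client. A small parser sketch (the metric names in the sample are hypothetical, not valve's actual metric set):

```python
def parse_metrics(text: str) -> dict[str, float]:
    """Parse simple Prometheus text exposition lines of the form `name{labels} value`."""
    metrics = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):  # skip blank, HELP, and TYPE lines
            continue
        # Split on the last space: everything before it is the metric name
        # (including any label set), everything after is the sample value.
        name, _, value = line.rpartition(" ")
        metrics[name] = float(value)
    return metrics

# Hypothetical sample of what a scrape of 127.0.0.1:9898 might return:
sample = """
# HELP proxy_requests_total Total proxied requests.
proxy_requests_total{provider="openai"} 42
proxy_tokens_budgeted 1.5e3
"""
print(parse_metrics(sample))
```

This covers only the simple line shape shown; a real scraper would also handle timestamps and escaped label values.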
## Examples

```sh
cargo run --example agent_dialogue
cargo run --example agent_dialogue -- --headless
```
## Benchmarks

```sh
cargo bench --bench throughput
```
## License

Dual-licensed under MIT or Apache-2.0. See LICENSE-MIT and LICENSE-APACHE for details.