LLM Link
🔗 A user-friendly LLM proxy service with built-in support for popular AI coding tools
LLM Link provides zero-configuration access to LLM providers through multiple API formats, with optimized built-in support for popular AI applications.
✨ Key Features
- 🎯 **Application-Oriented**: Built-in configurations for popular AI coding tools
- ⚡ **Zero Configuration**: One-command startup for common use cases
- 🔌 **Multi-Protocol**: Simultaneous OpenAI, Ollama, and Anthropic API support
- 🌐 **9 LLM Providers**: OpenAI, Anthropic, Zhipu, Aliyun, Volcengine, Tencent, Longcat, Moonshot, Ollama
- 🔍 **Dynamic Model Discovery**: REST API to query all supported providers and models
- 🔄 **Hot-Reload Configuration**: Update API keys and switch providers without restart
- 🚀 **Production Ready**: Built with Rust for performance and reliability
🎯 Supported Applications
| Application | Protocol | Port | Authentication | Status |
|---|---|---|---|---|
| Codex CLI | OpenAI API | 8088 | Bearer Token | ✅ Ready |
| Zed | Ollama API | 11434 | None | ✅ Ready |
| Aider | OpenAI API | 8090 | Bearer Token | ✅ Ready |
| OpenHands | OpenAI API | 8091 | Bearer Token | ✅ Ready |
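The port column above is also the base URL each tool should be pointed at. As a sketch, the helper below (illustrative only, not part of llm-link; ports taken from the table, and the `/v1` suffix assumed for the OpenAI-style endpoints as in the testing examples further down) maps an application name to its proxy base URL:

```shell
#!/bin/sh
# Map an application to the llm-link base URL its protocol expects.
# Ports come from the table above; adjust if you override --port.
base_url() {
  case "$1" in
    codex-cli) echo "http://localhost:8088/v1" ;;  # OpenAI-style API
    zed)       echo "http://localhost:11434" ;;    # Ollama-style API
    aider)     echo "http://localhost:8090/v1" ;;  # OpenAI-style API
    openhands) echo "http://localhost:8091/v1" ;;  # OpenAI-style API
    *)         echo "unknown app: $1" >&2; return 1 ;;
  esac
}

base_url aider   # -> http://localhost:8090/v1
```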
📖 Full Application Documentation →
🚀 Quick Start
Installation
# Install from crates.io (Recommended)
cargo install llm-link
# Or via Homebrew (macOS)
brew tap lipish/llm-link && brew install llm-link
# Or via pip (macOS / Linux)
pip install pyllmlink
📖 Complete Installation Guide →
Basic Usage
# For Codex CLI
./llm-link --app codex-cli --api-key "your-auth-token"
# For Zed
./llm-link --app zed
# For Aider (using open-source models)
./llm-link --app aider --provider zhipu --model glm-4.6 --api-key "your-zhipu-key"
# For OpenHands
./llm-link --app openhands --provider anthropic --model claude-3-5-sonnet --api-key "your-anthropic-key"
📖 Detailed Configuration Guide →
📋 Help & Information
# List all supported applications
./llm-link --list-apps
# Get detailed setup guide for specific application
./llm-link --app-info aider
# List available models for a provider
./llm-link --provider zhipu --list-models
🔌 Protocol Mode
Use multiple protocols simultaneously for maximum flexibility:
./llm-link --protocols openai,ollama,anthropic --provider zhipu --model glm-4.6
📖 Protocol Mode Documentation →
🏗️ Architecture
┌──────────────────┐     ┌──────────────────┐     ┌──────────────────┐
│    AI Tools      │     │    LLM Link      │     │  LLM Providers   │
│                  │     │                  │     │                  │
│  • Codex CLI     │────▶│  • Protocol      │────▶│  • OpenAI        │
│  • Zed IDE       │     │    Conversion    │     │  • Anthropic     │
│  • Aider         │     │  • Format        │     │  • Zhipu         │
│  • OpenHands     │     │    Adaptation    │     │  • Aliyun        │
└──────────────────┘     └──────────────────┘     └──────────────────┘
📖 Architecture Documentation →
🔧 Advanced Usage
Custom Configuration
# Custom port and host
./llm-link --app aider --provider zhipu --model glm-4.6 --port 8095 --host 0.0.0.0
# With authentication
./llm-link --app aider --provider zhipu --model glm-4.6 --auth-key "your-secret-token"
Environment Variables
# Provider API keys
export ZHIPU_API_KEY="your-zhipu-api-key"
export OPENAI_API_KEY="sk-xxx"
export ANTHROPIC_API_KEY="sk-ant-xxx"
# LLM Link authentication
export LLM_LINK_API_KEY="your-auth-token"
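One convenience when juggling several provider keys (ordinary shell, not an llm-link feature): keep them in a local `.env` file and auto-export everything it defines before starting the proxy. The file contents below are placeholders:

```shell
#!/bin/sh
# Write the keys once (placeholder values shown)...
cat > .env <<'EOF'
ZHIPU_API_KEY=your-zhipu-api-key
LLM_LINK_API_KEY=your-auth-token
EOF

# ...then auto-export every variable the file defines:
# `set -a` marks all subsequently assigned variables for export.
set -a
. ./.env
set +a

echo "$ZHIPU_API_KEY"   # -> your-zhipu-api-key
```

Remember to add `.env` to `.gitignore` so real keys never land in version control.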
📖 Advanced Configuration →
🧪 Testing
# Test health endpoint
curl http://localhost:8090/health
# Test OpenAI API
curl -X POST http://localhost:8090/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer your-token" \
-d '{"model": "glm-4.6", "messages": [{"role": "user", "content": "Hello!"}]}'
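In the standard OpenAI response shape, the reply text comes back nested under `choices[0].message.content`. A canned sample response (the content string is made up for illustration) shows how to pull it out with stdlib Python, without needing a running server:

```shell
#!/bin/sh
# A minimal OpenAI-format response body, as an OpenAI-compatible
# endpoint would return it (illustrative sample, not real output):
response='{"choices":[{"message":{"role":"assistant","content":"Hi there!"}}]}'

# Extract the assistant reply using Python's stdlib json module:
echo "$response" | python3 -c \
  'import sys, json; print(json.load(sys.stdin)["choices"][0]["message"]["content"])'
# -> Hi there!
```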
📖 Testing & Troubleshooting →
📚 Full Documentation
📖 Complete Documentation Site →
- Getting Started - Installation and basic setup
- Application Guides - Detailed integration for each tool
- Configuration - Advanced configuration options
- Architecture - System design and internals
- API Reference - REST API documentation
🤝 Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
📄 License
This project is licensed under the MIT License - see the LICENSE file for details.