A command-line tool that uses AI to streamline your git workflow - from generating commit messages to explaining complex changes, all without requiring an API key.
Features 🔅
- Smart Commit Messages: Generate conventional commit messages for your staged changes
- Git History Insights: Understand what changed in any commit, branch, or your current work
- Interactive Search: Find and explore commits using fuzzy search
- Change Analysis: Ask questions about specific changes and their impact
- Zero Config: Works instantly without an API key, using Phind by default
- Flexible: Works with any git workflow and supports multiple AI providers
- Rich Output: Markdown support for readable explanations and diffs (requires: mdcat)
Getting Started 🔅
Prerequisites
Before you begin, ensure you have:
- git - installed on your system
- fzf (optional) - Required for the lumen list command
- mdcat (optional) - Required for pretty output formatting
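A quick way to confirm these are available before you start (fzf and mdcat only matter for the features noted above):
# Check that the tools are on your PATH
git --version
fzf --version      # optional, for lumen list
mdcat --version    # optional, for pretty output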
Installation
Using Homebrew (macOS and Linux)
brew install jnsahaj/lumen/lumen
Using Cargo
[!IMPORTANT]
cargo is the package manager for rust and is installed automatically when you install rust. See the installation guide.
cargo install lumen
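Either method should place the lumen binary on your PATH (for cargo, assuming ~/.cargo/bin is on it). A quick sanity check, assuming lumen --help prints usage like most CLI tools:
# Confirm the install worked
lumen --help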
Usage 🔅
Generate Commit Messages
Create meaningful commit messages for your staged changes:
# Basic usage - generates a commit message based on staged changes
lumen draft
# Output: "feat(button.tsx): Update button color to blue"
# Add context for more meaningful messages
lumen draft --context "match brand guidelines"
# Output: "feat(button.tsx): Update button color to align with brand identity guidelines"
Explain Changes
Understand what changed and why:
# Explain current changes in your working directory
lumen explain --diff # All changes
lumen explain --diff --staged # Only staged changes
# Explain specific commits
lumen explain HEAD # Latest commit
lumen explain abc123f # Specific commit
lumen explain HEAD~3..HEAD # Last 3 commits
lumen explain main..feature/A # Branch comparison
lumen explain main...feature/A # Branch comparison (merge base)
# Ask specific questions about changes
lumen explain --diff --query "What's the performance impact of these changes?"
lumen explain HEAD --query "What are the potential side effects?"
Interactive Mode
# Launch interactive fuzzy finder to search through commits (requires: fzf)
lumen list
Tips & Tricks
# Copy commit message to clipboard
lumen draft | pbcopy # macOS
lumen draft | xclip -selection c # Linux
# View the commit message and copy it
lumen draft | tee >(pbcopy)
# Open in your favorite editor
lumen draft | code -
# Directly commit using the generated message
lumen draft | git commit -F -
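If you commit this way often, a git alias keeps it to one command. A minimal sketch - the alias name lcommit is an arbitrary choice:
# Define an alias that drafts a message and commits in one step
git config --global alias.lcommit '!lumen draft | git commit -F -'
# Then, with changes staged:
git lcommit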
AI Providers 🔅
Configure your preferred AI provider:
# Using CLI arguments
lumen -p openai -k "your-api-key" -m "gpt-4o" draft
# Using environment variables
export LUMEN_AI_PROVIDER="openai"
export LUMEN_API_KEY="your-api-key"
export LUMEN_AI_MODEL="gpt-4o"
Supported Providers
Provider | API Key Required | Models
---|---|---
Phind phind (Default) | No | Phind-70B
Groq groq | Yes (free) | llama2-70b-4096, mixtral-8x7b-32768 (default: mixtral-8x7b-32768)
OpenAI openai | Yes | gpt-4o, gpt-4o-mini, gpt-4, gpt-3.5-turbo (default: gpt-4o-mini)
Claude claude | Yes | see list (default: claude-3-5-sonnet-20241022)
Ollama ollama | No (local) | see list (required)
OpenRouter openrouter | Yes | see list (default: anthropic/claude-3.5-sonnet)
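For example, switching a single run to Groq's free tier looks like this (the API key below is a placeholder):
# One-off run against Groq instead of the default Phind provider
lumen -p groq -k "your-groq-api-key" -m "mixtral-8x7b-32768" draft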
Advanced Configuration 🔅
Configuration File
Create a lumen.config.json at your project root, or specify a custom path with --config:
{
"provider": "openai",
"model": "gpt-4o",
"api_key": "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
"draft": {
"commit_types": {
"docs": "Documentation only changes",
"style": "Changes that do not affect the meaning of the code",
"refactor": "A code change that neither fixes a bug nor adds a feature",
"perf": "A code change that improves performance",
"test": "Adding missing tests or correcting existing tests",
"build": "Changes that affect the build system or external dependencies",
"ci": "Changes to our CI configuration files and scripts",
"chore": "Other changes that don't modify src or test files",
"revert": "Reverts a previous commit",
"feat": "A new feature",
"fix": "A bug fix"
}
}
}
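The file is picked up from the project root automatically; to keep it elsewhere, point lumen at it with --config (the path below is just an example):
# Use a config file stored outside the project root
lumen --config ./configs/lumen.config.json draft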
Configuration Precedence
Options are applied in the following order (highest to lowest priority):
- CLI Flags
- Configuration File
- Environment Variables
- Default options
Example: Using different providers for different projects:
# Set global defaults in .zshrc/.bashrc
export LUMEN_AI_PROVIDER="openai"
export LUMEN_AI_MODEL="gpt-4o"
export LUMEN_API_KEY="sk-xxxxxxxxxxxxxxxxxxxxxxxx"
# Override per project using config file
{
"provider": "ollama",
"model": "llama3.2"
}
# Or override using CLI flags
lumen -p "ollama" -m "llama3.2" draft