`lumen` is a command-line tool that uses AI to generate commit messages, summarise git diffs or past commits, and more, without requiring an API key.
## Features 🔅
- Generate a commit message for staged changes
- Generate a summary of the changes in a git commit by providing its SHA-1
- Generate a summary of the changes in a git diff (staged or unstaged)
- Ask questions about a specific change
- Fuzzy-search for a commit to summarise
- Free and unlimited - no API key required to work out of the box
- Pretty output formatting enabled by Markdown
- Supports multiple AI providers
## Usage 🔅
Try `lumen --help`.
To generate a commit message for staged changes (see Advanced Configuration to configure commit types):

```sh
lumen draft
```
The commit message can be piped to other commands:

```sh
# copy the commit message to the clipboard (macOS and Linux, respectively)
lumen draft | pbcopy
lumen draft | xclip -selection clipboard

# open the commit message in your code editor
lumen draft | code -

# directly commit with the generated message
lumen draft | git commit -F -
```
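If you commit this way often, the last pipe can be wrapped in a git alias. A minimal sketch, using a hypothetical alias name `ai-commit` (not part of lumen itself):

```sh
# define a git alias that drafts a message with lumen and commits with it
git config --global alias.ai-commit '!lumen draft | git commit -F -'

# then, with changes staged:
git ai-commit
```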
The AI generates more meaningful commit messages when you provide context:

```sh
lumen draft
# Output: "feat(button.tsx): Change button color to blue"

lumen draft --context "match brand guidelines"
# Output: "feat(button.tsx): Update button color to align with brand identity"
```
To summarise a commit, pass in its SHA-1:

```sh
lumen explain HEAD
lumen explain cc50651f
```
To use the interactive fuzzy-finder (requires: fzf):

```sh
lumen list
```
To generate a summary for the current git diff:

```sh
lumen explain --diff
lumen explain --diff --staged
```
You can ask a question about the diff (or a commit) using `--query`:

```sh
lumen explain --diff --query "how will this change affect performance?"
lumen explain HEAD~2 --query "how can this be improved?"
```
The AI provider can be configured using CLI arguments or environment variables (see also: Advanced Configuration).

```sh
-p, --provider <PROVIDER>  [env: LUMEN_AI_PROVIDER] [default: phind] [possible values: openai, phind, groq, claude, ollama, openrouter]
-k, --api-key <API_KEY>    [env: LUMEN_API_KEY]
-m, --model <MODEL>        [env: LUMEN_AI_MODEL]

# eg: lumen -p="openai" -k="<your-api-key>" -m="gpt-4o" explain HEAD
# eg: lumen -p="openai" -k="<your-api-key>" -m="gpt-4o" draft
# eg: LUMEN_AI_PROVIDER="openai" LUMEN_API_KEY="<your-api-key>" LUMEN_AI_MODEL="gpt-4o" lumen list
```
### Supported providers
| Provider | API Key Required | Models |
|---|---|---|
| Phind `phind` (Default) | No | Phind-70B |
| Groq `groq` | Yes (free) | `llama2-70b-4096`, `mixtral-8x7b-32768` (default: `mixtral-8x7b-32768`) |
| OpenAI `openai` | Yes | `gpt-4o`, `gpt-4o-mini`, `gpt-4`, `gpt-3.5-turbo` (default: `gpt-4o-mini`) |
| Claude `claude` | Yes | see list (default: `claude-3-5-sonnet-20241022`) |
| Ollama `ollama` | No (local) | see list (required) |
| OpenRouter `openrouter` | Yes | see list (default: `anthropic/claude-3.5-sonnet`) |
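For example, to run a single command against a keyed provider from the table, you can combine the flags shown above. This sketch assumes you keep your Groq key in an environment variable of your own choosing (here `GROQ_API_KEY`):

```sh
# one-off explain using Groq with an explicit model from the table above
lumen -p "groq" -k "$GROQ_API_KEY" -m "mixtral-8x7b-32768" explain HEAD
```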
## Installation 🔅
### Using Homebrew (macOS and Linux)

```sh
brew tap jnsahaj/lumen
brew install --formula lumen
```
### Using Cargo

> [!IMPORTANT]
> `cargo` is a package manager for `rust` and is installed automatically when you install `rust`. See the installation guide.

```sh
cargo install lumen
```
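After installing with either method, a quick sanity check is to confirm the binary is on your `PATH` and print the built-in help (the same `lumen --help` mentioned in Usage):

```sh
# confirm the binary is reachable and show the available subcommands
which lumen
lumen --help
```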
## Prerequisites 🔅
- git
- fzf (optional): Required for the `lumen list` command
- mdcat (optional): Required for pretty output formatting (install sketch for the optional tools below)
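The optional tools are available from most package managers; a minimal sketch assuming Homebrew (package names may differ on other platforms):

```sh
# optional: fuzzy-finder used by `lumen list`, and Markdown renderer for pretty output
brew install fzf mdcat
```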
## Advanced Configuration 🔅

`lumen` can be configured using a configuration file. You can place `lumen.config.json` at the root of your git-tracked project, or pass `--config "./path/to/config.json"`.

```json
{
  "provider": "...",
  "model": "...",
  "api_key": "..."
}
```
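For example, a config file kept somewhere other than the repository root can be pointed to explicitly; the path below is just a placeholder, and the flag placement follows the same pattern as the provider flags shown earlier:

```sh
# use an explicit config file instead of lumen.config.json at the repo root
lumen --config "./path/to/config.json" draft
```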
You can also specify command-specific options (currently only `draft.commit_types` is supported):

```json
{
  "draft": {
    "commit_types": {
      "<type>": "<description>"
    }
  }
}
```
An example configuration file can look like:

```json
{
  "provider": "openai",
  "model": "gpt-4o",
  "api_key": "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  "draft": {
    "commit_types": {
      "docs": "Documentation only changes",
      "style": "Changes that do not affect the meaning of the code",
      "refactor": "A code change that neither fixes a bug nor adds a feature",
      "perf": "A code change that improves performance",
      "test": "Adding missing tests or correcting existing tests",
      "build": "Changes that affect the build system or external dependencies",
      "ci": "Changes to our CI configuration files and scripts",
      "chore": "Other changes that don't modify src or test files",
      "revert": "Reverts a previous commit",
      "feat": "A new feature",
      "fix": "A bug fix"
    }
  }
}
```
`lumen` uses the `commit_types` specified in the example above as defaults if none are provided.
### Precedence Order
Since there are multiple ways of configuring options for `lumen`, there is a defined precedence order:

CLI Flags > Configuration File > Environment Variables > Default options
This also allows for mixing different methods.
For example, if you want to use Ollama for a specific project, and OpenAI for all others, you can do the following:

```sh
# .zshrc/.bashrc
export LUMEN_AI_PROVIDER="openai"
export LUMEN_API_KEY="sk-xxxxxxxxxxxxxxxxxxxxxxxx"
```
The above will apply globally. You can override this for a specific project by either specifying a configuration file or using CLI flags.
```json
{
  "provider": "ollama",
  "model": "llama3.2"
}
```

OR

```sh
lumen -p "ollama" -m "llama3.2" draft
```