#commit-message #git-commit #commit #git #ai

app lumen

lumen is a command-line tool that uses AI to generate commit messages, summarise git diffs or past commits, and more, without requiring an API key.

23 releases (7 stable)

new 1.6.0 Nov 21, 2024
0.6.1 Nov 3, 2024

#25 in Command line utilities


2,400 downloads per month

MIT license

61KB
1.5K SLoC

lumen


A command-line tool that uses AI to streamline your git workflow - from generating commit messages to explaining complex changes, all without requiring an API key.

demo


Features 🔅

  • Smart Commit Messages: Generate conventional commit messages for your staged changes
  • Git History Insights: Understand what changed in any commit, branch, or your current work
  • Interactive Search: Find and explore commits using fuzzy search
  • Change Analysis: Ask questions about specific changes and their impact
  • Zero Config: Works instantly without an API key, using Phind by default
  • Flexible: Works with any git workflow and supports multiple AI providers
  • Rich Output: Markdown support for readable explanations and diffs (requires: mdcat)

Getting Started 🔅

Prerequisites

Before you begin, ensure you have:

  1. git installed on your system
  2. fzf (optional) - Required for lumen list command
  3. mdcat (optional) - Required for pretty output formatting
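If either optional tool is missing, it can usually be installed from your package manager. For example, on a system with Homebrew (a sketch; the package names are assumed to match the Homebrew formulae):

# Optional helpers, assuming Homebrew is available
brew install fzf     # fuzzy finder used by `lumen list`
brew install mdcat   # markdown renderer used for pretty output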

Installation

Using Homebrew (macOS and Linux)

brew install jnsahaj/lumen/lumen

Using Cargo

[!IMPORTANT] Cargo is the package manager for Rust and is installed automatically when you install Rust. See the installation guide

cargo install lumen
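To verify the installation, printing the CLI help is a quick sanity check (assuming lumen is now on your PATH):

# Confirm lumen is installed and list its subcommands
lumen --help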

Usage 🔅

Generate Commit Messages

Create meaningful commit messages for your staged changes:

# Basic usage - generates a commit message based on staged changes
lumen draft
# Output: "feat(button.tsx): Update button color to blue"

# Add context for more meaningful messages
lumen draft --context "match brand guidelines"
# Output: "feat(button.tsx): Update button color to align with brand identity guidelines"

Explain Changes

Understand what changed and why:

# Explain current changes in your working directory
lumen explain --diff                  # All changes
lumen explain --diff --staged         # Only staged changes

# Explain specific commits
lumen explain HEAD                    # Latest commit
lumen explain abc123f                 # Specific commit
lumen explain HEAD~3..HEAD            # Last 3 commits
lumen explain main..feature/A         # Branch comparison

# Ask specific questions about changes
lumen explain --diff --query "What's the performance impact of these changes?"
lumen explain HEAD --query "What are the potential side effects?"

Interactive Mode

# Launch interactive fuzzy finder to search through commits (requires: fzf)
lumen list

Tips & Tricks

# Copy commit message to clipboard
lumen draft | pbcopy                  # macOS
lumen draft | xclip -selection c      # Linux

# Open in your favorite editor
lumen draft | code -      

# Directly commit using the generated message
lumen draft | git commit -F -           
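If you commit this way often, one possible convenience is a git alias that pipes the drafted message straight into git commit; the alias name ai-commit below is only an example:

# Define once, then run `git ai-commit` to commit staged changes with a generated message
git config --global alias.ai-commit '!lumen draft | git commit -F -'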

AI Providers 🔅

Configure your preferred AI provider:

# Using CLI arguments
lumen -p openai -k "your-api-key" -m "gpt-4o" draft

# Using environment variables
export LUMEN_AI_PROVIDER="openai"
export LUMEN_API_KEY="your-api-key"
export LUMEN_AI_MODEL="gpt-4o"

Supported Providers

Provider                  API Key Required   Models
Phind (phind, default)    No                 Phind-70B
Groq (groq)               Yes (free)         llama2-70b-4096, mixtral-8x7b-32768 (default: mixtral-8x7b-32768)
OpenAI (openai)           Yes                gpt-4o, gpt-4o-mini, gpt-4, gpt-3.5-turbo (default: gpt-4o-mini)
Claude (claude)           Yes                see list (default: claude-3-5-sonnet-20241022)
Ollama (ollama)           No (local)         see list (model must be specified)
OpenRouter (openrouter)   Yes                see list (default: anthropic/claude-3.5-sonnet)
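For example, a keyless local setup with Ollama (which needs an explicit model) or Groq with a free key could look like the sketch below; the model names and the GROQ_API_KEY variable are illustrative, not required values:

# Local Ollama - no API key, but a model must be specified
lumen -p ollama -m "llama3.2" explain HEAD

# Groq - free API key required
lumen -p groq -k "$GROQ_API_KEY" -m "mixtral-8x7b-32768" draft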

Advanced Configuration 🔅

Configuration File

Create a lumen.config.json at your project root or specify a custom path with --config:

{
  "provider": "openai",
  "model": "gpt-4o",
  "api_key": "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  "draft": {
    "commit_types": {
      "docs": "Documentation only changes",
      "style": "Changes that do not affect the meaning of the code",
      "refactor": "A code change that neither fixes a bug nor adds a feature",
      "perf": "A code change that improves performance",
      "test": "Adding missing tests or correcting existing tests",
      "build": "Changes that affect the build system or external dependencies",
      "ci": "Changes to our CI configuration files and scripts",
      "chore": "Other changes that don't modify src or test files",
      "revert": "Reverts a previous commit",
      "feat": "A new feature",
      "fix": "A bug fix"
    }
  }
}
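Because --config accepts a custom path, the file does not have to live at the project root; a hypothetical invocation (the path is only an example):

# Use a config file stored outside the project root
lumen --config ~/dotfiles/lumen.config.json draft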

Configuration Precedence

Options are applied in the following order (highest to lowest priority):

  1. CLI Flags
  2. Configuration File
  3. Environment Variables
  4. Default options

Example: Using different providers for different projects:

# Set global defaults in .zshrc/.bashrc
export LUMEN_AI_PROVIDER="openai"
export LUMEN_AI_MODEL="gpt-4o"
export LUMEN_API_KEY="sk-xxxxxxxxxxxxxxxxxxxxxxxx"

# Override per project using config file
{
  "provider": "ollama",
  "model": "llama3.2"
}

# Or override using CLI flags
lumen -p "ollama" -m "llama3.2" draft

Dependencies

~8–20MB
~274K SLoC