An AI-powered git commit message generator written in Rust.
- Automatically generates commit messages based on staged changes in your repository
- Follows the Conventional Commits format
- Supports multiple AI providers:
  - Ollama (local inference)
  - OpenAI API (GPT models)
  - Gemini
- Customizable with different models and parameters
- Generate multiple message options
- Add specific instructions to guide message generation
- Integrates with lazygit for a seamless workflow
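As a concrete illustration of the Conventional Commits format the tool targets, a generated header takes the shape `type(scope): subject`. The message below is a hypothetical example, not actual tool output:

```shell
# Conventional Commits header: type(scope): subject
# (hypothetical message; real output depends on your staged diff and model)
msg="feat(auth): add token refresh on expiry"
echo "$msg"
```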
You can download and install pre-built binaries directly from GitHub Releases:
```sh
# Linux (amd64)
sudo curl -L https://github.com/CJHwong/rs-git-msg/releases/latest/download/rs-git-msg-linux-amd64 -o /usr/local/bin/rs-git-msg && sudo chmod +x /usr/local/bin/rs-git-msg

# macOS (amd64)
sudo curl -L https://github.com/CJHwong/rs-git-msg/releases/latest/download/rs-git-msg-macos-amd64 -o /usr/local/bin/rs-git-msg && sudo chmod +x /usr/local/bin/rs-git-msg
```
The easiest way to install rs-git-msg is using the provided install script:
```sh
# Clone the repository
git clone https://github.com/CJHwong/rs-git-msg.git
cd rs-git-msg

# Run the install script
./scripts/install.sh
```
The script will:
- Build the binary with optimizations
- Install it to an appropriate location in your PATH
- Set up necessary environment configurations if needed
1. Clone the repository
2. Build with Cargo:
   ```sh
   cargo build --release
   ```
3. Move the built executable to a location in your `PATH`:
   ```sh
   cp target/release/rs-git-msg ~/.local/bin/  # or: sudo cp target/release/rs-git-msg /usr/local/bin/
   ```
To remove rs-git-msg from your system, you can use the uninstall script:
```sh
./scripts/uninstall.sh
```
This script will:
- Remove the rs-git-msg binary from standard installation locations
- Clean up any configuration files created during use
To manually uninstall, simply remove the binary from where you installed it:
```sh
# If installed to ~/.local/bin
rm ~/.local/bin/rs-git-msg

# Or if installed to /usr/local/bin
sudo rm /usr/local/bin/rs-git-msg

# Optionally remove config files
rm -rf ~/.config/rs-git-msg
```
Basic usage:
```sh
# Stage some changes first
git add .

# Generate a commit message
rs-git-msg
```
```text
Usage: rs-git-msg [OPTIONS]

Options:
  -n, --number <NUMBERS>             Number of commit messages to generate (1-5) [default: 1]
  -i, --instructions <INSTRUCTIONS>  Additional context or instructions for the AI
  -v, --verbose                      Enable verbose output
  -p, --provider <PROVIDER>          AI provider to use [default: ollama] [possible values: ollama, openai, gemini]
  -m, --model <MODEL>                Model name to use [default: qwen2.5-coder]
  -k, --api-key <API_KEY>            API key for the provider (not needed for Ollama)
  -u, --api-url <API_URL>            API base URL (defaults to provider's standard URL)
  -h, --help                         Print help
  -V, --version                      Print version
```
```sh
# Using Ollama with a different model
rs-git-msg -m llama3

# Generate 3 message options
rs-git-msg -n 3

# Using OpenAI's GPT-3.5 Turbo
rs-git-msg -p openai -m gpt-3.5-turbo -k your_api_key_here

# Using Gemini
rs-git-msg -p gemini -m gemini-2.0-flash -k your_api_key_here

# Enable verbose output for debugging
rs-git-msg -v
```
You can integrate rs-git-msg with lazygit for an even smoother workflow:
1. Run the setup script:
   ```sh
   ./scripts/setup-lazygit.sh
   ```
2. In lazygit, you can now press `ctrl + g` (or `cmd + g` on macOS) in the files view to generate a commit message automatically.
The command will:
- Generate a commit message using rs-git-msg
- Automatically populate the commit message field
- Use your configured AI provider and settings
If you prefer to manually configure lazygit without running the setup script:
1. Locate your lazygit config file:
   - macOS: `~/Library/Application Support/lazygit/config.yml`
   - Linux/Others: `~/.config/lazygit/config.yml`
2. Add the following configuration to your `config.yml`:
   ```yaml
   customCommands:
     - key: <c-g>
       prompts:
         - type: input
           title: Additional Instructions (optional)
           key: Instructions
           initialValue: ""
         - type: menuFromCommand
           title: AI Commit Messages
           key: Msg
           command: 'rs-git-msg -n 5 {{if .Form.Instructions}}-i "{{.Form.Instructions}}"{{end}}'
       command: git commit -m "{{.Form.Msg}}"
       context: 'files'
       description: 'Generate commit message using rs-git-msg'
       loadingText: 'Generating commit messages...'
       stream: false
   ```
3. Save the file, and lazygit should now have the new keybinding.
- `RS_GIT_MSG_API_KEY`: Set your API key for OpenAI or Gemini
1. Install Ollama
2. Pull the desired model (or another model of your choice):
   ```sh
   ollama pull qwen2.5-coder
   ```
3. Run rs-git-msg (no API key needed)
1. Create an account at OpenAI
2. Generate an API key
3. Run rs-git-msg with `-p openai -k your_api_key`
To use the Gemini provider, you need to:
1. Obtain an API key from Google AI Studio.
2. Set the `RS_GIT_MSG_API_KEY` environment variable with your Gemini API key.
3. Use the `--provider gemini` flag when running `rs-git-msg`.
Example:

```sh
export RS_GIT_MSG_API_KEY="YOUR_GEMINI_API_KEY"
rs-git-msg --provider gemini
```
You can also specify the model and API URL if needed:
```sh
rs-git-msg --provider gemini --model gemini-2.0-flash --api-url https://generativelanguage.googleapis.com
```
This project uses `cargo-husky` to manage Git hooks. When you clone the repository and run any Cargo command, the Git hooks are set up automatically.
The pre-commit hook will:
- Format your code with `cargo fmt`
- Run tests with `cargo test`
- Run linting checks with `cargo clippy`
- Verify the code compiles with `cargo check`
If any of these steps fail, the commit will be prevented.
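For reference, `cargo-husky` hooks of this kind are typically enabled through feature flags in `Cargo.toml`. The snippet below is a sketch of such a configuration and may not match this project's exact setup:

```toml
# Hypothetical cargo-husky configuration enabling a pre-commit hook
# that runs fmt, test, clippy, and check (feature names are cargo-husky's).
[dev-dependencies.cargo-husky]
version = "1"
default-features = false
features = ["precommit-hook", "run-cargo-fmt", "run-cargo-test", "run-cargo-clippy", "run-cargo-check"]
```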
This project has a comprehensive test suite. Here's how to run and work with the tests:
To run all tests in the project:

```sh
cargo test
```

To run tests with output (including `println!` statements):

```sh
cargo test -- --nocapture
```

To run a specific test:

```sh
cargo test test_name
```

To run tests in a specific module:

```sh
cargo test module_name
```

To check test coverage, you can use tools like `cargo-tarpaulin`:

```sh
# Install tarpaulin
cargo install cargo-tarpaulin

# Generate coverage report
cargo tarpaulin --out Html
```
For testing without a real AI provider, the project includes a `MockProvider` implementation:

```rust
use crate::ai::mock::MockProvider;

#[test]
fn test_with_mock() {
    // Create a mock provider with a predefined response
    let mock_provider = MockProvider::new("feat(test): add new feature");

    // Use the mock provider
    // ...

    // Check what prompts were sent to the mock
    let calls = mock_provider.get_calls();
    assert_eq!(calls[0], "expected prompt");
}
```
When adding new features, please follow these guidelines for tests:
- Unit Tests: Place them in the same file as the code they test, within a `mod tests` block
- Mock External Services: Always use mocks for external services like API calls
- Test Edge Cases: Include tests for error conditions and edge cases
- Test Public API: Ensure all public functions and methods have tests
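The `mod tests` convention above looks like this in practice. This is a generic sketch; the function and test names are illustrative, not taken from this codebase:

```rust
// Illustrative only: a function and its unit test living in the same
// file, inside a #[cfg(test)] mod tests block as described above.
fn conventional_header(kind: &str, scope: &str, subject: &str) -> String {
    format!("{kind}({scope}): {subject}")
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn builds_conventional_header() {
        assert_eq!(
            conventional_header("feat", "cli", "add verbose flag"),
            "feat(cli): add verbose flag"
        );
    }
}
```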
Contributions are welcome! Please feel free to submit a Pull Request.
When contributing, please:
- Add tests for any new features
- Ensure all tests pass with `cargo test`
- Run `cargo fmt` for consistent code formatting
- Run `cargo clippy` to catch common issues