Full documentation available here — Installation guides, configuration options, AI agent integration, and more.
A privacy-first, CLI-native way to semantically search your codebase.
Search code by what it does, not just what it's called. grepai indexes the meaning of your code using vector embeddings, enabling natural language queries that find conceptually related code—even when naming conventions vary.
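To illustrate the idea (this is a toy sketch, not grepai's actual implementation), semantic search embeds the query and every indexed chunk as vectors, then ranks chunks by cosine similarity, so conceptually related code surfaces even with zero keyword overlap. The vectors below are made-up values standing in for real embedding-model output:

```python
import math

def cosine(a, b):
    # Cosine similarity: dot product divided by the product of norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy index: a real embedding model maps each code chunk to a vector.
index = {
    "verify_password()": [0.9, 0.2, 0.1],
    "render_sidebar()": [0.1, 0.2, 0.9],
}
query = [0.8, 0.3, 0.1]  # pretend embedding of "user authentication flow"

best = max(index, key=lambda name: cosine(query, index[name]))
print(best)  # "verify_password()" wins despite sharing no keywords with the query
```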
grep was built in 1973 for exact text matching. Modern codebases need semantic understanding.
| | grep / ripgrep | grepai |
|---|---|---|
| Search type | Exact text / regex | Semantic understanding |
| Query | `"func.*Login"` | `"user authentication flow"` |
| Finds | Exact pattern matches | Conceptually related code |
| AI Agent context | Requires many searches | Fewer, more relevant results |
grepai is designed to provide high-quality context to AI coding assistants. By returning semantically relevant code chunks, your agents spend less time searching and more time coding.
```bash
curl -sSL https://raw.githubusercontent.com/yoanbernabeu/grepai/main/install.sh | sh
```

Or download from Releases.
```bash
grepai init                     # Initialize in your project
grepai watch                    # Start background indexing daemon
grepai search "error handling"  # Search semantically
grepai trace callers "Login"    # Find who calls a function
```

| Command | Description |
|---|---|
| `grepai init` | Initialize grepai in current directory |
| `grepai watch` | Start real-time file watcher daemon |
| `grepai search <query>` | Search codebase with natural language |
| `grepai trace <cmd>` | Analyze call graph (callers/callees) |
| `grepai status` | Browse index state interactively |
| `grepai agent-setup` | Configure AI agent integration |
| `grepai update` | Update grepai to the latest version |
```bash
grepai search "authentication" -n 5       # Limit results (default: 10)
grepai search "authentication" --json     # JSON output for AI agents
grepai search "authentication" --json -c  # Compact JSON (~80% fewer tokens)
```

Run the watcher as a background process:
```bash
grepai watch --background  # Start in background
grepai watch --status      # Check if running
grepai watch --stop        # Stop gracefully
```

Logs are stored in OS-specific directories:
| Platform | Log Directory |
|---|---|
| Linux | ~/.local/state/grepai/logs/ |
| macOS | ~/Library/Logs/grepai/ |
| Windows | %LOCALAPPDATA%\grepai\logs\ |
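A per-platform lookup like the one in the table can be sketched as follows (illustrative Python only, not grepai's actual code, which is written in Go):

```python
import os
import sys

def default_log_dir(app="grepai"):
    # Mirrors the table above: Linux, macOS, and Windows defaults.
    if sys.platform == "darwin":
        return os.path.expanduser(f"~/Library/Logs/{app}/")
    if sys.platform == "win32":
        return os.path.join(os.environ.get("LOCALAPPDATA", ""), app, "logs")
    return os.path.expanduser(f"~/.local/state/{app}/logs/")

print(default_log_dir())
```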
Use --log-dir /custom/path to override (must be passed to all commands):
```bash
grepai watch --background --log-dir /custom/path  # Start in background
grepai watch --status --log-dir /custom/path      # Check if running
grepai watch --stop --log-dir /custom/path        # Stop gracefully
```

Keep grepai up to date:
```bash
grepai update --check  # Check for available updates
grepai update          # Download and install latest version
grepai update --force  # Force update even if already on latest
```

The update command:
- Fetches the latest release from GitHub
- Verifies checksum integrity
- Replaces the binary automatically
- Works on all supported platforms (Linux, macOS, Windows)
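The checksum step can be sketched like this; assume the release publishes a SHA-256 digest (illustrative Python, not grepai's actual Go implementation):

```python
import hashlib

def sha256_of(path):
    # Stream the file in blocks so large binaries aren't loaded into memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(8192), b""):
            h.update(block)
    return h.hexdigest()

# An updater would compare this digest against the checksum published
# alongside the release, and abort the install on any mismatch.
```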
Find function relationships in your codebase:
```bash
grepai trace callers "Login"                 # Who calls Login?
grepai trace callees "HandleRequest"         # What does HandleRequest call?
grepai trace graph "ProcessOrder" --depth 3  # Full call graph
```

Output as JSON for AI agents:
```bash
grepai trace callers "Login" --json
```

grepai integrates natively with popular AI coding assistants. Run `grepai agent-setup` to auto-configure.
| Agent | Configuration File |
|---|---|
| Cursor | .cursorrules |
| Windsurf | .windsurfrules |
| Claude Code | CLAUDE.md / .claude/settings.md |
| Gemini CLI | GEMINI.md |
| OpenAI Codex | AGENTS.md |
grepai can run as an MCP (Model Context Protocol) server, making it available as a native tool for AI agents:
```bash
grepai mcp-serve  # Start MCP server (stdio transport)
```

Configure in your AI tool's MCP settings:

```json
{
  "mcpServers": {
    "grepai": {
      "command": "grepai",
      "args": ["mcp-serve"]
    }
  }
}
```

Available MCP tools:
- `grepai_search` — Semantic code search
- `grepai_trace_callers` — Find function callers
- `grepai_trace_callees` — Find function callees
- `grepai_trace_graph` — Build call graph
- `grepai_index_status` — Check index health
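For orientation, an MCP client invokes one of these tools by sending a JSON-RPC `tools/call` request over stdio. The sketch below is a hypothetical request; the argument name `query` is an assumption, so check the tool schema the server actually advertises:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "grepai_search",
    "arguments": { "query": "error handling" }
  }
}
```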
For enhanced exploration capabilities in Claude Code, create a specialized subagent:
```bash
grepai agent-setup --with-subagent
```

This creates `.claude/agents/deep-explore.md` with:

- Semantic search via `grepai search`
- Call graph tracing via `grepai trace`
- Workflow guidance for code exploration
Claude Code automatically uses this agent for deep codebase exploration tasks.
Stored in .grepai/config.yaml:
```yaml
embedder:
  provider: ollama                  # ollama | lmstudio | openai
  model: nomic-embed-text
  endpoint: http://localhost:11434  # Custom endpoint (for Azure OpenAI, etc.)
  dimensions: 768                   # Vector dimensions (depends on model)
store:
  backend: gob                      # gob | postgres
chunking:
  size: 512
  overlap: 50
search:
  boost:
    enabled: true                   # Structural boosting for better relevance
trace:
  mode: fast                        # fast (regex) | precise (tree-sitter)
external_gitignore: ""              # Path to external gitignore (e.g., ~/.config/git/ignore)
```

Note: Old configs without `endpoint` or `dimensions` are automatically updated with sensible defaults.
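The `chunking` settings split each file into overlapping windows, so a definition that straddles a boundary still appears whole in at least one chunk. A minimal sketch of the idea (character-based here; grepai's actual chunker may split on tokens or syntax):

```python
def chunk(text, size=512, overlap=50):
    # Consecutive chunks share `overlap` characters of context.
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

parts = chunk("a" * 1000)
print(len(parts), [len(p) for p in parts])  # 3 chunks covering all 1000 chars
```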
grepai automatically adjusts search scores based on file paths. Patterns are language-agnostic:
| Category | Patterns | Factor |
|---|---|---|
| Tests | `/tests/`, `/test/`, `__tests__`, `_test.`, `.test.`, `.spec.` | ×0.5 |
| Mocks | `/mocks/`, `/mock/`, `.mock.` | ×0.4 |
| Fixtures | `/fixtures/`, `/testdata/` | ×0.4 |
| Generated | `/generated/`, `.generated.`, `.gen.` | ×0.4 |
| Docs | `.md`, `/docs/` | ×0.6 |
| Source | `/src/`, `/lib/`, `/app/` | ×1.1 |
Customize or disable in .grepai/config.yaml. See documentation for details.
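The boosting amounts to scaling a chunk's raw similarity score by the factor of a matching path pattern. A minimal sketch with only a few of the patterns from the table (grepai's real matcher may order or combine factors differently):

```python
# A few of the path patterns from the table above (not the full set).
BOOSTS = [
    ("/tests/", 0.5), ("_test.", 0.5), (".spec.", 0.5),
    ("/mocks/", 0.4), ("/fixtures/", 0.4),
    ("/src/", 1.1), ("/lib/", 1.1),
]

def boosted(path, score):
    # Scale the raw similarity score by the first matching pattern's factor.
    for pattern, factor in BOOSTS:
        if pattern in path:
            return score * factor
    return score

print(boosted("pkg/auth/login_test.go", 0.9))  # down-weighted: it's a test file
```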
Enable hybrid search to combine vector similarity with text matching:
```yaml
search:
  hybrid:
    enabled: true
    k: 60
```

Uses Reciprocal Rank Fusion to merge results. Useful when queries contain exact identifiers.
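Reciprocal Rank Fusion scores each result as the sum of 1/(k + rank) over the vector and text rankings, so items ranked high in either list surface near the top of the merged list. A minimal sketch:

```python
def rrf(rankings, k=60):
    """Reciprocal Rank Fusion: score(doc) = sum of 1/(k + rank) over rankings."""
    scores = {}
    for ranked in rankings:
        for rank, doc in enumerate(ranked, start=1):
            scores[doc] = scores.get(doc, 0.0) + 1.0 / (k + rank)
    # Highest fused score first
    return sorted(scores, key=scores.get, reverse=True)

vector_hits = ["auth.go", "login.go", "session.go"]  # semantic ranking
text_hits = ["login.go", "token.go"]                 # exact-text ranking
fused = rrf([vector_hits, text_hits])
print(fused[0])  # "login.go" ranks first: it appears in both lists
```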
Ollama (Default) — Privacy-first, runs locally:
```bash
ollama pull nomic-embed-text
```

LM Studio — Local, OpenAI-compatible API:
```bash
# Start LM Studio and load an embedding model
# Default endpoint: http://127.0.0.1:1234
```

OpenAI — Cloud-based:
```bash
export OPENAI_API_KEY=sk-...
```

- GOB (Default): File-based, zero config
- PostgreSQL + pgvector: For large monorepos
- Ollama, LM Studio, or OpenAI API key (for embeddings)
- Go 1.22+ (only for building from source)
See CONTRIBUTING.md for guidelines.
MIT License - Yoan Bernabeu 2026