chamanbravo/fid

Fid


AI for the command line, built for pipelines.

Fid brings Large Language Models (LLMs) directly to your terminal. It can take command output, process it with AI, and return results in formats like Markdown, JSON, or plain text. Think of it as a lightweight way to make your command-line workflows smarter with just a touch of AI.

Fid uses Google Gemini by default, and also supports OpenAI, Cohere, Groq, and Azure OpenAI.

Installation

uv pip install fid

What Can It Do?

Fid reads from standard input and pairs it with a prompt you provide as command-line arguments. Both are sent to an LLM, which generates an answer formatted in Markdown.

You can configure models in your settings file by running fid --settings.

Examples

1. Summarize logs

cat /var/log/syslog | fid summarize the key errors

2. Shell commands

fid explain "ls -lh /etc"

3. Get one-liner shell fixes (using a custom role)

fid --role shell find all .py files modified in the last 24 hours

Usage

  • -m, --model: Specify the Large Language Model to use
  • --role: Specify the role to use (See custom roles)
  • --list-roles: List the roles defined in configuration
  • --settings: Open settings
  • --dirs: Print directories where config is stored
  • --reset-settings: Restore settings to default
  • --version: Show version
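The flags compose with piped input. A sketch combining a model override with a custom role; the command being piped and the prompt are illustrative, and the role assumes the shell role defined under Custom Roles below:

```shell
# Pipe build output to fid, explicitly selecting the default Gemini model
# and applying the "shell" custom role (prompt wording is illustrative).
make 2>&1 | fid -m gemini-2.0-flash --role shell fix this build error
```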

Custom Roles

Roles allow you to set system prompts. Here is an example of a shell role:

roles:
  shell:
    - you are a shell expert
    - you do not explain anything
    - you simply output one liners to solve the problems you're asked
    - you do not provide any explanation whatsoever, ONLY the command

Then, use the custom role in fid:

fid --role shell list files in the current directory

Setup

Gemini

Fid uses gemini-2.0-flash by default.

Set the GOOGLE_API_KEY environment variable. If you don't have one yet, you can get it from Google AI Studio.
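For example (the key value below is a placeholder; add the line to your shell profile to persist it across sessions):

```shell
# Expose the Gemini API key to fid via the environment (placeholder value).
export GOOGLE_API_KEY="your-api-key"
```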

OpenAI

Set the OPENAI_API_KEY environment variable. If you don't have one yet, you can grab one from the OpenAI website.

Alternatively, set the AZURE_OPENAI_KEY environment variable to use Azure OpenAI. Grab a key from Azure.

Cohere

Cohere provides enterprise-optimized models.

Set the COHERE_API_KEY environment variable. If you don't have one yet, you can get it from the Cohere dashboard.

Groq

Groq provides models powered by their LPU inference engine.

Set the GROQ_API_KEY environment variable. If you don't have one yet, you can get it from the Groq console.
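With the key set, a Groq-hosted model can be selected with -m. The model id below is an assumption; check Groq's model list for current ids:

```shell
# Route the request to a Groq model instead of the default Gemini one
# (model id is an assumption).
dmesg | tail -n 50 | fid -m llama-3.1-8b-instant explain these kernel messages
```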

License

MIT
