A command-line interface for interacting with Ollama AI models.
- Store conversations as files
- Add context to a session with the `-f/--file` flag
- Use commands to modify and customize the current session
- Prompts are built in the following order:
  - System prompt
  - Context file
  - History file
  - Current user prompt
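The assembly order above can be sketched as follows (a minimal illustration with hypothetical function and parameter names, not sllama's actual implementation):

```rust
/// Build the full prompt sent to the Ollama model.
/// Sections are concatenated in the documented order; empty
/// sections (e.g. no context file) are simply skipped.
fn build_prompt(
    system_prompt: &str,
    context_file: Option<&str>,
    history: &str,
    user_prompt: &str,
) -> String {
    let mut parts: Vec<&str> = vec![system_prompt];
    if let Some(context) = context_file {
        parts.push(context);
    }
    if !history.is_empty() {
        parts.push(history);
    }
    parts.push(user_prompt);
    parts.join("\n\n")
}

fn main() {
    let prompt = build_prompt("You are an AI assistant.", None, "", "Hello!");
    println!("{prompt}");
}
```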
## Installation

```shell
git clone https://github.com/mituuz/silent-llama.git
cd silent-llama
cargo build --release
```

## Usage

```shell
sllama <HISTORY_FILE> [OPTIONS]
```

### Arguments

- `<HISTORY_FILE>` - Path to the file that acts as chat history (will be created if it doesn't exist)
  - If a relative path is provided, it will be created inside the `sllama_dir` directory
  - If an absolute path is provided, it will be used as-is, regardless of `sllama_dir`

### Options

- `-f, --file <INPUT_FILE>` - Optional file used as context for each chat message
- `-h, --help` - Print help
- `-v, --version` - Print version
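The relative/absolute path handling described above can be sketched with the standard library (hypothetical helper name; the real code may differ):

```rust
use std::path::{Path, PathBuf};

/// Resolve the history file location: absolute paths are used
/// as-is, relative paths are placed inside `sllama_dir`.
fn resolve_history_path(sllama_dir: &Path, history_file: &str) -> PathBuf {
    let p = Path::new(history_file);
    if p.is_absolute() {
        p.to_path_buf()
    } else {
        sllama_dir.join(p)
    }
}

fn main() {
    let dir = Path::new("/home/user/sllama");
    println!("{}", resolve_history_path(dir, "chat.txt").display());
    println!("{}", resolve_history_path(dir, "/tmp/chat.txt").display());
}
```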
### Examples

```shell
# Start a new conversation, saving history to chat.txt
sllama chat.txt

# Continue a conversation with additional context from code.rs
sllama chat.txt -f code.rs
```

## Commands

Commands can be entered during a chat by prefixing them with `:`. Commands are case-insensitive.
List available commands.

```
:help
```

List all files in `sllama_dir`, optionally filtered by a string.

```
:list <filter>
```

Switch to a different history file. Supports either absolute or relative paths (relative to `sllama_dir`).

```
:switch relative/path
:switch /absolute/path
```

Open the current history file in the user's editor, using `$EDITOR` or `$VISUAL`, with a fallback of:
- windows - `notepad` (untested)
- other - `vi`

```
:edit
```

Exit the current chat.

```
:q
```

Update the system prompt for this session. Does not modify any configuration.

```
:sysprompt Enter the new system prompt here
```
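The command handling described above (a `:` prefix, case-insensitive names, optional arguments) can be sketched like this; the type and function names are illustrative, not sllama's actual API:

```rust
/// Minimal sketch of chat-input handling: lines starting with `:`
/// are treated as commands (case-insensitive); everything else is
/// a normal chat message.
#[derive(Debug, PartialEq)]
enum Input {
    Command { name: String, arg: String },
    Message(String),
}

fn parse_input(line: &str) -> Input {
    if let Some(rest) = line.strip_prefix(':') {
        // Split into the command name and the rest of the line.
        let mut parts = rest.splitn(2, char::is_whitespace);
        let name = parts.next().unwrap_or("").to_lowercase();
        let arg = parts.next().unwrap_or("").trim().to_string();
        Input::Command { name, arg }
    } else {
        Input::Message(line.to_string())
    }
}

fn main() {
    println!("{:?}", parse_input(":HELP"));
    println!("{:?}", parse_input(":switch relative/path"));
    println!("{:?}", parse_input("plain chat message"));
}
```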
## Configuration

You can configure sllama by creating and modifying a TOML configuration file located at `~/.sllama.toml` (or `%USERPROFILE%\.sllama.toml` on Windows).
Switch the rustyline input mode between Emacs and Vi.
Default: `Emacs`

Ollama model used.
Default: `gemma3:12b`

Path to the sllama directory. This will hold new history files by default.
Default: `~/sllama`

System prompt that configures the AI assistant's behavior.
Default:

> You are an AI assistant receiving input from a command-line application called silent-llama (sllama). The user may include additional context from another file. This supplementary content appears after the system prompt and before the history file content. Your responses are displayed in the terminal and saved to the history file. Keep your answers helpful, concise, and relevant to both the user's direct query and any file context provided. You can tell where you have previously responded by `--- AI Response ---` (added automatically).
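Putting the options above together, a configuration file might look like the sketch below. Note that the key names here are guesses for illustration only — check the project source for the actual names:

```toml
# ~/.sllama.toml -- hypothetical key names, values from the documented defaults
input_mode = "Emacs"      # rustyline input mode: "Emacs" or "Vi"
model = "gemma3:12b"      # Ollama model used
sllama_dir = "~/sllama"   # default location for new history files
system_prompt = "You are an AI assistant receiving input from a command-line application called silent-llama (sllama)."
```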
- Clarify how the prompt is formed
- Add a configuration file
- Integrate rustyline
- Implement completions with rustyline (commands and files)
- Support multiline input with shift + enter (using rustyline)
- Use the `ollama` server and API calls instead
- Allow changing the context file during a chat
- Add support for knowledge directory
- Re-implement AI response interruption
- Add functionality to truncate a chat
- Keep track of the model's context window and file size
- Create memories, which are included in the prompt by default (session/global)