
gossip ✦

a cute terminal CLI for chatting with LLMs. supports one-shot queries, interactive chat, streaming, file output, and named profiles for switching between API providers.

features

  • streaming — responses print token by token as they arrive
  • profiles — switch between providers and models with named configs
  • system prompts — persistent per-profile or one-off with --system
  • stdin piping — git diff | gossip ask "review this"
  • file input — gossip ask "explain this" --from main.rs
  • markdown rendering — headers, bold, italic, lists, and inline code styled in the terminal
  • syntax highlighting — fenced code blocks highlighted with language detection

install

cargo install --path .

quick start

# first run creates a config at ~/.config/gossip/config.toml
# edit it with your API key
gossip profile list

# one-shot question (streams to stdout)
gossip ask "what should I wear today?"

# save response to a file
gossip ask "write me a haiku about this season's runways" --out haiku.md

# interactive chat with full conversation history
gossip chat

chat

start an interactive session with full conversation history — the model remembers what you've said:

gossip chat
gossip chat --system "respond only in JSON"
gossip chat --profile local

type exit, quit, or hit ctrl+d to leave.

profiles

gossip supports multiple API profiles so you can switch between providers and models.

# list profiles (default is marked with ✦)
gossip profile list

# add a new profile interactively
gossip profile add

# change the default
gossip profile set-default local

# use a specific profile for one command
gossip ask "hello" --profile openai

config

stored at ~/.config/gossip/config.toml:

default_profile = "default"

[profiles.default]
api_key = "sk-ant-..."
base_url = "https://api.anthropic.com/v1"
model = "claude-sonnet-4-20250514"
system_prompt = "you are a fashion consultant"

[profiles.local]
api_key = "none"
base_url = "http://localhost:8080/v1"
model = "qwen3.5-35b-instruct-q5_k_m"

[profiles.openai]
api_key = "sk-..."
base_url = "https://api.openai.com/v1"
model = "gpt-5.4-2026-03-05"
system_prompt = "respond in JSON only"

any provider with an OpenAI-compatible /chat/completions endpoint works (Moonshot, local servers, etc.).
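as a sketch of what "OpenAI-compatible" means here: such endpoints accept a POST to {base_url}/chat/completions with a JSON body like the one below. field names follow the OpenAI Chat Completions API; the model and message values are illustrative, not gossip's actual request code:

```python
import json

# illustrative request body for an OpenAI-compatible endpoint;
# "stream": true asks the server to send tokens back incrementally
payload = {
    "model": "qwen3.5-35b-instruct-q5_k_m",
    "messages": [
        {"role": "system", "content": "be casual and fun"},
        {"role": "user", "content": "what should i wear today"},
    ],
    "stream": True,
}

body = json.dumps(payload)
```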

system prompts

set a persistent system prompt per profile in your config:

[profiles.bestie]
api_key = "sk-ant-..."
base_url = "https://api.anthropic.com/v1"
model = "claude-sonnet-4-20250514"
system_prompt = "be casual and fun"

or pass one on the fly with --system (overrides the profile's prompt):

gossip ask "what should i wear today" --system "you are a fashion consultant"
gossip chat --system "respond only in french"

piping & file input

gossip reads from stdin, so you can pipe anything into it:

# pipe a file
cat error.log | gossip ask "what went wrong"

# pipe command output
git diff | gossip ask "review these changes"

# pipe with no question — stdin becomes the message
echo "hello" | gossip ask

use --from to read from a file directly:

# send a file as the message
gossip ask --from error.log

# ask a question about a file
gossip ask "explain this code" --from src/main.rs

# combine with system prompt
gossip ask --from data.csv --system "analyse this data and summarise the trends"

you can combine all three — question, --from, and stdin are joined together:

cat logs.txt | gossip ask "compare these" --from previous-logs.txt
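exactly how the three parts combine isn't specified beyond "joined together"; a plausible sketch of that behavior (the ordering and separator here are assumptions, not gossip's actual source):

```python
def build_message(question=None, file_text=None, stdin_text=None):
    # assumed behavior: concatenate whichever inputs are present,
    # question first, then --from file contents, then piped stdin
    parts = [p for p in (question, file_text, stdin_text) if p]
    return "\n\n".join(parts)

msg = build_message("compare these", "old logs", "new logs")
```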

output

responses stream to stdout token by token. use --out to save to a file instead:

gossip ask "write a blog post about the balenciaga city bag" --out post.md

markdown in responses is rendered with terminal styling — headers, bold, italic, inline code, bullet lists, and fenced code blocks with syntax highlighting.

reference

ask

  • gossip ask "question" — one-shot query, streams to stdout
  • gossip ask "question" --out file.md — save response to file
  • gossip ask --from file.md — send file contents as the message
  • gossip ask "explain" --from file — question + file contents
  • cat file | gossip ask "question" — pipe stdin + question
  • gossip ask "q" --system "prompt" — with a system prompt

chat

  • gossip chat — interactive multi-turn chat
  • gossip chat --system "prompt" — chat with a system prompt

profiles

  • gossip profile list — list all saved profiles
  • gossip profile add — create a new profile
  • gossip profile set-default <name> — change the default profile

global flags

  • --profile <name> — use a specific profile for this command
  • --version — print version

license

MIT
