A tiny command-line AI agent that calls tools using Zod schemas, persists conversation history in a JSON file, and renders a friendly CLI with spinners and colored logs. It uses the OpenAI Node SDK against a GitHub Models-compatible endpoint by default.
- Executes an agent loop that decides between replying and calling one or more tools.
- Defines tools with Zod schemas and executes them via a simple router.
- Persists all messages to `db.json` so the agent has short-term memory.
- Shows progress with a spinner and clearly logs user/assistant turns.
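The loop can be sketched roughly as follows. The helper names (`callModel`, `runTool`) and the simplified message shape are illustrative assumptions, not necessarily the ones used in `src/agent.ts`:

```ts
// Hypothetical sketch of the agent loop: keep calling the model until it
// answers with plain content instead of tool calls.
type ToolCall = { id: string; name: string; args: unknown };
type Message = { role: string; content?: string; toolCalls?: ToolCall[] };

export async function runAgent(
  callModel: (history: Message[]) => Promise<Message>,
  runTool: (call: ToolCall) => Promise<string>,
  history: Message[],
): Promise<string> {
  while (true) {
    const response = await callModel(history);
    history.push(response);
    // Terminate only when the assistant answers with content and no tool calls.
    if (!response.toolCalls?.length) return response.content ?? '';
    // Otherwise run every requested tool and append each result to history.
    for (const call of response.toolCalls) {
      history.push({ role: 'tool', content: await runTool(call) });
    }
  }
}
```

Note the ordering: tool calls are executed before the loop ever considers exiting, which is what keeps the spinner from hanging (see Troubleshooting below).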
Built-in tools:

- `generate_image`: Generates an image with DALL·E 3 and returns a URL.
- `dad_joke`: Fetches a random dad joke.
- `reddit`: Fetches the latest posts from r/aww as JSON.
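As an illustration, the joke tool amounts to a single HTTP call. The sketch below is hypothetical (the endpoint and return shape are assumptions; see `tools/` for the actual implementation):

```ts
// Hypothetical sketch of the dad_joke tool implementation; the real code
// in tools/ may use a different endpoint or return shape.
export async function getDadJoke(): Promise<string> {
  const res = await fetch('https://icanhazdadjoke.com/', {
    // Ask the API for JSON instead of its default HTML page.
    headers: { Accept: 'application/json' },
  });
  if (!res.ok) throw new Error(`dad joke request failed: ${res.status}`);
  const data = (await res.json()) as { joke: string };
  return data.joke;
}
```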
- Node.js 18+ (recommended). The project runs with `tsx` and the OpenAI SDK.
- An API token for the configured endpoint.
- Default base URL is `https://models.github.ai/inference` (GitHub Models). Use a GitHub token with Models access in `OPENAI_API_KEY`.
- To use OpenAI’s API instead, change `baseURL` in `src/ai.ts` to `https://api.openai.com/v1` and set `OPENAI_API_KEY` to your OpenAI key.
- Install dependencies: `npm install`
- Create a `.env` file with your API key: `echo "OPENAI_API_KEY=your_token_here" > .env`
You can pass your request as a single quoted argument:

```sh
npm start -- "hi, am abel. what is the weather in my current place"
```

Or directly with tsx:

```sh
npx tsx index.ts "tell me a dad joke"
```

Examples that trigger tools:
- Image generation: "generate an image of a corgi surfing at sunset"
- Dad joke: "tell me a dad joke"
- Reddit: "get the latest cute animal posts"
The agent stores all turns in `db.json`. Delete that file to reset context.
- `index.ts`: CLI entry that forwards the prompt to the agent.
- `src/agent.ts`: Agent loop, spinner handling, message logging, and termination logic.
- `src/llm.ts`: LLM call using the OpenAI SDK with Zod tool definitions.
- `src/toolRunner.ts`: Routes tool calls to concrete implementations.
- `src/memory.ts`: Persists and retrieves messages via LowDB (`db.json`).
- `tools/`: Tool definitions (Zod) and implementations.
- Create a file in `tools/` exporting a Zod-based definition and a function:

  ```ts
  export const myToolDefinition = { name: 'my_tool', parameters: z.object({...}) }

  export const myTool: ToolFn<Args, string | object> = async ({ userMessage, toolArgs }) => { ... }
  ```

- Add the definition to `tools/index.ts` so it’s exposed to the model.
- Add a `case` in `src/toolRunner.ts` to invoke your tool.
Types you’ll use are in `types.ts` (`ToolFn` and `AIMessage`).
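The router step might look like this sketch. It assumes the shape of `src/toolRunner.ts` (the function name and call type here are illustrative):

```ts
// Hypothetical sketch of the router in src/toolRunner.ts: dispatch a tool
// call to its implementation by name.
type Call = { name: string; args: Record<string, unknown> };

export async function runTool(call: Call, userMessage: string): Promise<string> {
  switch (call.name) {
    case 'dad_joke':
      return 'Why did the scarecrow win an award? He was outstanding in his field.';
    case 'my_tool':
      // A new tool gets its own case; echo the parsed args for illustration.
      return JSON.stringify({ userMessage, ...call.args });
    default:
      // This is the "Tool not found" error mentioned in Troubleshooting.
      throw new Error(`Tool not found: ${call.name}`);
  }
}
```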
- Spinner never stops or the agent doesn’t terminate:
  - The loop should execute `tool_calls` first and only exit when the model returns assistant `content` without tool calls. Check the termination order in `src/agent.ts`.
- Tool not found errors:
  - Ensure the tool definition is added to `tools/index.ts` and the router case exists in `src/toolRunner.ts`.
- Auth errors:
  - Verify `.env` has a valid `OPENAI_API_KEY` for the configured `baseURL`.
- Resetting state:
  - Delete `db.json` and re-run.
- The CLI uses `ora` for spinners and colorized logs in `src/ui.ts`.
- Memory uses `lowdb`; all messages (user/assistant/tool) are written to `db.json`.
- The default model is `gpt-4o-mini` (see `src/llm.ts`); you can change the model there.
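For reference, the persistence layer amounts to reading and appending a messages array in a JSON file. The real `src/memory.ts` uses LowDB; this dependency-free sketch with plain `fs` shows the same idea:

```ts
// Dependency-free sketch of src/memory.ts: the real code uses LowDB, but
// the idea is simply a JSON file holding a messages array.
import { existsSync, readFileSync, writeFileSync } from 'node:fs';

type StoredMessage = { role: string; content?: string };

// Read all stored messages, or an empty history if db.json doesn't exist yet.
export function getMessages(file = 'db.json'): StoredMessage[] {
  if (!existsSync(file)) return [];
  return JSON.parse(readFileSync(file, 'utf8')).messages ?? [];
}

// Append new messages and write the whole history back to disk.
export function addMessages(newMessages: StoredMessage[], file = 'db.json'): void {
  const messages = [...getMessages(file), ...newMessages];
  writeFileSync(file, JSON.stringify({ messages }, null, 2));
}
```

Deleting the file resets the context, which is why "delete `db.json` and re-run" works as a reset.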