PromptExecution agents now exchange lightweight JSON chat payloads over two transports:
- 🧵 Local IPC – Unix domain socket at `~/.b00t/chat.channel.socket`
- 📡 NATS stub – retains subject namespacing for future federation
```json
{
  "channel": "mission.delta",
  "sender": "frontend.agent",
  "body": "handoff complete",
  "metadata": {"ticket": "OPS-123"},
  "timestamp": "2025-03-04T05:30:00Z"
}
```

- `channel` keeps conversations scoped per mission/crew.
- `sender` is free-form but SHOULD stay unique inside a channel.
- `body` is plain text; additional structure belongs in `metadata`.
- `timestamp` MUST be RFC 3339; producers SHOULD use UTC.
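For reference, a Rust-side mirror of this payload could look like the sketch below. It assumes `serde`, `serde_json`, and `chrono` with its `serde` feature; the `ChatMessage` name and derives are illustrative, not necessarily the exact type inside `b00t_chat`.

```rust
use std::collections::HashMap;

use chrono::{DateTime, Utc};
use serde::{Deserialize, Serialize};

/// Illustrative mirror of the wire payload; field names match the JSON above.
#[derive(Debug, Serialize, Deserialize)]
pub struct ChatMessage {
    /// Scopes the conversation per mission/crew, e.g. "mission.delta".
    pub channel: String,
    /// Free-form, but SHOULD stay unique inside a channel.
    pub sender: String,
    /// Plain text; extra structure belongs in `metadata`.
    pub body: String,
    /// Arbitrary key/value context such as {"ticket": "OPS-123"}.
    #[serde(default)]
    pub metadata: HashMap<String, serde_json::Value>,
    /// RFC 3339 timestamp; serialized via chrono's serde support (UTC).
    pub timestamp: DateTime<Utc>,
}
```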
- `b00t-mcp` boots and calls `b00t_chat::spawn_local_server`.
- The server binds the socket, accepts JSON lines, and queues them inside `ChatInbox`.
- Before any command response is emitted, the MCP server drains the inbox and appends `<🥾>{ "chat": { "msgs": N }}</🥾>` to the outgoing payload (see the sketch after this list).
- Drained messages are logged (channel, sender, body) so operators can stitch context.
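A rough sketch of that drain-and-annotate step is shown below; the `ChatMessage` and `ChatInbox` definitions here are stand-ins, and the real types live in `b00t_chat` / `b00t-mcp`.

```rust
use std::collections::VecDeque;

// Stand-in types; the real definitions live in b00t_chat / b00t-mcp.
struct ChatMessage { channel: String, sender: String, body: String }
struct ChatInbox { queue: VecDeque<ChatMessage> }

impl ChatInbox {
    /// Remove and return everything queued so far.
    fn drain_all(&mut self) -> Vec<ChatMessage> {
        self.queue.drain(..).collect()
    }
}

/// Drain the inbox, log each message, and append the unread-count marker
/// to the outgoing command response.
fn annotate_response(inbox: &mut ChatInbox, mut response: String) -> String {
    let drained = inbox.drain_all();
    for msg in &drained {
        // Logged (channel, sender, body) so operators can stitch context.
        eprintln!("[chat] {} <{}> {}", msg.channel, msg.sender, msg.body);
    }
    response.push_str(&format!(
        "<🥾>{{ \"chat\": {{ \"msgs\": {} }}}}</🥾>",
        drained.len()
    ));
    response
}
```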
Consumers MUST write newline-delimited JSON to the socket. The helper client in `b00t-cli chat send` already handles serialization, flushing, and fallbacks.
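If you bypass the helper, the raw protocol is simple enough to speak directly. A minimal sketch, assuming `serde_json` and the default socket path shown above:

```rust
use std::io::Write;
use std::os::unix::net::UnixStream;

/// Write one message as a single JSON line to the chat socket.
fn send_raw(socket_path: &str, payload: &serde_json::Value) -> std::io::Result<()> {
    let mut stream = UnixStream::connect(socket_path)?;
    // Newline-delimited JSON: compact serialization, then a trailing '\n'.
    let mut line = serde_json::to_vec(payload)?;
    line.push(b'\n');
    stream.write_all(&line)?;
    stream.flush()
}
```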
Until credentials are provisioned, the NATS transport simply logs intent. The subject prefix matches historical ACP conventions: `chat.{channel}`. Swapping in `async-nats` will require exporting JWT-authenticated configuration from the MCP environment.
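Until then, the stub boils down to something like the following sketch; the `NatsStub` name and `publish` method are illustrative, not the crate's actual API.

```rust
/// Placeholder transport: records intent instead of publishing.
struct NatsStub;

impl NatsStub {
    /// Would publish on `chat.{channel}`; for now it only logs the intent.
    fn publish(&self, channel: &str, body: &str) {
        let subject = format!("chat.{channel}");
        eprintln!("[nats-stub] would publish to {subject}: {body}");
    }
}
```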
```bash
# Local sockets (default)
b00t-cli chat send --channel mission.delta --message "artifact staged"

# Explicit transport selection
b00t-cli chat send --transport nats --message "deploying" \
  --metadata '{"env":"prod"}'

# Discover socket path
b00t-cli chat info
```

Slash messages prefixed with `/b00t` can be translated into CLI invocations via `b00t_chat::parse_b00t_command`. The initial model-management verbs match the new datum workflow:
```
/b00t model list              # Inspect cached datums
/b00t model download llava    # Delegate to `b00t-cli model download llava`
/b00t model env deepseek      # Emit env exports for direnv blending
/b00t model remove llava      # Remove cached weights (`--yes` implied)
/b00t model serve llava       # Launch default vLLM container for LLaVA
/b00t model stop              # Stop the active model container
```
Chat front-ends can call `BootCommand::to_cli_args()` to hand the parsed action off to the existing `b00t-cli model …` commands.
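A hypothetical front-end relay might look like the sketch below; it assumes `parse_b00t_command` returns an `Option<BootCommand>` and `to_cli_args()` yields an argument vector, so check the `b00t_chat` docs for the exact signatures.

```rust
use std::process::Command;

/// Translate a `/b00t …` chat message into a `b00t-cli` invocation.
fn relay_slash_command(body: &str) -> std::io::Result<()> {
    // Assumed signature: parse_b00t_command(&str) -> Option<BootCommand>.
    if let Some(cmd) = b00t_chat::parse_b00t_command(body) {
        let args = cmd.to_cli_args(); // assumed to yield Vec<String>
        let status = Command::new("b00t-cli").args(&args).status()?;
        if !status.success() {
            eprintln!("b00t-cli exited with {status}");
        }
    }
    Ok(())
}
```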
- Rename workspace crate to `b00t-chat`
- Replace `acp` CLI command tree with `chat`
- Start chat listener inside `b00t-mcp` and surface unread counts
- Update Docker scaffolding and docs to reflect the new library
Future work SHOULD cover:
- Wiring a broadcast-friendly listener so dedicated viewers can tail messages.
- Completing the NATS transport with authentication + reconnect handling.
- Extending metadata conventions (e.g. thread IDs, attachments) once the socket protocol stabilizes.