- **Tool type:** Open-source self-hosted AI agent (CLI / Node.js service)
- **Also known as:** ClawdBot, Moltbot
- **GitHub:** github.com/clawdbot
- **Transport support:** stdio, SSE, Streamable HTTP (via MCPorter)
OpenClaw (formerly ClawdBot, formerly Moltbot) is an open-source continuous AI agent runtime that runs as a persistent Node.js service on your local machine. Unlike chatbots that respond to one-off prompts, OpenClaw:
- Runs 24/7 in the background, executing tasks proactively via cron jobs and event listeners
- Connects to messaging platforms (WhatsApp, Telegram, Discord, Slack, iMessage) as interfaces
- Executes MCP tools via MCPorter — its built-in MCP management layer
- Maintains persistent memory and context across sessions
With Pieces MCP connected, OpenClaw gains access to your Long-Term Memory. It can autonomously query your past work, generate standups, monitor recent activity, and surface relevant context without you asking.
- Node.js 18+ installed
- OpenClaw cloned and configured: follow the OpenClaw setup guide
- PiecesOS running locally (port 39300-39333)
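Because PiecesOS listens on one port in the 39300–39333 range, it can help to probe each candidate endpoint until one answers. A minimal sketch, assuming Node.js 18+ (for the global `fetch`); the helper names and the HEAD-probe approach are illustrative and not part of OpenClaw or PiecesOS:

```typescript
// Build candidate PiecesOS SSE endpoints for the documented port range.
// The path segment matches the MCP spec revision used by PiecesOS.
function candidateSseUrls(firstPort = 39300, lastPort = 39333): string[] {
  const urls: string[] = [];
  for (let port = firstPort; port <= lastPort; port++) {
    urls.push(`http://localhost:${port}/model_context_protocol/2024-11-05/sse`);
  }
  return urls;
}

// Probe each candidate until one answers (Node 18+ has a global fetch).
// Hypothetical helper; use whatever discovery mechanism your setup provides.
async function findPiecesOs(): Promise<string | undefined> {
  for (const url of candidateSseUrls()) {
    try {
      const res = await fetch(url, { method: "HEAD" });
      if (res.ok) return url;
    } catch {
      // Port not listening; try the next one.
    }
  }
  return undefined;
}
```

Whichever port responds is the one to use in the MCPorter configuration below.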
Edit `~/.openclaw/workspace/config/mcporter.json`:
Recommended when OpenClaw and PiecesOS are running on the same machine: the localhost URL is fastest and needs no extra setup. This option uses `mcp-remote` as a stdio bridge; see the mcp-remote guide or the stdio-to-HTTP Bridges guide for documentation.
```json
{
  "mcpServers": {
    "pieces": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "http://localhost:39300/model_context_protocol/2024-11-05/sse"
      ]
    }
  }
}
```

Alternatively, since MCPorter supports SSE natively, you can point OpenClaw at the endpoint directly, without the bridge:

```json
{
  "mcpServers": {
    "pieces": {
      "type": "sse",
      "url": "http://localhost:39300/model_context_protocol/2024-11-05/sse"
    }
  }
}
```

Use the following configuration when OpenClaw needs to reach PiecesOS on a different machine (e.g., your main dev machine while OpenClaw runs on a server). For ngrok setup, see: Connecting to PiecesOS from the Outside World via Ngrok.
```json
{
  "mcpServers": {
    "pieces": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "https://YOUR_NGROK_URL.ngrok.app/model_context_protocol/2024-11-05/sse"
      ]
    }
  }
}
```

Once Pieces MCP is connected to OpenClaw, you can automate workflows like:
Autonomous daily standup:
Schedule OpenClaw to run every morning, query yesterday's workstream summaries via `material_identifiers` + `workstream_summaries_batch_snapshot`, and post a formatted standup to your Slack or Teams channel.
Meeting prep:
Before a calendar event, OpenClaw searches audio transcriptions and workstream summaries for context related to the meeting topic and drafts a brief for you.
Automated debugging log:
When OpenClaw detects a production alert, it runs a recent `workstream_events_full_text_search` for error-related clipboard content, screenshots, or transcriptions, and creates a `pieces_memory` entry with the incident context.
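As a concrete sketch of the standup workflow above, here is how a formatting step might look once the summaries have been fetched. The summary object shape is an assumption for illustration, not the actual `workstream_summaries_batch_snapshot` response schema:

```typescript
// Assumed, simplified shape for a fetched summary; the real
// workstream_summaries_batch_snapshot payload may differ.
interface WorkstreamSummary {
  title: string;
  highlights: string[];
}

// Turn yesterday's summaries into a Slack/Teams-friendly standup message.
function formatStandup(date: string, summaries: WorkstreamSummary[]): string {
  const lines = [`*Standup for ${date}*`];
  for (const s of summaries) {
    lines.push(`• ${s.title}`);
    for (const h of s.highlights) {
      lines.push(`    - ${h}`);
    }
  }
  return lines.join("\n");
}
```

In the scheduled job, OpenClaw would call the Pieces tools first and hand the result to a step like this before posting to the channel.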
- Start OpenClaw
- Ask via your connected messaging platform: "What Pieces tools do you have?"
- Pieces LTM tools should be listed
- Try: "What did I work on yesterday?" — OpenClaw will call `ask_pieces_ltm`
OpenClaw can run with `permissionMode: 'bypassPermissions'` to execute tools autonomously. When combined with Pieces MCP write tools (like `create_pieces_memory`), this is powerful but should be used carefully. Consider:
- Running OpenClaw in Docker with limited filesystem access
- Disabling write tools in MCPorter if running fully autonomously
- Monitoring execution logs
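One way to act on the "disable write tools" advice is to filter the tool list before it is exposed to the agent. A sketch under stated assumptions: `create_pieces_memory` is the write tool named above, but the prefix patterns and the filtering hook itself are hypothetical (MCPorter may provide its own allow/deny configuration):

```typescript
// Name patterns assumed to indicate state-mutating tools.
const WRITE_TOOL_PATTERNS = [/^create_/, /^update_/, /^delete_/];

function isWriteTool(name: string): boolean {
  return WRITE_TOOL_PATTERNS.some((p) => p.test(name));
}

// Keep only read-only tools when running fully autonomously.
function readOnlyTools(toolNames: string[]): string[] {
  return toolNames.filter((name) => !isWriteTool(name));
}
```

Applying this before registering tools leaves queries like `ask_pieces_ltm` available while blocking writes such as `create_pieces_memory`.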
Edit `~/.openclaw/workspace/config/mcporter.json`, update the URL, and restart OpenClaw.
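If you prefer to script that change (for example, each time ngrok issues a new URL), a small Node helper can rewrite the entry in place. The file layout matches the configurations shown earlier, but the helper itself is not part of OpenClaw or MCPorter:

```typescript
import { readFileSync, writeFileSync } from "node:fs";

// Point the "pieces" server at a new SSE URL in mcporter.json.
// Handles both the bridged config (URL is the last mcp-remote arg)
// and the direct SSE config (URL is in the "url" field).
function updatePiecesUrl(configPath: string, newUrl: string): void {
  const config = JSON.parse(readFileSync(configPath, "utf8"));
  const pieces = config.mcpServers?.pieces;
  if (!pieces) throw new Error(`no "pieces" server in ${configPath}`);
  if (Array.isArray(pieces.args)) {
    pieces.args[pieces.args.length - 1] = newUrl; // bridged via mcp-remote
  } else {
    pieces.url = newUrl; // direct SSE entry
  }
  writeFileSync(configPath, JSON.stringify(config, null, 2) + "\n");
}
```

Run it against `~/.openclaw/workspace/config/mcporter.json`, then restart OpenClaw as usual.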
| Issue | Solution |
|---|---|
| MCPorter config not found | Create the `~/.openclaw/workspace/config/` directory manually |
| Bridge process not starting | Install Node.js; verify npx is in PATH |
| Tools not available | Restart OpenClaw after editing MCPorter config |
| ngrok URL expired | Restart the ngrok tunnel and update the URL in MCPorter config |
- Pieces MCP and LTM Tools Reference — Complete reference for all 39 tools available to your agents
- Connecting to PiecesOS via Ngrok — Expose your local Pieces server for remote access