Document Tilebox Docs MCP server #47

Merged: 1 commit merged on May 7, 2025
ai-assistance.mdx: 33 changes (10 additions, 23 deletions)
@@ -4,7 +4,7 @@ description: Large Language Models (LLMs) are powerful tools for exploring and l
 icon: brain-circuit
 ---
 
-## Providing Tilebox-specific context
+## Providing complete Tilebox context
 
 AI assistants and Large Language Models (LLMs) can answer questions, guide you in using Tilebox, suggest relevant sections in the documentation, and assist you in creating workflows.
 
@@ -18,31 +18,18 @@ The full content of the Tilebox documentation is available in plain markdown for
 The [Documentation Context](https://docs.tilebox.com/llms-full.txt) is updated whenever the documentation changes. If you download the file, refresh it occasionally to stay up-to-date.
 </Tip>
 
-## Example prompt
+## Providing tailored context via MCP
 
-After you upload the [Documentation Context](https://docs.tilebox.com/llms-full.txt) to your AI assistant or LLM, you can ask it questions and receive tailored, up-to-date responses.
+Tilebox Docs can be installed as an MCP tool, so an MCP client can ask for detailed context on specific topics.
 
-Here's an example prompt to get you started.
+Run the following command to generate an MCP server for Tilebox Docs.
 
-```plaintext Example prompt
-Generate a script that
-
-- Creates a cluster
-- Configures console logging
-- Contains a Fibonacci calculator workflow that is using a local filesystem cache.
-Make sure you get task dependencies right.
-Make sure to only print the final result as the last step of the workflow.
-Write logs to let us know what is happening.
-- Submits a job for the number 7
-- Starts a processing runner
-
-Do not invent APIs. All available Tilebox APIs are documented.
+```bash
+npx mint-mcp add tilebox
 ```
 
-## Claude
-
-[Claude 3.5 Sonnet](https://docs.anthropic.com/en/docs/about-claude/models) is a great choice for an AI assistant for using Tilebox. To provide Claude with Tilebox-specific context, create a [new project](https://support.anthropic.com/en/articles/9517075-what-are-projects), and upload the [llms-full.txt](https://docs.tilebox.com/llms-full.txt) as project knowledge. You can then ask questions or use it to generate scripts.
+The command line tool will guide you along the installation process for Cursor, Windsurf, Claude Code, Augment Code or other MCP clients.
 
-## ChatGPT
-
-Experiments with [GPT-4o](https://chatgpt.com/?model=gpt-4o) have shown mixed results. While it effectively answers questions, it has difficulties generating workflow scripts, often misapplying or inventing APIs. Consider using [Claude 3.5 Sonnet](#claude) for a better experience.
+<Tip>
+The MCP server always retrieves the most up to date version of the documentation.
+</Tip>
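
For context on the change, the sketch below shows how a generic MCP client could connect to a stdio-based docs MCP server using the official TypeScript SDK (`@modelcontextprotocol/sdk`). It is an illustration only: the server launch command, the client name, and the tools the Tilebox docs server exposes are placeholders and assumptions, not taken from this PR or from mint-mcp.

```typescript
// Illustrative MCP client sketch (not part of the Tilebox docs or this PR).
// Assumes a docs MCP server that speaks stdio; the command/args below are
// placeholders for whatever `npx mint-mcp add tilebox` sets up on your machine.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the docs server as a child process and connect over stdio.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["<tilebox-docs-mcp-server>"], // placeholder launch command
  });

  const client = new Client(
    { name: "docs-context-demo", version: "0.1.0" },
    { capabilities: {} }
  );
  await client.connect(transport);

  // List the tools the docs server exposes, e.g. a documentation search tool.
  const { tools } = await client.listTools();
  console.log(tools.map((tool) => tool.name));

  await client.close();
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```

In practice the IDE clients named in the diff (Cursor, Windsurf, Claude Code, Augment Code) manage this connection themselves; the sketch is only meant to make the protocol flow concrete.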