hpcGPT is a customized CLI built on top of the Opencode agent that integrates Model Context Protocol (MCP) servers for Slurm-based HPC environments, Illinois Chat documentation Q&A, reporting, and Atlassian.
```bash
curl -fsSL https://opencode.ai/install | bash
export OPENCODE_CONFIG=/absolute/path/to/this/repo/opencode.jsonc
opencode
```

Set environment variables as needed (see the Env section below), then pick a model and use tools from the TUI.
- Slurm integration (MCP): `sinfo`, `squeue`, `scontrol`, and `accounts` via `slurm-mcp-server`.
- Docs Q&A (MCP): Illinois Chat tools `delta-docs` and `delta-ai-docs` for Delta/Delta AI documentation.
- Atlassian integration (MCP): containerized Jira/Confluence tools with flexible auth modes.
- Support reporting: `report-server` sends concise support reports with context.
- Multiple providers: NCSA Hosted and NCSA Ollama providers selectable in `opencode.jsonc`.
- Config-driven: everything wired through `opencode.jsonc` for reproducibility.
```mermaid
graph TD
  U[User] -->|TUI| OC[Opencode Agent]
  OC --> P1[NCSA Hosted Provider]
  OC --> P2[NCSA Ollama Provider]
  subgraph MCP_Servers
    M1[slurm-mcp-server]
    M2[illinois-chat-server]
    M3[report-server]
    M4[atlassian-mcp-server container]
  end
  OC -. tools .-> M1
  OC -. tools .-> M2
  OC -. tools .-> M3
  OC -. tools .-> M4
  M1 --> SLURM[Slurm CLI]
  M2 --> ICHAT[Illinois Chat API]
  M4 --> JIRA[Jira]
  M4 --> CONF[Confluence]
  M3 --> SUPPORT[Delta Support]
```
- Opencode reads `opencode.jsonc` for providers, models, and MCP servers.
- MCP servers expose tools over stdio; the agent calls them when the model chooses a tool.
- `slurm-mcp-server` shells out to local Slurm commands.
- `illinois-chat-server` calls the Illinois Chat API to answer questions from Delta/Delta AI docs.
- `atlassian-mcp-server` runs via Apptainer and exposes Jira/Confluence tools.
- `report-server` can send a compact support report with session context.
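The tool traffic over stdio is ordinary JSON-RPC. The sketch below is hand-rolled for illustration only (the servers in this repo use an SDK), with a hypothetical `squeue` dispatch that echoes the command instead of shelling out to Slurm:

```typescript
// Minimal illustration of the JSON-RPC shape behind an MCP tools/call.
type JsonRpcRequest = { jsonrpc: "2.0"; id: number; method: string; params?: any };
type JsonRpcResponse = { jsonrpc: "2.0"; id: number; result: any };

function handleToolCall(req: JsonRpcRequest): JsonRpcResponse {
  if (req.method !== "tools/call") throw new Error("unsupported method");
  const { name, arguments: args } = req.params;
  // A real server would run the Slurm CLI here; we only echo the command.
  const text = name === "squeue" ? `would run: squeue -u ${args.user}` : "unknown tool";
  return { jsonrpc: "2.0", id: req.id, result: { content: [{ type: "text", text }] } };
}

const reply = handleToolCall({
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: { name: "squeue", arguments: { user: "alice" } },
});
console.log(reply.result.content[0].text); // would run: squeue -u alice
```

The agent picks the tool name from the model's output, sends a request like the one above to the server's stdin, and relays the `content` back to the model.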
```
hpcgpt/
  mcp_servers/
    illinois_chat_server/
      src/index.ts
      package.json
    slurm_mcp_server/
      src/index.ts
      package.json
  prompts/
    support.txt
  opencode.jsonc
  example.env
  example.env.atlassian
  README.md
  favicon.png
```
- slurm-mcp (local)
  - Tools: `accounts`, `sinfo`, `squeue`, `scontrol`
  - Purpose: query accounts, node/partition status, user jobs, and job details.
- illinois-chat-mcp (local)
  - Tools: `delta-docs`, `delta-ai-docs`
  - Purpose: answer questions from Delta and Delta AI documentation (requires `ILLINOIS_CHAT_API_KEY`).
- report-server (local)
  - Tools: `send_support_report`
  - Purpose: email a concise support report with conversation history and system info to the Delta support team.
- atlassian-mcp-server (container)
  - Tools (examples): Jira — `jira_get_issue`, `jira_search_issues`, `jira_create_issue`, `jira_add_comment`, `jira_transition_issue`; Confluence — `confluence_search`, `confluence_get_page`, `confluence_create_page`, `confluence_update_page` (availability depends on config/read-only mode).
  - Purpose: interact with Jira and Confluence for tickets and docs. See the Atlassian MCP project for details: https://github.com/sooperset/mcp-atlassian
Install Opencode and point it at this repo’s config:

```bash
curl -fsSL https://opencode.ai/install | bash
export OPENCODE_CONFIG=/absolute/path/to/this/repo/opencode.jsonc
opencode
```

MCP servers in `mcp_servers/*` use Bun/Node. From each server directory:

```bash
bun install
bun run build
bun run start
```

Opencode will also launch them automatically from the `opencode.jsonc` `mcp` section when enabled.
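As a rough sketch, a local MCP server entry in the `mcp` section might look like the following. The field names follow Opencode's MCP config schema as we understand it, and the command path here is hypothetical; the repo's own `opencode.jsonc` is the authoritative wiring:

```jsonc
{
  "mcp": {
    "slurm-mcp": {
      "type": "local",
      // Hypothetical command; see this repo's opencode.jsonc for the real one
      "command": ["bun", "run", "mcp_servers/slurm_mcp_server/src/index.ts"],
      "enabled": true
    }
  }
}
```

Setting `"enabled": false` is a quick way to switch a server off without deleting its entry.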
Use `example.env` and `example.env.atlassian` as references. Export directly or create a `.env`/`.env.atlassian`.

- `NCSA_LLM_URL` – Base URL for the NCSA Hosted models provider
- `ILLINOIS_CHAT_API_KEY` – Required for `illinois-chat-server`
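One minimal way to load a `.env` file into the current shell, for those who prefer a file over exporting directly (the values below are placeholders, not real endpoints):

```shell
# Create a throwaway .env for illustration; use example.env as the real template.
cat > .env <<'EOF'
NCSA_LLM_URL=https://llm.example.edu/v1
ILLINOIS_CHAT_API_KEY=replace-me
EOF

set -a        # auto-export every variable assigned while sourcing
source .env
set +a

echo "$NCSA_LLM_URL"   # prints https://llm.example.edu/v1
```

`set -a` (allexport) ensures the sourced variables are exported to child processes such as the MCP servers Opencode spawns.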
Configure `.env.atlassian` (see `example.env.atlassian`). Common options:

- `JIRA_URL`, `CONFLUENCE_URL`
- One of: personal token, API token, or OAuth BYOT
- Optional: `READ_ONLY_MODE`, `ENABLED_TOOLS`, proxy settings
Inside the Opencode TUI, pick a model (e.g., `ncsahosted/Qwen/Qwen3-VL-32B-Instruct`) and ask the assistant to use tools.

"Check the Delta GPU partitions and my running jobs."

The assistant will call `sinfo` and `squeue` via `slurm-mcp-server`.

"How do I submit a Slurm job on Delta?"

The assistant will call `delta-docs` with your question and return a synthesized answer with citations when available.

Run the `report` command in Opencode. This uses the `send_support_report` tool to email a concise summary to Delta support.
See `opencode.jsonc` for providers, models, and MCP server commands. Example provider entries:

```jsonc
{
  "provider": {
    "ncsahosted": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "my_provider_name",
      "options": {
        "baseURL": "{env:my_url}" // load the URL from an environment variable
      },
      "models": {
        "Qwen/Qwen3-VL-32B-Instruct": {
          "name": "my_model_name",
          "options": {
            "stream": true
          }
        }
      }
    }
  }
}
```

- Delta Chatbot: https://uiuc.chat/Delta-Documentation (course: Delta-Documentation)
- Delta AI Chatbot: https://uiuc.chat/DeltaAI-Documentation (course: DeltaAI-Documentation)
MIT – see LICENSE.