
Commit d277fc8

Merge pull request #59 from amd/dholanda/supplemental_redef
Update Supplemental and Backup Playbook Lists
2 parents 0ad81f0 + d8f1e7b

47 files changed: 226 additions & 108 deletions

Lines changed: 3 additions & 0 deletions
Lines changed: 12 additions & 0 deletions
@@ -0,0 +1,12 @@
+{
+  "id": "dify-ai-agents",
+  "title": "Build Node Based AI Agents and RAG Workflows with Dify",
+  "description": "Create AI agents and RAG workflows using Dify's visual node editor with llama.cpp on your STX Halo™",
+  "time": 60,
+  "platforms": ["linux", "windows"],
+  "difficulty": "intermediate",
+  "isNew": false,
+  "isFeatured": false,
+  "published": true,
+  "tags": ["dify", "agents", "rag", "llamacpp", "workflows"]
+}
Lines changed: 3 additions & 0 deletions
Lines changed: 12 additions & 0 deletions
@@ -0,0 +1,12 @@
+{
+  "id": "gguf-quantization-export",
+  "title": "Quantize and Export Models to GGUF",
+  "description": "Learn how to quantize and export models to GGUF format using llama.cpp on your STX Halo™",
+  "time": 60,
+  "platforms": ["linux", "windows"],
+  "difficulty": "intermediate",
+  "isNew": false,
+  "isFeatured": false,
+  "published": true,
+  "tags": ["gguf", "quantization", "llamacpp", "model-export"]
+}
Lines changed: 1 addition & 1 deletion
Lines changed: 12 additions & 0 deletions
@@ -0,0 +1,12 @@
+{
+  "id": "local-foundry",
+  "title": "Local Foundry",
+  "description": "Set up and use Local Foundry on your STX Halo™",
+  "time": 60,
+  "platforms": ["linux", "windows"],
+  "difficulty": "intermediate",
+  "isNew": true,
+  "isFeatured": false,
+  "published": true,
+  "tags": ["foundry", "local", "llm"]
+}
Lines changed: 3 additions & 0 deletions
Lines changed: 12 additions & 0 deletions
@@ -0,0 +1,12 @@
+{
+  "id": "mcp-research-agent",
+  "title": "Building a Research Agent Using MCP",
+  "description": "Build a research agent using the Model Context Protocol (MCP) on your STX Halo™",
+  "time": 60,
+  "platforms": ["linux", "windows"],
+  "difficulty": "intermediate",
+  "isNew": true,
+  "isFeatured": false,
+  "published": true,
+  "tags": ["mcp", "agents", "research", "llm"]
+}
Lines changed: 1 addition & 1 deletion
Lines changed: 12 additions & 0 deletions
@@ -0,0 +1,12 @@
+{
+  "id": "ollama-getting-started",
+  "title": "Getting Started with Ollama",
+  "description": "Install and run LLMs locally using Ollama on your STX Halo™",
+  "time": 30,
+  "platforms": ["linux", "windows"],
+  "difficulty": "beginner",
+  "isNew": false,
+  "isFeatured": false,
+  "published": true,
+  "tags": ["ollama", "llm", "inference", "local"]
+}
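The metadata files added above all share one schema (`id`, `title`, `description`, `time`, `platforms`, `difficulty`, `isNew`, `isFeatured`, `published`, `tags`). The repository's actual loader is not part of this diff, but a minimal sketch of how such playbook metadata could be parsed, schema-checked, and filtered might look like this; the `load_playbook` and `published_for_platform` helper names are hypothetical, not from the source:

```python
import json

# Hypothetical sketch: loads one playbook metadata document (the
# "Getting Started with Ollama" entry from this commit), validates it
# against the shared key set, and filters by platform/published flag.

PLAYBOOK_JSON = """
{
  "id": "ollama-getting-started",
  "title": "Getting Started with Ollama",
  "description": "Install and run LLMs locally using Ollama on your STX Halo™",
  "time": 30,
  "platforms": ["linux", "windows"],
  "difficulty": "beginner",
  "isNew": false,
  "isFeatured": false,
  "published": true,
  "tags": ["ollama", "llm", "inference", "local"]
}
"""

# Keys present in every playbook metadata file added in this commit.
REQUIRED_KEYS = {"id", "title", "description", "time", "platforms",
                 "difficulty", "isNew", "isFeatured", "published", "tags"}

def load_playbook(raw: str) -> dict:
    """Parse one playbook metadata document and check its schema."""
    meta = json.loads(raw)
    missing = REQUIRED_KEYS - meta.keys()
    if missing:
        raise ValueError(f"playbook {meta.get('id')!r} missing keys: {missing}")
    return meta

def published_for_platform(playbooks: list[dict], platform: str) -> list[dict]:
    """Return playbooks that are published and support the given platform."""
    return [p for p in playbooks
            if p["published"] and platform in p["platforms"]]

playbook = load_playbook(PLAYBOOK_JSON)
print(published_for_platform([playbook], "linux")[0]["id"])
```

A loader along these lines would reject a metadata file the moment a required key goes missing, which is useful when several near-identical JSON files are edited in one commit, as here.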
