Commit 01183a3

williamwclaude and Claude authored
Migrate deepseek-qwen-autogen-agent from Magic to Pixi (#68)
* Migrate deepseek-qwen-autogen-agent from Magic to Pixi

  ## Summary

  - Replaced Magic CLI with Pixi package manager throughout the recipe
  - Replaced global max-pipelines installation with modular package dependency
  - Updated all references from "MAX Serve" to "MAX" for consistency

  ## Changes

  - `.gitignore`: updated from "magic environments" to "Pixi environments", added `pixi.lock`
  - `README.md`:
    - Replaced Magic installation instructions with Pixi
    - Removed the `magic global install -u max-pipelines` command
    - Changed all `magic run` commands to `pixi run`
    - Updated from `magic init` to `git clone` for recipe download
    - Renamed "MAX Serve" to "MAX" throughout
  - `pyproject.toml`:
    - Added `modular = ">=25.5.0.dev2025070905,<26"` dependency to replace max-pipelines
    - Reordered channels to prioritize the max-nightly channel
  - `metadata.yaml`:
    - Updated tasks from `magic run` to `pixi run`
    - Updated titles from "MAX Serve" to "MAX"

  🤖 Generated with [Claude Code](https://claude.ai/code)

* Fix system requirements link to use FAQ page

  Updated the system requirements reference to point to the FAQ page, as per the PR 65 format: https://docs.modular.com/max/faq/#system-requirements

* Fix rich dependency version conflict in deepseek-qwen-autogen-agent

  Updated the rich dependency constraint from `<14` to `<15` to resolve CI test failures caused by a version conflict with rich==14.0.0.

---------

Co-authored-by: Claude <noreply@anthropic.com>
1 parent 795ab9f commit 01183a3
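The last commit in this squash is a classic resolver conflict: the recipe pinned `rich>=13.9.4,<14`, which cannot match the `rich==14.0.0` that CI pulled in, so the upper bound was widened to `<15`. A toy, stdlib-only range check (a hypothetical helper for illustration, not how pip or Pixi actually resolve versions) shows why the widened bound fixes it:

```python
def as_tuple(version: str) -> tuple[int, ...]:
    # "14.0.0" -> (14, 0, 0); enough for plain numeric versions.
    return tuple(int(part) for part in version.split("."))

def in_range(version: str, lower: str, upper: str) -> bool:
    # Models a ">=lower,<upper" constraint for simple numeric versions.
    return as_tuple(lower) <= as_tuple(version) < as_tuple(upper)

# rich==14.0.0 violates the old ">=13.9.4,<14" constraint...
print(in_range("14.0.0", "13.9.4", "14"))   # False
# ...but satisfies the widened ">=13.9.4,<15" constraint.
print(in_range("14.0.0", "13.9.4", "15"))   # True
```

Real specifiers handle pre-releases, epochs, and wildcards, so treat this only as a sketch of the bound comparison.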

File tree

4 files changed: +28 −32 lines changed
.gitignore

Lines changed: 2 additions & 3 deletions

```diff
@@ -1,6 +1,5 @@
 
-# pixi environments
+# Pixi environments
 .pixi
 *.egg-info
-# magic environments
-.magic
+pixi.lock
```

deepseek-qwen-autogen-agent/README.md

Lines changed: 16 additions & 22 deletions

````diff
@@ -1,10 +1,10 @@
-# Learn How to Build AI Agents with DeepSeek-R1, AutoGen and MAX Serve
+# Learn How to Build AI Agents with DeepSeek-R1, AutoGen and MAX
 
 This recipe demonstrates how to build AI agents using:
 
 * [`DeepSeek-R1-Distill-Qwen-7B`](https://builds.modular.com/models/DeepSeek-R1-Distill-Qwen/7B) model that runs on GPU
 * [AutoGen](https://microsoft.github.io/autogen/stable/) framework for multi-agent conversations
-* [MAX Serve](https://docs.modular.com/max/serve/) for efficient model serving and inference
+* [MAX](https://docs.modular.com/max/serve/) for efficient model serving and inference
 * [Rich](https://rich.readthedocs.io/en/stable/introduction.html) Python library for beautiful terminal interfaces
 
 We'll create two example applications that showcase:
@@ -22,24 +22,18 @@ The patterns demonstrated here can be adapted for various agent-based applications
 
 ## Requirements
 
-Please make sure your system meets our [system requirements](https://docs.modular.com/max/get-started).
+Please make sure your system meets our [system requirements](https://docs.modular.com/max/faq/#system-requirements).
 
-To proceed, ensure you have the `magic` CLI installed with the `magic --version` to be **0.7.2** or newer:
+To proceed, ensure you have the `pixi` CLI installed:
 
 ```bash
-curl -ssL https://magic.modular.com/ | bash
+curl -fsSL https://pixi.sh/install.sh | sh
 ```
 
-or update it via:
+...and updated to the latest version:
 
 ```bash
-magic self-update
-```
-
-Then install `max-pipelines` via:
-
-```bash
-magic global install -u max-pipelines
+pixi self-update
 ```
 
 ### GPU requirements
@@ -52,33 +46,33 @@ This recipe requires a GPU with CUDA 12.5 support. Recommended GPUs:
 
 ## Quick start
 
-1. Download the code for this recipe using the `magic` CLI:
+1. Download the code for this recipe:
 
 ```bash
-magic init deepseek-qwen-autogen-agent --from modular/max-recipes/deepseek-qwen-autogen-agent
-cd deepseek-qwen-autogen-agent
+git clone https://github.com/modularml/max-recipes.git
+cd max-recipes/deepseek-qwen-autogen-agent
 ```
 
-2. Run the MAX Serve server via in a terminal:
+2. Run the MAX server via in a terminal:
 
 **Make sure the port `8010` is available. You can adjust the port settings in [pyproject.toml](./pyproject.toml).**
 
 ```bash
-magic run server
+pixi run server
 ```
 
 3. In a new terminal, run either example:
 
 * For the chat agent:
 
 ```bash
-magic run chat_agent
+pixi run chat_agent
 ```
 
 * For the screenplay development team:
 
 ```bash
-magic run screenplay_agents
+pixi run screenplay_agents
 ```
 
 The agents will be ready when you see the welcome message in your terminal.
@@ -138,7 +132,7 @@ assistant = AssistantAgent(
 
 Key features:
 
-* Uses DeepSeek-R1 model through MAX Serve
+* Uses DeepSeek-R1 model through MAX
 * Configurable temperature for response creativity
 * Adjust `max_tokens` for longer conversations
@@ -399,7 +393,7 @@ This recipe can be adapted for various applications:
 Common issues and solutions:
 
 1. **Server Connection Issues**
-* Ensure MAX Serve is running (`magic run server`)
+* Ensure MAX is running (`pixi run server`)
 * Check if the default port `8010` is available
 * Verify network connectivity
````

deepseek-qwen-autogen-agent/metadata.yaml

Lines changed: 5 additions & 5 deletions

```diff
@@ -1,6 +1,6 @@
 version: 1.0
-long_title: "Learn How to Build AI Agents with DeepSeek-R1, AutoGen and MAX Serve"
-short_title: "AI Agent with DeepSeek-R1, AutoGen and MAX Serve"
+long_title: "Learn How to Build AI Agents with DeepSeek-R1, AutoGen and MAX"
+short_title: "AI Agent with DeepSeek-R1, AutoGen and MAX"
 author: "Ehsan M. Kermani"
 author_image: "author/ehsan.jpg"
 author_url: "https://www.linkedin.com/in/ehsanmkermani/"
@@ -15,6 +15,6 @@ tags:
 - autogen
 
 tasks:
-- magic run server
-- magic run chat_agent
-- magic run screenplay_agents
+- pixi run server
+- pixi run chat_agent
+- pixi run screenplay_agents
```
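The metadata.yaml change is purely mechanical: every task string swaps a leading `magic run` for `pixi run`. Anyone migrating other recipes in bulk could script the same rewrite; this is a sketch with a hypothetical helper, assuming task strings always put the runner name first:

```python
def migrate_task(task: str) -> str:
    # Rewrite a leading "magic run " to "pixi run "; leave other
    # task strings (e.g. plain python commands) untouched.
    prefix = "magic run "
    if task.startswith(prefix):
        return "pixi run " + task[len(prefix):]
    return task

tasks = ["magic run server", "magic run chat_agent", "magic run screenplay_agents"]
print([migrate_task(t) for t in tasks])
# → ['pixi run server', 'pixi run chat_agent', 'pixi run screenplay_agents']
```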

deepseek-qwen-autogen-agent/pyproject.toml

Lines changed: 5 additions & 2 deletions

```diff
@@ -1,6 +1,6 @@
 [project]
 authors = [{ name = "Modular Inc", email = "hello@modular.com" }]
-dependencies = ["autogen-agentchat==0.4.7", "autogen-ext[openai]==0.4.7", "rich>=13.9.4,<14", "python-chess>=1.999,<2", "tenacity>=9.0.0,<10", "requests>=2.32.3,<3"]
+dependencies = ["autogen-agentchat==0.4.7", "autogen-ext[openai]==0.4.7", "rich>=13.9.4,<15", "python-chess>=1.999,<2", "tenacity>=9.0.0,<10", "requests>=2.32.3,<3"]
 name = "deepseek-qwen-autogen-agent"
 requires-python = ">=3.10,<3.13"
 version = "0.0.0"
@@ -16,12 +16,15 @@ requires = ["hatchling"]
 packages = ["."]
 
 [tool.pixi.project]
-channels = ["conda-forge", "https://conda.modular.com/max-nightly"]
+channels = ["https://conda.modular.com/max-nightly", "conda-forge"]
 platforms = ["linux-64", "linux-aarch64", "osx-arm64"]
 
 [tool.pixi.pypi-dependencies]
 deepseek_qwen_autogen_agent = { path = ".", editable = true }
 
+[tool.pixi.dependencies]
+modular = ">=25.5.0.dev2025070905,<26"
+
 [tool.pixi.tasks]
 server = "MAX_SERVE_PORT=8010 max serve --model-path deepseek-ai/DeepSeek-R1-Distill-Qwen-7B --max-length 16384 --max-batch-size 1"
 chat_agent = "python chat_agent.py"
```

Comments (0)