`.gitignore`:

```diff
@@ -186,6 +186,7 @@ eng_plans/
 
 # RL training artifacts
 rl/models/*.zip
+!rl/models/cuttle_rl_final.zip
 rl/logs/
 
 .DS_Store
```
`Makefile`:

```diff
@@ -14,6 +14,14 @@
 
 run-with-rl:
 	source $(VENV_NAME)/bin/activate && PYTHONPATH=$(CURRENT_DIR) python main_with_rl_ai.py
+
+# Dockerized dev environment (backend + Vite)
+dev:
+	docker compose -f docker-compose.dev.yaml up --build -d
+
+dev-down:
+	docker compose -f docker-compose.dev.yaml down
+
 # Generate documentation using pdoc
 docs:
 	source $(VENV_NAME)/bin/activate && PYTHONPATH=$(CURRENT_DIR) python docs.py
@@ -57,4 +65,4 @@ test-rl:
 	@echo "Quick RL training test with action masking (10K timesteps, ~2-3 minutes)..."
 	source $(VENV_NAME)/bin/activate && PYTHONPATH=$(CURRENT_DIR) python -c \
 	"from rl import config; config.TRAINING_CONFIG['total_timesteps'] = 10000; \
-	exec(open('rl/train.py').read())"
+	exec(open('rl/train.py').read())"
```
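The new `dev` target expects a `docker-compose.dev.yaml` at the repo root, which is not part of this diff. As a hedged sketch only: the service names, build contexts, and volume mounts below are assumptions; only the two ports and the uvicorn command are taken from the accompanying README changes.

```yaml
# Hypothetical docker-compose.dev.yaml sketch -- service names, build
# contexts, and volume mounts are illustrative assumptions, not taken
# from this diff.
services:
  backend:
    build: .
    command: uvicorn server.app:app --reload --host 0.0.0.0 --port 8000
    volumes:
      - .:/app          # bind-mount source so --reload picks up edits
    ports:
      - "8000:8000"
  web:
    build: ./web
    command: npm run dev -- --host
    volumes:
      - ./web:/app      # bind-mount for Vite hot reload
    ports:
      - "5173:5173"
```

Bind-mounting the source trees is what makes `make dev` a hot-reload environment rather than a baked image.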
README:

````diff
@@ -2,17 +2,47 @@
 
 
 # Set Up
+## Local dev (no Docker)
 ## Create a virtual environment
 
 ```bash
 python3 -m venv cuttle-bot-3.12
 source ./cuttle-bot-3.12/bin/activate
 ```
 
+Or use the Makefile helper (requires `python3.12` on PATH):
+
+```bash
+make setup
+```
 
 ## Install requirements
 
 ```bash
 pip install -r requirements.txt
 ```
 
+## Run the dev servers
+
+Backend API (FastAPI + reload):
+
+```bash
+uvicorn server.app:app --reload --host 0.0.0.0 --port 8000
+```
+
+Frontend (Vite):
+
+```bash
+cd web && npm run dev
+```
+
+Open http://localhost:5173
+
+## Docker dev (hot reload)
+
+```bash
+make dev
+```
+
+Open http://localhost:5173 (API at http://localhost:8000).
+
````
````diff
@@ -19,5 +49,15 @@
 ## Set up AI player
 
+The game currently supports two types of AI player: an RL-based AI and an LLM-based AI.
+
+### RL-based AI
+
+The repo ships with a trained model zip file that the game server loads.
+
+The model can also be trained locally (see the later sections).
+
+### LLM-based AI
+
 The AI player uses ollama to generate actions. You'll need to install ollama and set up a model.
 
 Follow the installation guide here: https://github.com/ollama/ollama
````
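The server's model-loading code is not shown in this diff. As an illustration only, here is a stdlib sketch of the kind of preflight check one might run before handing `rl/models/cuttle_rl_final.zip` to the loader; the helper name is hypothetical, and the only assumed fact is that the model is stored as a zip archive (as the `.zip` extension and gitignore rules indicate).

```python
import zipfile
from pathlib import Path

def model_zip_ok(path: str) -> bool:
    """Return True if `path` exists and is a readable zip archive.

    Hypothetical helper, not the repo's actual loading code. Models
    saved by stable-baselines3-style trainers are zip files, so
    `zipfile.is_zipfile` is a cheap integrity check before loading.
    """
    p = Path(path)
    return p.is_file() and zipfile.is_zipfile(p)
```

For example, a server entry point could call `model_zip_ok("rl/models/cuttle_rl_final.zip")` at startup and fail fast with a clear error instead of a deep stack trace from the loader.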
````diff
@@ -101,3 +141,4 @@ Adjust training parameters in `rl/config.py`:
 - Models saved to: `rl/models/`
 - Training logs: `rl/logs/` (view with TensorBoard)
 - Checkpoints every 10K timesteps
+- Checkpoints are gitignored, but the final model, `cuttle_rl_final.zip`, is tracked in git
````
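The `make test-rl` recipe works by mutating `config.TRAINING_CONFIG['total_timesteps']` before exec'ing `rl/train.py`. A non-mutating variant of that override pattern can be sketched with a stand-in dict; the `checkpoint_freq` key and the 500K default are assumptions (only `total_timesteps = 10000` and "checkpoints every 10K timesteps" appear in this diff).

```python
# Stand-in for rl/config.py's TRAINING_CONFIG; both values here are
# assumptions for illustration, not the repo's real defaults.
TRAINING_CONFIG = {
    "total_timesteps": 500_000,  # assumed default
    "checkpoint_freq": 10_000,   # "checkpoints every 10K timesteps"
}

def with_overrides(base: dict, **overrides) -> dict:
    """Return a copy of `base` with selected keys replaced,
    leaving the original config untouched."""
    cfg = dict(base)
    cfg.update(overrides)
    return cfg

# Quick smoke-test settings, mirroring what `make test-rl` does in-place:
quick = with_overrides(TRAINING_CONFIG, total_timesteps=10_000)
```

Returning a copy avoids the main caveat of the Makefile's approach: mutating the shared config module affects every later import in the same process.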
This file was deleted.