This repository implements AI agents for Dhumbal, a South Asian multiplayer card game with imperfect information. The project, detailed in "AI Agents for the Dhumbal Card Game: A Comparative Study", evaluates rule-based (Aggressive, Conservative, Balanced, Opportunistic), search-based (MCTS, ISMCTS), learning-based (DQN, PPO), and random agents through 1024-round tournaments. It includes the game environment, agent implementations, statistical analysis, and visualizations, supporting game-AI research and cultural preservation.
- Game Environment: Python-based Dhumbal implementation (2–5 players).
- AI Agents: Random, rule-based, MCTS, ISMCTS, DQN, PPO.
- Evaluation: Win rate, economic performance, Jhyap success, decision efficiency.
- Analysis: Welch’s t-tests, Cohen’s d, 95% CI, visualized via Jupyter notebooks.
- Reproducibility: Fixed random seed (42), Python 3.9+, TensorFlow 2.8.
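The statistical analysis above (Welch's t-test, Cohen's d, 95% CI) can be sketched as follows. The per-round scores are simulated placeholder data, not results from the repository; only the formulas are the point here:

```python
import numpy as np

rng = np.random.default_rng(42)  # the project's fixed seed
# Hypothetical per-round scores for two agents over 1024 rounds
scores_a = rng.normal(5.0, 2.0, size=1024)
scores_b = rng.normal(4.5, 2.5, size=1024)

# Welch's t-statistic (no equal-variance assumption)
va, vb = scores_a.var(ddof=1), scores_b.var(ddof=1)
na, nb = len(scores_a), len(scores_b)
diff = scores_a.mean() - scores_b.mean()
se = np.sqrt(va / na + vb / nb)
t_stat = diff / se

# Cohen's d with a pooled standard deviation
d = diff / np.sqrt((va + vb) / 2)

# 95% confidence interval for the mean difference (normal approximation)
ci = (diff - 1.96 * se, diff + 1.96 * se)
```

Welch's variant is the right default when the two agents' score variances differ, which is typical across agent categories.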
Clone Repository:
```bash
git clone https://github.com/sahajrajmalla/dhumbal-ai.git
cd dhumbal-ai
```
Virtual Environment:
```bash
python -m venv venv
source venv/bin/activate  # Windows: venv\Scripts\activate
```
Install Dependencies:
```bash
pip install -r requirements.txt
```
Quick Demo:
```bash
python quick_demo.py
```
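The demo pits agents against the environment in a standard act/step loop. A minimal sketch of the baseline agent interface follows; the class name, `act` signature, and the commented environment calls are illustrative, not the repository's actual API:

```python
import random

class RandomAgent:
    """Illustrative baseline agent: picks uniformly among legal actions."""
    def act(self, observation, legal_actions):
        return random.choice(legal_actions)

# Hypothetical usage against an environment exposing reset()/step():
# env = DhumbalEnv(num_players=4)
# obs, legal = env.reset()
# while not env.done:
#     obs, legal = env.step(agent.act(obs, legal))
```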
Tournaments:
- Rule-Based:
  ```bash
  python agents/rule_based/rule_based_agent.py
  ```
- Search-Based:
  ```bash
  python agents/search_based/single_process_search.py
  ```
- Learning-Based:
  ```bash
  python agents/learning_based/compete.py
  ```
- Championship:
  ```bash
  python championship/final.py
  ```
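A tournament run reduces to playing many rounds and tallying wins per agent. The sketch below shows only that bookkeeping; the random winner draw is a stand-in for actually playing a round, and `run_tournament` is a hypothetical helper, not a function from this repository:

```python
import random
from collections import Counter

def run_tournament(agents, rounds=1024, seed=42):
    """Play `rounds` rounds and return win counts per agent name."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    wins = Counter()
    for _ in range(rounds):
        winner = rng.choice(list(agents))  # placeholder for a real round
        wins[winner] += 1
    return wins

wins = run_tournament(["Aggressive", "Conservative", "Balanced", "Random"])
win_rates = {name: n / 1024 for name, n in wins.items()}
```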
Visualizations:
```bash
cd visualization
jupyter notebook
```
Run `performance_metrics.ipynb` and `statistical_analysis.ipynb`.
Training:
- DQN:
  ```bash
  python agents/learning_based/dqn/dqn.py
  ```
- PPO:
  ```bash
  python agents/learning_based/ppo/ppo.py
  ```
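DQN training typically combines an experience-replay buffer with an epsilon-greedy exploration policy. A minimal, framework-free sketch of those two pieces follows; the capacity and other values are illustrative, not the repository's hyperparameters:

```python
import random
from collections import deque

class ReplayBuffer:
    """Fixed-size FIFO buffer of (state, action, reward, next_state, done)."""
    def __init__(self, capacity=10_000):
        self.buffer = deque(maxlen=capacity)  # old transitions are evicted

    def push(self, transition):
        self.buffer.append(transition)

    def sample(self, batch_size):
        return random.sample(self.buffer, batch_size)

def epsilon_greedy(q_values, epsilon, rng=random):
    """With probability epsilon pick a random action, else the greedy one."""
    if rng.random() < epsilon:
        return rng.randrange(len(q_values))
    return max(range(len(q_values)), key=lambda a: q_values[a])
```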
```
dhumbal-ai/
├── agents/
│   ├── rule_based/      # Rule-based agents and results
│   ├── search_based/    # MCTS, ISMCTS, and results
│   ├── learning_based/  # DQN, PPO, training, and results
│   └── hybrid/          # Placeholder for hybrid agents
├── championship/        # Cross-category tournament scripts and results
├── visualization/       # Notebooks and plots for analysis
├── quick_demo.py        # Demo script
├── README.md            # This file
├── LICENSE              # MIT License
└── .gitignore
```
Fork the repository, create a feature branch, commit your changes, and submit a pull request. Follow PEP 8 and document your code.
MIT License (see LICENSE).
Sahaj Raj Malla: [email protected]