Contest challenges for GDG Algiers — GCPC 2026.
| Label | Name | Author | Difficulty | Status |
|---|---|---|---|---|
| A | Dungeon Escape | Lyes Boudjabout | Easy | Published |
| B | Festival Queue Merges | Lyes Boudjabout | Easy | Published |
| C | Relief Distribution | Lyes Boudjabout | Easy | Published |
| D | Latency Budget | Bouzara Zakaria | Easy | Published |
| E | Arduino Breadboard Setup | Raouf Ould Ali | Medium | Published |
| F | Circuit Breaker | Bouzara Zakaria | Medium | Published |
| G | Cloud Battle | Redhouane Abdellah | Medium | Published |
| H | Emergency Lane | tarek-ait | Medium | Published |
| I | Permutation Riddle | Redhouane Abdellah | Medium | Published |
| J | The Sarrus Oracle | tarek-ait | Medium | Published |
| K | Rassim Sort | Redhouane Abdellah | Medium | Published |
| L | Midnight Relay Tour | tarek-ait | Medium | Published |
| M | DzNet Signal Coverage | Nabil Ghemam Djeridi | Medium | Published |
| N | AI Simulation on Uncle Island | Firas Mohamed Elamine Kiram | Hard | Published |
| O | Hoggar Trail | Firas Mohamed Elamine Kiram | Hard | Published |
| P | Card Tricks | Raouf Ould Ali | Hard | Published |
| Q | Quantum Pyramid | Raouf Ould Ali | Hard | Published |
Status lifecycle: `draft` → `ready` → `tested` → `published`.

Update this table and `contest.yaml` together when adding/changing a problem.
- Copy `challenge_template/` → `challenges/<problem-id>/`
  - Use a short lowercase slug as the folder name, e.g. `roberyorallo-nokia`
  - This folder name becomes the problem's external ID in DOMjudge
- Generate a UUID and put it in `problem.yaml`: `python3 -c "import uuid; print(uuid.uuid4())"`
- Fill in `problem.yaml` (name, time_limit) and `domjudge-problem.ini` (color)
- Write the statement in `statement/problem.md`
- Add test cases to `data/sample/` and `data/secret/` as `.in`/`.ans` pairs
- Replace the skeleton solutions in `submissions/accepted/` with real solutions in all 6 languages (c, cpp, py, go, js, java)
- Write the editorial in `editorial/editorial.md`
- Add an entry to `contest.yaml` and update the table above
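The first few steps above can be run end-to-end as a quick sketch. This demos against a throwaway copy of the template so it is self-contained; the slug `dungeon-escape` and the `problem.yaml` fields shown are illustrative, not taken from a real problem:

```shell
set -e
cd "$(mktemp -d)"
# Stand-in for the repo's challenge_template/ (throwaway, for the demo only)
mkdir -p challenge_template/statement challenge_template/data/sample challenge_template/data/secret
mkdir challenges

PROBLEM=dungeon-escape                      # short lowercase slug = DOMjudge external ID
cp -r challenge_template "challenges/$PROBLEM"

# Generate the UUID and seed problem.yaml
UUID=$(python3 -c "import uuid; print(uuid.uuid4())")
printf 'uuid: %s\nname: Dungeon Escape\ntime_limit: 2\n' "$UUID" \
  > "challenges/$PROBLEM/problem.yaml"
cat "challenges/$PROBLEM/problem.yaml"
```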
| Folder | Visible to contestants? | Purpose |
|---|---|---|
| `data/sample/` | Yes — shown in problem statement | The same examples printed in the PDF. Contestants can run these themselves. |
| `data/secret/` | No — hidden during contest | The real judging test cases. Cover edge cases, large inputs, stress tests, etc. |
DOMjudge imports test cases by scanning for `*.in` / `*.ans` pairs.
Do not use `.txt` — DOMjudge ignores files that don't match these extensions.
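An unpaired `.in` will simply be skipped at import, so it is worth checking the pairing before pushing. A small sketch that mirrors the scan, demoed on throwaway files (the file names are illustrative):

```shell
set -e
cd "$(mktemp -d)"                       # demo on a throwaway directory
mkdir -p data/sample
printf '5\n'  > data/sample/1.in
printf '25\n' > data/sample/1.ans

# Every .in must have a sibling .ans with the same stem
for f in data/*/*.in; do
  pair="${f%.in}.ans"
  [ -f "$pair" ] || { echo "missing $pair"; exit 1; }
done
echo "all .in files paired"
```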
```
data/
├── sample/
│   ├── 1.in    ← sample test case 1 input
│   └── 1.ans   ← sample test case 1 expected output
└── secret/
    ├── 01.in   ← zero-padded so they sort correctly
    ├── 01.ans
    ├── 01.desc ← one-liner describing what this case tests (for authors only)
    ├── 02.in
    ├── 02.ans
    └── 02.desc
```
Optional one-line description of what a secret test case tests. DOMjudge stores this and shows it to the jury. Examples:

- `edge case: n=1`
- `maximum input: n=100000, all elements equal`
- `random large input`
- `adversarial case: strictly decreasing sequence`
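Zero-padded cases and their `.desc` files are easy to generate in a loop. In this sketch the "reference solution" is a stand-in (doubling via `awk`), not a real problem's answer logic, and it runs in a throwaway directory:

```shell
set -e
cd "$(mktemp -d)"
mkdir -p data/secret

for i in 1 2 3; do
  id=$(printf '%02d' "$i")                  # zero-pad: 01, 02, ... so files sort correctly
  printf '%s\n' "$i" > "data/secret/$id.in"
  # A reference solution produces the .ans; doubling is a stand-in here
  awk '{print $1 * 2}' "data/secret/$id.in" > "data/secret/$id.ans"
  echo "random input $i" > "data/secret/$id.desc"
done
ls data/secret
```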
- No Windows line endings (`\r\n`) — use Unix line endings (`\n`) only
- Both `.in` and `.ans` must end with a newline
- Sample test cases must exactly match the examples printed in `statement/problem.md`
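The first two rules above can be checked mechanically. A hedged sketch, demoed on throwaway files:

```shell
set -e
cd "$(mktemp -d)"                        # demo files; run the loop against real data/ in practice
mkdir -p data/secret
printf '1 2\n' > data/secret/01.in
printf '3\n'   > data/secret/01.ans

CR=$(printf '\r')
for f in data/*/*.in data/*/*.ans; do
  # Rule 1: no carriage returns anywhere in the file
  if grep -q "$CR" "$f"; then echo "CRLF in $f"; exit 1; fi
  # Rule 2: last byte must be \n ($(...) strips it, leaving an empty string)
  [ "$(tail -c 1 "$f")" = "" ] || { echo "no trailing newline in $f"; exit 1; }
done
echo "all test files OK"
```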
When a submission is judged, DOMjudge returns one of these verdicts:
| Verdict | Meaning |
|---|---|
| AC — Accepted | Output matches `.ans` for all test cases within time and memory limits |
| WA — Wrong Answer | Output doesn't match `.ans` on at least one test case |
| TLE — Time Limit Exceeded | Solution ran longer than `timelimit` seconds on at least one case |
| MLE — Memory Limit Exceeded | Solution used more memory than the memory limit |
| RTE — Run-Time Error | Solution crashed (segfault, exception, non-zero exit code) |
| CE — Compile Error | Solution failed to compile |
| NO — No Output | Solution produced empty output |
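How a diff-based judge maps a single run onto these verdicts can be sketched as follows. The 2-second limit, the toy solution, and the use of `timeout` (exit code 124 on timeout) are illustrative, not DOMjudge's actual internals:

```shell
set -u
cd "$(mktemp -d)"
printf '#!/bin/sh\necho 42\n' > sol      # toy "solution" that ignores input
chmod +x sol
printf '\n'   > 1.in
printf '42\n' > 1.ans

timeout 2 ./sol < 1.in > 1.out 2>/dev/null
status=$?
if   [ $status -eq 124 ]; then VERDICT=TLE   # killed by timeout
elif [ $status -ne 0   ]; then VERDICT=RTE   # crashed / non-zero exit
elif diff -q 1.out 1.ans > /dev/null; then VERDICT=AC
else VERDICT=WA
fi
echo "$VERDICT"   # → AC for this toy case
```

(A real judge also enforces the memory limit and treats empty output specially; those are omitted here.)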
All accepted solutions must be provided in all 6 languages:
| Language | Extension | Notes |
|---|---|---|
| C | `.c` | gcc, C17 |
| C++ | `.cpp` | g++, C++17 |
| Java | `.java` | OpenJDK — class name must match filename |
| Python 3 | `.py` | CPython 3 |
| Go | `.go` | Go 1.20 (custom-installed on judgehosts) |
| JavaScript | `.js` | Node.js |
Run from inside the problem directory:

```
cd challenges/<problem-id>

# pre-compiled binary
../../scripts/solution_test.sh ./sol

# auto-compiled/interpreted source
../../scripts/solution_test.sh submissions/accepted/sol.cpp
../../scripts/solution_test.sh submissions/accepted/sol.py
../../scripts/solution_test.sh submissions/accepted/sol.go
../../scripts/solution_test.sh submissions/accepted/sol.java
```

The script tests all cases in `data/sample/` first, then `data/secret/`, and reports AC / WA / TLE per case. It exits non-zero if any case fails.
Authors write Markdown in `statement/problem.md`. Run from inside the challenge directory:

```
cd challenges/<problem-id>
../../scripts/render-pdf.sh
```

Install once (Linux):

```
sudo apt install pandoc texlive-xetex texlive-fonts-recommended texlive-latex-extra
```

macOS:

```
brew install pandoc && brew install --cask basictex
sudo tlmgr update --self && sudo tlmgr install collection-xetex collection-latexextra
```

Windows: install Pandoc + MiKTeX (auto-downloads missing LaTeX packages on first run).

No local install needed? Push your `.md` — the GitHub Actions workflow renders the PDF automatically and uploads it as a downloadable artifact.