This guide covers three different setup approaches for running the e-footprint-interface application locally. Choose the one that best fits your development needs.
Option 1: Full Local Development (SQLite)
Best for: Running front-end tests quickly, rapid iteration, minimal setup
Runs everything locally (Django + SQLite) without Docker. This is the fastest option for E2E testing since there's no Docker overhead.
Pros:
- Fastest front-end test execution
- Minimal dependencies
- Quick startup/restart
- No Docker required
Cons:
- Uses SQLite instead of PostgreSQL (different from production)
- No debugging of PostgreSQL-specific issues
Jump to: Full Local Setup
Option 2: Hybrid Local Setup (PostgreSQL in Docker, Django local)
Best for: Active development with IDE debugging and a production-like database
Runs PostgreSQL in Docker while running Django locally. This gives you IDE debugging capabilities with a production-like database.
Pros:
- Full IDE debugging support
- Production-like PostgreSQL database
- Hot reload for code changes
- Fast iteration cycle
Cons:
- Slower front-end tests than full local (Docker overhead)
- Requires Docker for PostgreSQL
Jump to: Hybrid Local Setup
Option 3: Full Docker Environment
Best for: Testing in a production-like environment, sharing consistent setups
Runs all services (Django, PostgreSQL, Traefik) in Docker containers. See the Docker Setup Guide for details.
Pros:
- Production-like environment
- Consistent across machines
- No local Python/Node setup needed
Cons:
- Slower iteration cycle
- Limited debugging capabilities
- Docker resource overhead
Jump to: Full Docker Setup
This setup runs Django with SQLite locally - no Docker required. Perfect for fast front-end test execution.
- Poetry: Follow instructions on python-poetry.org
- Node.js: Install via nvm
Follow the instructions on the official Poetry website, then install the Python dependencies:

```shell
poetry install
```

Download and install nvm (Node Version Manager) from https://github.com/nvm-sh/nvm, then install Node:

```shell
nvm install node
```

Check the installation:

```shell
node --version
npm --version
```

Install JavaScript dependencies:

```shell
npm install
```

Build the result charts bundle (required):

```shell
npm run build:result-charts:dev
```

If you edit files under theme/static/scripts/result_charts/, rebuild the bundle (or run a watcher):

```shell
npm run build:result-charts:dev -- --watch
```

The SQLite database will be created automatically:

```shell
poetry run python manage.py migrate
poetry run python manage.py createcachetable
poetry run python manage.py collectstatic
```

Create a superuser:

```shell
poetry run python manage.py createsuperuser
```

Start Bootstrap CSS compilation (in one terminal):

```shell
npm run watch
```

Run the Django development server (in another terminal):

```shell
poetry run python manage.py runserver
```

To run and debug from your IDE instead:

- Open Run → Edit Configurations...
- Add a Django Server configuration:
  - Name: Django Local (SQLite)
  - Host: 0.0.0.0
  - Port: 8000
  - Environment variables: leave empty (defaults to SQLite)
- Run or Debug this configuration
- Set breakpoints in your code for interactive debugging!
Open your browser to: http://localhost:8000
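If you want to verify from a script that the dev server is answering before opening the browser, a purely illustrative standard-library check (the function name is our own, not part of the project) could look like this:

```python
from urllib.error import HTTPError, URLError
from urllib.request import urlopen


def server_is_up(url="http://localhost:8000", timeout=2.0):
    """Return True if something answers HTTP at `url` (any status code)."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return resp.status < 600
    except HTTPError:
        # The server answered, even if with an error status (e.g. 404).
        return True
    except (URLError, OSError):
        # Nothing listening, connection refused, or timeout.
        return False
```

This treats any HTTP response, including error pages, as "up", and only reports False when the connection itself fails.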
This setup runs PostgreSQL in Docker while running Django locally in your IDE with full debugging support. This gives you a production-like database while maintaining fast development iteration.
- Docker Desktop: Install from docker.com
- Poetry: Follow instructions on python-poetry.org
- Node.js: Install via nvm
Follow the instructions on the official Poetry website, then install the Python dependencies:

```shell
poetry install
```

Download and install nvm (Node Version Manager) from https://github.com/nvm-sh/nvm, then install Node:

```shell
nvm install node
```

Check the installation:

```shell
node --version
npm --version
```

Install JavaScript dependencies:

```shell
npm install
```

Build the result charts bundle (required):

```shell
npm run build:result-charts:dev
```

If you edit files under theme/static/scripts/result_charts/, rebuild the bundle (or run a watcher):

```shell
npm run build:result-charts:dev -- --watch
```

Start the PostgreSQL database container:

```shell
docker compose -f docker-compose.infra.yml up -d
```

This will start only the PostgreSQL database. Django will run locally on your machine.
Note: If you want to run the full production-like Docker environment (including Traefik reverse proxy), see Option 3 instead.
Create a .env.local file in the root directory:

```shell
cat > .env.local << EOF
DJANGO_DOCKER=False
DATABASE_URL=postgresql://root:kakoukakou@localhost:5432/efootprint
EOF
```

Set up the database:

```shell
poetry run python manage.py migrate
poetry run python manage.py createcachetable
poetry run python manage.py collectstatic
```

Create a superuser:

```shell
poetry run python manage.py createsuperuser
```

Start Bootstrap CSS compilation (in one terminal):

```shell
npm run watch
```

Run the Django development server (in another terminal):

```shell
poetry run python manage.py runserver
```

Start the infrastructure:
- Open Run → Edit Configurations...
- Add a Docker Compose configuration:
  - Name: Infrastructure (Postgres)
  - Compose file: select docker-compose.infra.yml
- Run this configuration

Start Django with debugging:

- Open Run → Edit Configurations...
- Add a Django Server configuration:
  - Name: Django Local (Debug)
  - Host: 0.0.0.0
  - Port: 8000
  - Environment variables: load from .env.local
- Run or Debug this configuration
- Set breakpoints in your code for interactive debugging!
Open your browser to: http://localhost:8000
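The DATABASE_URL set in .env.local above is a standard connection URL. As an illustration only (not project code; the helper name is our own), the standard library can show the pieces it encodes, which correspond to the keys of a Django DATABASES entry:

```python
from urllib.parse import urlsplit


def split_database_url(url):
    """Break a DATABASE_URL into the parts a Django DATABASES dict uses."""
    parts = urlsplit(url)
    return {
        "USER": parts.username,
        "PASSWORD": parts.password,
        "HOST": parts.hostname,
        "PORT": parts.port,
        "NAME": parts.path.lstrip("/"),  # path component minus leading slash
    }


print(split_database_url("postgresql://root:kakoukakou@localhost:5432/efootprint"))
```

So the URL from .env.local connects as user root with password kakoukakou to the efootprint database on localhost:5432.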
Stop the database when not needed:

```shell
docker compose -f docker-compose.infra.yml down
```

View logs:

```shell
docker compose -f docker-compose.infra.yml logs -f
```

For a complete production-like environment with all services running in Docker (Django, PostgreSQL, Traefik), see the Docker Setup Guide.
Note: the UI loads the result charts from a bundled file at theme/static/bundles/result_charts.js. When using the full Docker setup with the /app bind-mount, generate this bundle on your host before starting the stack:
```shell
npm install
npm run build:result-charts:dev
```

This approach is ideal for:
- Testing in a production-like environment with HTTPS
- Consistent environment across different machines
- Quick setup without local Python (Node is still needed to build the result charts bundle when using the bind-mount)
- Integration testing with reverse proxy (Traefik)
Access after setup: https://efootprint.boavizta.dev
For the fastest front-end test execution during development, use Option 1 (Full Local Development) with SQLite: it eliminates Docker overhead and gives the quickest test runs. Then, before deploying to production, run the tests in Option 3 (Full Docker Environment) to verify compatibility with PostgreSQL.
To switch to full local mode for testing:

- Stop any running Docker containers: docker compose -f docker-compose.infra.yml down
- Remove or rename .env.local if it exists
- Run migrations: poetry run python manage.py migrate && poetry run python manage.py createcachetable
- Run tests as shown below

Run the Django unit tests:

```shell
poetry run python manage.py test
```

Playwright tests use Python and integrate with pytest. They can share fixtures with unit tests and create test models programmatically.
First-time setup (after poetry install):

```shell
# Install Playwright browsers
poetry run playwright install chromium
```

Running E2E tests requires the Django server to be running:

```shell
# Terminal 1: Start the Django server
poetry run python manage.py runserver

# Terminal 2: Run E2E tests
poetry run pytest tests/e2e/
```

Run tests with a visible browser (headed mode):

```shell
poetry run pytest tests/e2e/ --headed
```

Run a specific test with the debugger:

```shell
poetry run pytest tests/e2e/test_forms.py::test_unsaved_changes -s --pdb
```

Run tests in parallel:

```shell
# Run E2E tests in parallel (4 workers)
poetry run pytest tests/e2e/ -n 4 --base-url http://localhost:8000
```

Generate test code (record mode):

```shell
poetry run playwright codegen http://localhost:8000/model_builder/
```

Run with different browsers:

```shell
poetry run pytest tests/e2e/ --browser firefox
poetry run pytest tests/e2e/ --browser webkit
```

Override the server URL (e.g., for Docker):
```shell
poetry run pytest tests/e2e/ --headed --base-url https://efootprint.boavizta.dev
```

For JavaScript unit tests, install and run Jest:

```shell
npm install jest --global
jest
```

You can easily switch between the three setup options:
To switch from SQLite to PostgreSQL:

- Start the PostgreSQL container:

  ```shell
  docker compose -f docker-compose.infra.yml up -d
  ```

- Create .env.local:

  ```shell
  cat > .env.local << EOF
  DJANGO_DOCKER=False
  DATABASE_URL=postgresql://root:kakoukakou@localhost:5432/efootprint
  EOF
  ```

- Run migrations:

  ```shell
  poetry run python manage.py migrate
  poetry run python manage.py createcachetable
  ```

To switch back to SQLite:

- Stop the PostgreSQL container:

  ```shell
  docker compose -f docker-compose.infra.yml down
  ```

- Remove .env.local:

  ```shell
  rm .env.local
  ```

- SQLite will be used automatically (no additional configuration needed)
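The switching steps work because the Django settings pick the backend from the environment. A hypothetical sketch of that selection logic (not necessarily how this project's settings.py implements it):

```python
import os


def active_database(env=None):
    """Return which backend the setup steps above would select.

    Hypothetical logic mirroring the switching steps: PostgreSQL when
    DATABASE_URL is set (via .env.local), SQLite otherwise.
    """
    env = os.environ if env is None else env
    return "postgresql" if env.get("DATABASE_URL") else "sqlite"
```

This is why removing .env.local is enough to fall back to SQLite: with no DATABASE_URL in the environment, nothing points Django at PostgreSQL.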
See the Docker Setup Guide for complete instructions.