Open city profile is used to store common information (name, contact information, ...) about the citizens of the city of Helsinki.
When a citizen is using a service which is connected to the profile, the service can query for the citizen's information from the profile so that the citizen doesn't have to enter all of their data every time it is needed. The services may also provide a better user experience using the profile's data, for example by returning more relevant search results based on the citizen's interests.
The same data may also be queried by the employees of the city of Helsinki while performing their daily duties, for example using the administrative functions of services.
Open city profile is implemented using Django and provides a GraphQL API.
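As a sketch of what "querying the profile" looks like in practice, here is a minimal Python client calling the GraphQL endpoint. The query and field names (`myProfile`, `firstName`, `lastName`) are illustrative assumptions, not the authoritative schema; use GraphiQL or an introspection query against `/graphql/` to discover the real one.

```python
import requests  # third-party HTTP client: pip install requests

# Local development endpoint; real services would use one of the
# environment URLs listed at the end of this README.
GRAPHQL_URL = "http://localhost:8080/graphql/"

# Placeholder: a real caller first obtains a token from the
# authentication service (Keycloak).
ACCESS_TOKEN = "<access token>"

# Hypothetical query -- check the actual schema for real field names.
QUERY = """
query {
  myProfile {
    firstName
    lastName
  }
}
"""

response = requests.post(
    GRAPHQL_URL,
    json={"query": QUERY},
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=10,
)
response.raise_for_status()
print(response.json())
```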
## Configuration

See docs/config.adoc.
## Development with Docker
Prerequisites:

- Docker engine: 18.06.0+
- Docker Compose: 1.22.0+
1. Create a `compose.env` file in the project folder (an illustrative sketch follows this list):
   - Use `compose.env.example` as a base; it does not need any changes for getting the project running.
   - Change `DEBUG` and the rest of the Django settings if needed.
     - `TOKEN_AUTH_*`, settings for the authentication service
   - Set entrypoint/startup variables according to taste:
     - `CREATE_SUPERUSER`, creates a superuser with credentials `admin:admin` (`admin@example.com`)
     - `APPLY_MIGRATIONS`, applies migrations on startup
     - `ENABLE_GRAPHIQL`, enables the GraphiQL interface for `/graphql/`
     - `ENABLE_GRAPHQL_INTROSPECTION`, enables GraphQL introspection queries
     - `SEED_DEVELOPMENT_DATA`, flushes data and recreates the environment with fake development data (requires `APPLY_MIGRATIONS`)
     - `KEYCLOAK_BASE_URL`, the base URL of the Keycloak server, including any configured context path
     - `KEYCLOAK_REALM`, the name of the Keycloak realm to use
     - `KEYCLOAK_CLIENT_ID`, authentication to the Keycloak instance happens using a service account; this is the client ID
     - `KEYCLOAK_CLIENT_SECRET`, ...and this is the client secret
     - `KEYCLOAK_GDPR_CLIENT_ID`, client ID to use in the authorization code flow for GDPR calls
     - `KEYCLOAK_GDPR_CLIENT_SECRET`, client secret to use in the authorization code flow for GDPR calls
     - `GDPR_AUTH_CALLBACK_URL`, the GDPR auth callback URL; this should be the same URL the UI uses for fetching the OAuth/OIDC authorization token for using the GDPR API
2. Run `docker compose up`
   - The project is now running at localhost:8080
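For reference, a `compose.env` along these lines covers the variables above. This is an illustrative sketch only: all values here are made up, and `compose.env.example` remains the authoritative starting point.

```
DEBUG=1
CREATE_SUPERUSER=1
APPLY_MIGRATIONS=1
ENABLE_GRAPHIQL=1
ENABLE_GRAPHQL_INTROSPECTION=1
SEED_DEVELOPMENT_DATA=0
# Keycloak values depend entirely on your Keycloak installation.
KEYCLOAK_BASE_URL=https://keycloak.example.com/auth
KEYCLOAK_REALM=example-realm
KEYCLOAK_CLIENT_ID=example-client
KEYCLOAK_CLIENT_SECRET=<client secret>
KEYCLOAK_GDPR_CLIENT_ID=example-gdpr-client
KEYCLOAK_GDPR_CLIENT_SECRET=<client secret>
GDPR_AUTH_CALLBACK_URL=http://localhost:3000/gdpr-callback
```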
### Optional steps
1. Run migrations:
   - Taken care of by the example env
   - `docker exec profile-backend python manage.py migrate`
2. Seed development data:
   - Taken care of by the example env
   - See also Seed development data below
   - `docker exec profile-backend python manage.py seed_development_data`
3. Create a superuser:
   - Taken care of by the example env
   - `docker exec -it profile-backend python manage.py createsuperuser`
4. Set permissions for service staff members if needed (see the sketch after this list):
   - Create group(s) (via Django admin) and add user(s) to the group
   - Create service permissions for the group manually via Django admin, or for example:
     - `docker exec profile-backend python manage.py add_object_permission ServiceName GroupName can_view_profiles`, where:
       - `ServiceName` is the name of the Service the permission is given for
       - `GroupName` is the name of the group to whom the permission is given
       - `can_view_profiles` is the name of the permission
   - Permissions can be removed as follows:
     - `docker exec profile-backend python manage.py remove_object_permission ServiceName GroupName can_view_profiles`
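The two permission commands above grant and revoke an object-level permission for a group on a single Service. As a rough sketch of what that amounts to, assuming django-guardian-style object permissions (an assumption about the implementation; the management commands are the supported interface, and the `Service` import path below is hypothetical):

```python
# Sketch only: assumes django-guardian-style object permissions.
from django.contrib.auth.models import Group
from guardian.shortcuts import assign_perm, remove_perm

from services.models import Service  # hypothetical import path

service = Service.objects.get(name="ServiceName")
group = Group.objects.get(name="GroupName")

# Roughly what add_object_permission does:
assign_perm("can_view_profiles", group, service)

# Roughly what remove_object_permission does:
remove_perm("can_view_profiles", group, service)
```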
### Seed development data
Note! This command will flush the database.

- Add all data with defaults: `docker exec profile-backend python manage.py seed_development_data`
- See `python manage.py help seed_development_data` for optional arguments
- The command will generate:
  - All available services
  - One group per service (with `can_manage_profiles` permissions)
  - One user per group (with username `{group.name}_user`)
  - Profiles
    - With a user
    - With an email, a phone number and an address
    - Each connected to one random service
## Development without Docker

Prerequisites:
- PostgreSQL 17
- Python 3.12
- Run `pip install -r requirements.txt`
- Run `pip install -r requirements-dev.txt` (development requirements)
To set up a database compatible with the default database settings:

Create a user and a database:

```bash
sudo -u postgres createuser -P -R -S open_city_profile  # use password `open_city_profile`
sudo -u postgres createdb -O open_city_profile open_city_profile
```

Allow the user to create test databases:

```bash
sudo -u postgres psql -c "ALTER USER open_city_profile CREATEDB;"
```
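For reference, the user and database created above match Django database settings along these lines. This is a sketch assuming the usual Django PostgreSQL configuration; the project's actual settings module is authoritative.

```python
# Sketch of the Django settings the created database is compatible with.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "open_city_profile",
        "USER": "open_city_profile",
        "PASSWORD": "open_city_profile",
        "HOST": "localhost",
        "PORT": "5432",
    }
}
```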
- Create a `.env` file: `touch .env`
- Set the `DEBUG` environment variable to `1`.
- Run `python manage.py migrate`
- Run `python manage.py createsuperuser`
- Run `python manage.py runserver 0:8000`

The project is now running at localhost:8000
## Keeping Python requirements up to date

This repository contains `requirements*.in` and corresponding `requirements*.txt` files for requirements handling. The `requirements*.txt` files are generated from the `requirements*.in` files with `pip-compile`.
- Add new packages to `requirements.in` or `requirements-dev.in`
- Update the `.txt` file for the changed requirements file:
  - `pip-compile requirements.in`
  - `pip-compile requirements-dev.in`
  - Note: the `requirements*.txt` files added to version control are meant to be used in the containerized environment where the service is run. Because Python package dependencies are environment dependent, they need to be generated within a similar environment. This can be done by running the `pip-compile` command within Docker, for example like this: `docker compose exec django pip-compile requirements.in` (the container needs to be running beforehand).
- If you want to update dependencies to their newest versions, run: `pip-compile --upgrade requirements.in`
- To install Python requirements, run: `pip-sync requirements.txt`
Note: when updating dependencies, read the dependency update checklist to see if there's anything you need to pay attention to.
## Code format

This project uses [Ruff](https://docs.astral.sh/ruff/) for code formatting and quality checking.

Basic Ruff commands:

- lint: `ruff check`
- apply safe lint fixes: `ruff check --fix`
- check formatting: `ruff format --check`
- format: `ruff format`
[pre-commit](https://pre-commit.com/) can be used to install and run all the formatting tools as git hooks automatically before a commit.
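If you are setting pre-commit up from scratch, a `.pre-commit-config.yaml` along these lines would wire in the Ruff tools. This is a sketch using the official Ruff hooks; if the repository already ships a hook configuration, that one is authoritative.

```yaml
# Sketch of a pre-commit configuration using the official Ruff hooks.
repos:
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.6.9  # pin to whatever version the repository uses
    hooks:
      - id: ruff         # lint (ruff check)
      - id: ruff-format  # format (ruff format)
```

After adding the configuration, install the hooks with `pre-commit install`.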
## Commit message format

New commit messages must adhere to the [Conventional Commits](https://www.conventionalcommits.org/) specification, and line length is limited to 72 characters.

When pre-commit is in use, [commitlint](https://github.com/conventional-changelog/commitlint) checks new commit messages for the correct format.
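For example, a message like the following satisfies both the Conventional Commits format and the 72-character line limit (an illustrative message, not one from the project history):

```
feat(profile): validate phone numbers on save

Reject phone numbers that are not in an accepted format so that
invalid contact data never enters the profile database.
```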
## Running tests

The tests require a PostgreSQL database to connect to. Here's one way to run them:

- Bring the service up with `docker compose up`. This also brings up the required PostgreSQL server.
- Run the tests within the Django container: `docker compose exec django pytest`.
## Environments

GraphQL API endpoints per environment:

- Dev: https://profile-api.dev.hel.ninja/graphql/
- Test: https://profile-api.test.hel.ninja/graphql/
- Staging: https://api.hel.fi/profiili-stage/graphql/
- Production: https://api.hel.fi/profiili/graphql/
For a complete service, the following additional components are also required:

- Keycloak is used as the authentication service
- open-city-profile-ui provides the UI