This is the new Training Metrics Database of the ELIXIR Training Platform.
The Training Metrics Database was developed to streamline data collection, storage, and visualisation for the Quality and Impact Subtask of the ELIXIR Training Platform. The subtask aims to collect information about training events offered by ELIXIR Nodes (ideally all captured in TeSS) in order to:
- Describe the audience demographics,
- Assess the quality of events directly after they have taken place, and
- Evaluate the longer-term impact that these events have had on the work of training participants.
To achieve the above aims, the subtask, in collaboration with the ELIXIR Training Coordinators, has compiled a set of core metrics for measuring audience demographics and training quality in the short term, and training impact in the longer term. Both sets of metrics are collected via feedback surveys; in some cases, the demographic information is collected via a registration form instead. These metrics were developed from those already collected by ELIXIR training providers, as well as from discussions with stakeholders and external training providers.
The most up-to-date documentation is available in the Wiki.
Before running the application, make sure to create instances of the following environment files:
- env/django.env
- env/metabase.env
- env/postgres.env
The necessary parameters are covered in the respective env/*.default files.
For development, make sure to set DJANGO_PRODUCTION=0 in env/django.env.
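A minimal sketch of that setup, assuming the defaults ship alongside the target filenames (the exact `*.default` filenames are an assumption based on the env/*.default convention above):

```sh
# Create editable instances from the shipped defaults; the *.default names
# below are assumptions based on the env/*.default convention mentioned above.
cp env/django.env.default   env/django.env
cp env/metabase.env.default env/metabase.env
cp env/postgres.env.default env/postgres.env

# Edit env/django.env afterwards and confirm development mode is enabled:
grep DJANGO_PRODUCTION env/django.env   # should print DJANGO_PRODUCTION=0
```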
docker compose -f docker-compose.yml -f docker-compose.dev.yml up --build

Seed the database with test data:
docker compose exec tmd-dj python manage.py load_data
# NOTE: Currently the load_data command loads the metric data into a legacy model that
# is no longer exposed by default in the UI. To get the data into the right place,
# run the following command to migrate the metrics data:
docker compose exec tmd-dj ./manage.py migrate_metrics
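To check that the seed data landed where the UI expects it, a Django shell one-liner can help; the app label and model name below ("metrics", "Event") are hypothetical placeholders rather than this project's real names:

```sh
# Count rows in one of the metrics models to verify the seeding step; replace
# 'metrics'/'Event' with the project's actual app label and model name.
docker compose exec tmd-dj python manage.py shell -c \
  "from django.apps import apps; print(apps.get_model('metrics', 'Event').objects.count())"
```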
A superuser is helpful if you want to manage your data and try adding other users. A superuser can be created as follows:

docker compose exec tmd-dj python manage.py createsuperuser
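If you prefer a non-interactive setup, Django (3.0+) can read the superuser credentials from the environment; the username, email, and password below are placeholder values:

```sh
# Non-interactive superuser creation; credentials shown are placeholders.
docker compose exec \
  -e DJANGO_SUPERUSER_PASSWORD='change-me' \
  tmd-dj python manage.py createsuperuser --noinput \
  --username admin --email admin@example.com
```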
To run the stack in production mode:

docker compose up --build

Seed the database with production data:

docker compose up --build
# We do it this way in order to avoid having the seed data mounted in the production environment
docker compose run --volume "/$(pwd)/raw-tmd-data:/opt/tmd/app/raw-tmd-data:ro" --entrypoint "python manage.py load_data" tmd-dj

Before committing changes, run the pre-commit checks:

pre-commit run -a
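If the hooks are not installed yet, the usual pre-commit bootstrap looks like this (assuming the repository ships a .pre-commit-config.yaml):

```sh
# One-time setup of the pre-commit hooks (standard pre-commit workflow).
pip install pre-commit    # or: pipx install pre-commit
pre-commit install        # register the git hook so the checks run on every commit
```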