Commit 23f2060

Merge pull request #16 from SeQuenC-Consortium/release/27.08.23 (Release/27.08.23)

2 parents 8f4da89 + 4e113ae

87 files changed, +2030 −1247 lines changed

.github/workflows/run-pytests.yml

Lines changed: 1 addition & 11 deletions
@@ -1,6 +1,6 @@
 name: Runs python tests
 
-on: [push]
+on: [ push ]
 
 jobs:
   run-unittests:
@@ -20,15 +20,5 @@ jobs:
           poetry-version: 1.4.0
       - name: Install requirements with poetry
        run: python -m poetry export --without-hashes --format=requirements.txt -o requirements.txt && python -m pip install -r requirements.txt
-      - name: Setup redis
-        uses: supercharge/[email protected]
-        with:
-          redis-version: 6
-      - name: Install redis cli # so we can test the server
-        run: sudo apt-get install -y redis-tools
-      - name: Verify that redis is up
-        run: redis-cli ping
-      - name: Install pytest-mock
-        run: python -m pip install pytest-mock
       - name: Run unit tests
        run: python -m pytest tests/automated_tests
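With the Redis service removed from the workflow, the unit tests in `tests/automated_tests` have to run without a live broker. Purely as an illustration of how pytest-mock can stand in for a real Redis connection in such tests (the helper function below is invented for this sketch and is not part of the repository):

```python
# Minimal pytest-mock illustration; the helper below is invented for this
# sketch and does not refer to code in this repository.
import redis


def ping_broker(url: str) -> bool:
    """Tiny helper used only for this example."""
    client = redis.Redis.from_url(url)
    return client.ping()


def test_ping_broker_without_real_redis(mocker):
    # Patch the Redis client factory so no server needs to be running,
    # mirroring how the CI job can now run without the redis service.
    fake_client = mocker.Mock()
    fake_client.ping.return_value = True
    mocker.patch("redis.Redis.from_url", return_value=fake_client)

    assert ping_broker("redis://localhost:6379") is True
```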

Dockerfile

Lines changed: 3 additions & 8 deletions
@@ -1,4 +1,4 @@
-FROM python:3.9
+FROM python:3.11
 
 # install git and remove caches again in same layer
 ARG DEBIAN_FRONTEND=noninteractive
@@ -23,12 +23,7 @@ ENV CELERY_WORKER_POOL=threads
 RUN mkdir --parents /app/instance \
     && chown --recursive gunicorn /app && chmod --recursive u+rw /app/instance
 
-# Wait for database
-ADD https://github.com/ufoscout/docker-compose-wait/releases/download/2.9.0/wait /wait
-RUN chmod +x /wait
-
-
-RUN python -m pip install poetry gunicorn invoke
+RUN python -m pip install poetry
 
 COPY --chown=gunicorn . /app
 
@@ -40,4 +35,4 @@ EXPOSE 5005
 
 USER gunicorn
 
-ENTRYPOINT ["python","-m", "invoke", "start-docker"]
+ENTRYPOINT ["python","-m", "invoke", "start-docker"]

README.md

Lines changed: 14 additions & 0 deletions
@@ -66,6 +66,8 @@ celery worker and execute all tasks synchronously.
 * **PUT /devices/{device_id}/status** *(To check if a device is running)*
 * **PUT /devices/{device_id}/calibration** *(To get some device properties)*
 
+### Run manually
+
 Run the development server with
 
 ```bash
@@ -97,6 +99,18 @@ Trying out the tests -> See tests/README.md
 poetry run pytest .
 ```
 
+### Run using docker-compose
+
+Execute the following commands and the deployment will be started using docker-compose. This builds the docker image
+containing the application and creates all required containers, including the database and the message queue.
+
+```bash
+docker-compose up -d
+docker-compose exec server python -m flask create-and-load-db
+```
+
+![Architecture](docker-compose-architecture.svg)
+
 ### Trying out the Template
 
 For a list of all dependencies with their license open <http://localhost:5005/licenses/>.
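After `docker-compose up -d` and the database initialisation command from the new README section, the server should answer on the published port. A small optional smoke check (port 8080 is taken from the docker-compose.yaml added below; that the root path returns a response is an assumption, not documented behaviour):

```python
# Optional smoke test after "docker-compose up -d"; port 8080 matches the
# compose file in this commit, the root path being served is an assumption.
import urllib.request

try:
    with urllib.request.urlopen("http://localhost:8080/", timeout=5) as resp:
        print("server reachable, HTTP status:", resp.status)
except OSError as exc:
    print("server not reachable yet:", exc)
```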

docker-compose-architecture.svg

Lines changed: 4 additions & 0 deletions

docker-compose.yaml

Lines changed: 63 additions & 0 deletions
@@ -0,0 +1,63 @@
+version: "3"
+services:
+  server:
+    build: .
+    image: qunicorn:local
+    networks:
+      - qunicorn
+    environment:
+      CONTAINER_MODE: server
+      SERVER_PORT: 8080
+      BROKER_URL: "redis://broker:6379"
+      DB_URL: "postgresql://postgres:passwd@postgres/qunicorn"
+    depends_on:
+      - postgres
+      - broker
+    ports:
+      - "8080:8080"
+    labels:
+      kompose.service.expose: "true"
+      kompose.service.type: "loadbalancer"
+
+  worker:
+    build: .
+    image: qunicorn:local
+    networks:
+      - qunicorn
+    environment:
+      CONTAINER_MODE: worker
+      BROKER_URL: "redis://broker:6379"
+      DB_URL: "postgresql://postgres:passwd@postgres/qunicorn"
+    depends_on:
+      - postgres
+      - broker
+    ports:
+      - "6379"
+
+  broker:
+    image: redis:7.0.12
+    networks:
+      - qunicorn
+    ports:
+      - "6379:6379"
+
+  postgres:
+    image: postgres:15.3
+    networks:
+      - qunicorn
+    environment:
+      PGDATA: /var/lib/postgresql/data/pgdata
+      POSTGRES_PASSWORD: passwd
+      POSTGRES_DB: qunicorn
+    volumes:
+      - pgdata:/var/lib/postgresql/data \
+    ports:
+      - "5432:5432"
+
+
+networks:
+  qunicorn:
+
+volumes:
+  pgdata:

docs/adr/0004-mapper.md

Lines changed: 53 additions & 0 deletions
@@ -0,0 +1,53 @@
+# How to Map DTOs to Database Objects and Vice Versa
+
+* Status: accepted
+
+## Context and Problem Statement
+
+We need multiple data transfer objects (DTOs) for different purposes:
+
+* One for the requests
+* One for the internal logic
+* One for the database
+* One for the responses
+
+Normally they need to be mapped from a request to the core and then back to a response DTO.
+To save updates to the database, they need to be mapped to the database object.
+To map all of these, we need a mapper.
+
+## Decision Drivers <!-- optional -->
+
+* Have all mappers in one place
+* Clean and readable code
+* Automatically map objects
+
+## Considered Options
+
+* object-mapper
+* py-automapper
+* map-struct
+
+## Decision Outcome
+
+py-automapper
+
+## Description of Changes
+
+The py-automapper library was added and is used in the "core/mapper/" folder.
+
+### Folder Structure
+
+* core/mapper/
+  * \__init\__.py: Contains all imports for the mapper
+  * general_mapper.py: Contains a helper method which can be used to map objects with automapper
+  * mapper.py: Multiple mappers, one for each model
+
+### Naming Pattern
+
+The naming pattern for the mapper methods is as follows:
+If a dataclass object needs to be mapped to a DTO, the name is dataclass_to_dto.
+If multiple DTOs exist for one dataclass model, the name no longer includes "dto".
+For example, for the job_request_dto the mapper name is: dataclass_to_request.
+
+
+<!-- markdownlint-disable-file MD013 -->
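To make the naming pattern concrete, here is a minimal sketch of a mapper built on py-automapper. The dataclasses and the `job_core_dto_to_response` function are invented for illustration and mirror the pattern described in the ADR rather than the project's actual models:

```python
# Illustrative mapper in the style described by the ADR; these classes are
# invented examples, not the repository's real job models.
from dataclasses import dataclass

from automapper import mapper  # py-automapper


@dataclass
class JobCoreDto:
    id: int
    name: str
    state: str


@dataclass
class JobResponseDto:
    id: int
    name: str
    state: str


def job_core_dto_to_response(job: JobCoreDto) -> JobResponseDto:
    """Follows the <source>_to_<target> naming pattern from the ADR."""
    return mapper.to(JobResponseDto).map(job)


if __name__ == "__main__":
    print(job_core_dto_to_response(JobCoreDto(id=1, name="bell-state", state="READY")))
```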
