
Commit bdeb0dc

Add task 2 and 3 to chapter 7
1 parent 63b68d1 commit bdeb0dc

7 files changed (+379 -30 lines changed)

chapter_7/README.md

Lines changed: 161 additions & 24 deletions
# Chapter 7 - Train Logistics™

The dreaded day has arrived. Your manager walked into the office landscape today looking lost and confused (he sits on the 14th floor, of course) and pulled you aside to introduce a brand-new application, requested by the conductors, that will solve all their problems. He went on about how important it would be for the company, how it would be a game-changer for the industry, and how fortunate you were to have been hand-picked to implement this project. When you asked about the increased responsibility and the effect on your salary, he merely laughed and went back to his office on the 14th floor.

While you don't get a salary increase, you have a certain professional integrity that you want to uphold, not to mention an overshadowing fear of the conductors. But you have prepared for this moment and decide to implement some basics and immediately get an integration test running!

## Setup

Same as always, create a new virtual environment for this chapter. Ensure the new environment is activated in your active terminal.

Install dependencies and the application itself.

```
pip install ".[dev]"
```

## Task 1: The Train Logistics™ application

The application is another API which manages the logistics-related operations of the trains. How much food is available on the train, for instance? Train Logistics™ will be an amazing application which monitors all of this.

Before writing a single line of test code, take a moment to understand what you are dealing with. The Train Logistics API lives in its own remote repository and is published as a Docker image at `ghcr.io/equinor/train-logistics`, exactly as the Tickets API was in chapter 6.

Two files have been pre-provided for you this chapter. Read through both before continuing:

- [custom_containers/azurite.py](integration_tests_ch7/custom_containers/azurite.py) — helper functions for spinning up an Azurite container and seeding blob storage.
- [custom_containers/train_logistics.py](integration_tests_ch7/custom_containers/train_logistics.py) — the `TrainLogisticsAPI` wrapper class and factory/wait functions for the container.

The Train Logistics API is not a standalone service — it depends on both a postgres database **and** an Azure Blob Storage account. The conductors insisted on JSON files in blob storage. We already told you this was insane.

Your task for now is simply to understand what the application needs and how the two pre-provided files address those needs. Answer the following questions before moving on:

- What environment variables does the Train Logistics container require?
- What port does it expose?
- Why are there two different connection strings built in `azurite.py` — a docker one and a host one? When is each used?
- What does `ensure_blob_containers()` do, and why must it be called before the container starts?

## Task 2: The storage solution

While your manager said "brand-new", it turns out you need to work with some legacy systems. The current system uses JSON files, stored in an Azure storage account, to monitor the amount of available resources on the train. The conductors have been very explicit that they like this approach; it's so easy to just edit the files. Why would you do anything else?

While clearly insane, you conclude that you need to support this JSON data input style for now. The application will have to talk to Azure storage blobs. How can you keep going with the integration tests while having to interact with a Microsoft service?

Fortunately, Microsoft has provided us with a gift: [Azurite](https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azurite), a local emulator for Azure Blob Storage. It runs as a Docker container, which is exactly what we need.

The Azurite helper functions are already prepared for you in [custom_containers/azurite.py](integration_tests_ch7/custom_containers/azurite.py). Take a moment to read through the file and understand the three helpers:

- `create_azurite_container()` — spins up the Azurite image bound on `0.0.0.0` so Docker can expose its ports to the host. It accepts a `name` which doubles as the network alias.
- `azurite_connection_string_for_containers()` — builds the connection string. You will call this twice: once with the container alias (for container-to-container traffic on the Docker network) and once with `localhost` (for the host machine to seed data into the storage from within the fixture).
- `ensure_blob_containers()` — uses the host-side connection string to pre-create the blob containers before the Train Logistics API starts looking for them.

Your task is to implement a `train_logistics_storage` fixture in [conftest.py](integration_tests_ch7/conftest.py). The skeleton below shows where to start.

```python
# conftest.py
@pytest.fixture
def train_logistics_storage(network: Network) -> Generator[TrainLogisticsStorage, None, None]:
    azurite_container_name = "train-logistics"
    AZURITE_ACCOUNT: str = "devstoreaccount1"
    AZURITE_KEY: str = "Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="

    with create_azurite_container(network=network, name=azurite_container_name) as container:
        # 1. Wait for the port mapping to be available (port 10000)
        # 2. Build docker_connection_string using azurite_container_name as host
        # 3. Build host_connection_string using "localhost" and the exposed port
        # 4. Call ensure_blob_containers with the host connection string
        # 5. Yield a TrainLogisticsStorage with an AzuriteStorageContainer inside
        ...
```

> **Note on the Azurite credentials:** the account name and key above are the publicly documented defaults shipped with every Azurite installation. They are not secrets — you will find them in the official Microsoft documentation and in every tutorial on the internet. Do not lose sleep over them.

### The most important thing you will read in this chapter

Keep `yield` **inside** the `with` block. If you place it outside, the context manager exits before your test runs, the Azurite container is destroyed, the Train Logistics API cannot connect to storage, and you will spend a very pleasant afternoon watching a retry loop print the same warning over and over again. You have been warned.
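To see why, here is a minimal, self-contained illustration of the pattern. Plain generators stand in for pytest fixtures, and `FakeContainer` (a made-up name for this demo) stands in for the Azurite container context manager:

```python
events: list[str] = []

class FakeContainer:
    """Stands in for the Azurite container context manager."""
    def __enter__(self) -> str:
        events.append("start")
        return "connection-string"

    def __exit__(self, *exc) -> None:
        events.append("stop")

def good_fixture():
    with FakeContainer() as conn:
        yield conn  # container still alive while the test body runs

def bad_fixture():
    with FakeContainer() as conn:
        pass
    yield conn  # with block already exited: container is gone

g = good_fixture()
next(g)                              # fixture setup; the test would run now
assert events == ["start"]           # container is up during the test
g.close()                            # fixture teardown
assert events == ["start", "stop"]

events.clear()
b = bad_fixture()
next(b)                              # by the time the test gets the value...
assert events == ["start", "stop"]   # ...the container has already stopped
b.close()
```

pytest drives yield fixtures exactly like this: it advances the generator to the `yield`, runs the test, then resumes (closes) the generator for teardown. With the `yield` outside the `with`, teardown has already happened before the test starts.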

## Task 3: Proper integration testing

With Azurite in place you now have all four pillars ready to assemble into a proper multi-container integration test.

| Container             | Purpose                                        |
|-----------------------|------------------------------------------------|
| `postgres`            | Shared database for both APIs                  |
| `tickets_api`         | The original ticketing application             |
| `train_logistics_api` | The new logistics application                  |
| `azurite`             | Azure Blob Storage emulator for JSON manifests |

![img.png](img.png)
### Task 3a: The Train Logistics™ fixture

The Train Logistics application lives in a remote repository and is published as a Docker image at `ghcr.io/equinor/train-logistics:latest`, exactly like the Tickets API in chapter 6.

The helper functions `create_train_logistics_api_container()` and `wait_for_train_logistics_api_to_be_ready()` are already implemented for you in [custom_containers/train_logistics.py](integration_tests_ch7/custom_containers/train_logistics.py). Read through them.

The container exposes port `3001` and requires two environment variables which the factory function already wires up for you:

- `TRAIN_LOGISTICS_DATABASE_URL` — the postgres connection string using the `postgres` network alias.
- `AZURE_STORAGE_CONNECTION_STRING` — the Azurite connection string, **docker-side** (using the container alias, not `localhost`).

Your task is to add the `train_logistics_api` fixture in [conftest.py](integration_tests_ch7/conftest.py):

```python
# conftest.py
@pytest.fixture
def train_logistics_api(
    network: Network,
    postgres_database: PostgresDatabase,
    train_logistics_storage: TrainLogisticsStorage,
) -> Generator[TrainLogisticsAPI]:
    # 1. Get the docker-side connection string for Azurite from train_logistics_storage
    # 2. Start the container using create_train_logistics_api_container()
    # 3. Wait for port 3001 mapping, build backend_url
    # 4. Wait for the API to be ready using wait_for_train_logistics_api_to_be_ready()
    # 5. Yield a TrainLogisticsAPI object
    ...
```

The `TrainLogisticsAPI` wrapper class (already defined in `train_logistics.py`) expects: `container`, `backend_url`, `name`, `port`, and `alias`.

### Task 3b: Write the integration tests

Two tests need to be added to [test_integration.py](integration_tests_ch7/test_integration.py).

**`test_buy_ticket`** — a familiar face from chapter 5. Make a `POST` request to `/tickets/buy` on the `tickets_api` container and verify the response body contains the expected fields.

```python
@pytest.mark.parametrize(
    "train_code,passenger_name,seat_number",
    [
        ("The Orient Express", "Leonardo DaVinci", 14),
        ("Bergensbanen", "Jonas Gahr Støre", 1),
        ("Raumabanen", "Kong Harald", None),
    ],
)
def test_buy_ticket(
    tickets_api: TicketsAPI,
    train_code: str,
    passenger_name: str,
    seat_number: int | None,
) -> None:
    ...
```

**`test_check_stock`** — the first test written against the Train Logistics™ API. Make a `POST` request to `/logistics/check-stock` with a `train_code` and `product` and assert the response contains the correct stock level.
This endpoint returns `200 OK` — it queries stock, it does not create anything. Do not assert `201 CREATED` unless you enjoy failing tests.

```python
@pytest.mark.parametrize(
    "train_code,product,expected_in_stock",
    [
        ("The Orient Express", "banana", 10),
        ("Bergensbanen", "apple", 5),
        ("Raumabanen", "orange", 0),
    ],
)
def test_check_stock(
    train_logistics_api: TrainLogisticsAPI,
    train_code: str,
    product: str,
    expected_in_stock: int,
) -> None:
    ...
```

Run all tests with:

```bash
pytest -s .
```

If all four containers start, Azurite seeds correctly, both APIs accept requests, and all assertions pass — congratulations. You have just run a four-container integration test from a single `pytest` command. No manual setup, no shared cloud bill, no angry conductor breathing down your neck.

chapter_7/integration_tests_ch7/conftest.py

Lines changed: 34 additions & 0 deletions
```python
# @@ -7,6 +7,18 @@
import pytest
from testcontainers.core.network import Network

from chapter_7.integration_tests_ch7.custom_containers.azurite import (
    AzuriteStorageContainer,
    TrainLogisticsStorage,
    azurite_connection_string_for_containers,
    create_azurite_container,
    ensure_blob_containers,
)
from chapter_7.integration_tests_ch7.custom_containers.train_logistics import (
    TrainLogisticsAPI,
    create_train_logistics_api_container,
    wait_for_train_logistics_api_to_be_ready,
)
from integration_tests_ch7.custom_containers.postgres import (
    PostgresDatabase,
    create_postgres_container,
```

```python
# @@ -17,6 +29,12 @@
    wait_for_tickets_api_to_be_ready,
)

AZURITE_ACCOUNT: str = "devstoreaccount1"
# Default Azurite development key — not a secret
AZURITE_KEY: str = (
    "Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="
)


@pytest.fixture
def network():
```

```python
# @@ -78,3 +96,19 @@ def wait_for_port_mapping_to_be_available
    raise ConnectionError(
        f"Port mapping for container {container.image} on port {port} not available within timeout"
    )


@pytest.fixture
def train_logistics_storage(
    network: Network,
) -> Generator[TrainLogisticsStorage, None, None]:
    raise NotImplementedError


@pytest.fixture
def train_logistics_api(
    network: Network,
    postgres_database: PostgresDatabase,
    train_logistics_storage: TrainLogisticsStorage,
) -> Generator[TrainLogisticsAPI]:
    raise NotImplementedError
```
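The fixture stubs above lean on the existing `wait_for_port_mapping_to_be_available` helper in `conftest.py`. As a rough sketch of what such a wait loop typically does (an illustration only, not the actual helper from the repository, which polls the Docker port mapping rather than a raw socket):

```python
import socket
import time

def wait_for_port(host: str, port: int, timeout: float = 30.0) -> None:
    """Poll until a TCP connection to host:port succeeds, or give up."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            # Someone is listening: the service's port is reachable.
            with socket.create_connection((host, port), timeout=1):
                return
        except OSError:
            # Not up yet (refused / unreachable): back off briefly and retry.
            time.sleep(0.2)
    raise ConnectionError(f"{host}:{port} not available within timeout")
```

The retry-until-deadline shape is the important part: containers take a variable amount of time to start, so fixtures must poll rather than assume readiness.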
chapter_7/integration_tests_ch7/custom_containers/azurite.py

Lines changed: 69 additions & 0 deletions

```python
from typing import Dict

from azure.core.exceptions import ResourceExistsError
from azure.storage.blob import BlobServiceClient
from docker.models.networks import Network

from chapter_7.integration_tests_ch7.custom_containers.log_docker_container import (
    LogDockerContainer,
)


class AzuriteStorageContainer:

    def __init__(
        self,
        alias: str,
        container: LogDockerContainer,
        docker_connection_string: str,
        host_connection_string: str,
    ):
        self.alias = alias
        self.container = container
        self.docker_connection_string = docker_connection_string
        self.host_connection_string = host_connection_string


class TrainLogisticsStorage:
    def __init__(self, azurite_containers: Dict[str, AzuriteStorageContainer]) -> None:
        self.azurite_containers: Dict[str, AzuriteStorageContainer] = azurite_containers


def create_azurite_container(
    network: Network, name: str = "azurite"
) -> LogDockerContainer:
    # Command binds services to 0.0.0.0 so Docker can map ports
    cmd: str = (
        "azurite --blobHost 0.0.0.0 --queueHost 0.0.0.0 --tableHost 0.0.0.0 --skipApiVersionCheck"
    )

    AZURITE_IMAGE: str = "mcr.microsoft.com/azure-storage/azurite:latest"

    container: LogDockerContainer = (
        LogDockerContainer(image=AZURITE_IMAGE, command=cmd)
        .with_name(name)
        .with_network(network)
        .with_network_aliases(name)
        .with_exposed_ports(10000)
    )
    return container


def azurite_connection_string_for_containers(
    azurite_account: str, azurite_key: str, azurite_alias: str, port: int
) -> str:
    return (
        "DefaultEndpointsProtocol=http;"
        f"AccountName={azurite_account};"
        f"AccountKey={azurite_key};"
        f"BlobEndpoint=http://{azurite_alias}:{port}/{azurite_account};"
    )


def ensure_blob_containers(connection_string: str, *names: str) -> None:
    svc: BlobServiceClient = BlobServiceClient.from_connection_string(connection_string)
    for name in names:
        try:
            svc.create_container(name)
        except ResourceExistsError:
            pass
```
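To make the docker-side versus host-side split concrete, here is what the helper produces for each audience. The function body below reproduces the format string from `azurite.py`; the account key is replaced with a `<key>` placeholder and the host port `54321` is an example mapped port:

```python
def azurite_connection_string_for_containers(
    azurite_account: str, azurite_key: str, azurite_alias: str, port: int
) -> str:
    # Same format string as the helper in azurite.py
    return (
        "DefaultEndpointsProtocol=http;"
        f"AccountName={azurite_account};"
        f"AccountKey={azurite_key};"
        f"BlobEndpoint=http://{azurite_alias}:{port}/{azurite_account};"
    )

# Docker-side: other containers reach Azurite via its network alias
docker_cs = azurite_connection_string_for_containers(
    "devstoreaccount1", "<key>", "azurite", 10000
)
assert "BlobEndpoint=http://azurite:10000/devstoreaccount1" in docker_cs

# Host-side: the fixture reaches Azurite via the mapped localhost port
host_cs = azurite_connection_string_for_containers(
    "devstoreaccount1", "<key>", "localhost", 54321
)
assert "BlobEndpoint=http://localhost:54321/devstoreaccount1" in host_cs
```

Only the endpoint host and port differ; the account and key are the same in both strings, which is why the helper takes them as parameters.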

chapter_7/integration_tests_ch7/custom_containers/log_docker_container.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -24,7 +24,7 @@ def __init__(
         self._stop_logs: Event = Event()
         self._logging_thread: Optional[Thread] = None

-    def start(self) -> LogDockerContainer:
+    def start(self) -> "LogDockerContainer":
         super().start()

         self._stop_logs.clear()
```
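The quotes matter because, on interpreters that evaluate annotations eagerly (CPython before 3.14, without `from __future__ import annotations`), the return annotation is evaluated while the class body is still executing, and at that point the class name is not bound yet. A small demonstration with stand-in class names (`Broken` and `Fixed` are made up for this demo):

```python
# Unquoted: the annotation expression is evaluated when the def statement
# runs, inside the class body, where the class name does not exist yet.
# On eager-annotation interpreters this raises NameError.
try:
    class Broken:
        def clone(self) -> Broken:  # may fail: name not bound yet
            return self
    eagerly_evaluated = False
except NameError:
    eagerly_evaluated = True

# Quoted: the annotation is stored as a plain string and resolved lazily
# by tools and type checkers, so class creation always succeeds.
class Fixed:
    def clone(self) -> "Fixed":
        return self

assert isinstance(Fixed().clone(), Fixed)
```

Adding `from __future__ import annotations` at the top of the module is the other common fix; it makes every annotation in the file lazy, so the quotes become unnecessary.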
