# Chapter 7 - Train Logistics™

The dreaded day has arrived. Your manager walked into the open office landscape today looking lost and confused (he sits on the 14th floor, of course) and pulled you aside to introduce a brand-new application requested by the conductors that will solve all their problems. He then went on about how important it would be for the company, how it would be a game-changer for the industry, and how fortunate you were to have been hand-picked to implement this project. When you asked about the increased responsibility and its effect on your salary, he merely laughed and went back to his office on the 14th floor.

While you don't get a salary increase, you have a certain professional integrity that you want to uphold, not to mention an overshadowing fear of the conductors. But you have prepared for this moment and decide to implement some basics and immediately get an integration test running!

## Setup

Same as always, create a new virtual environment for this chapter. Ensure the new environment is activated in your active terminal.

Install dependencies and the application itself.

```bash
pip install ".[dev]"
```

## Task 1: The Train Logistics™ application

The application is another API which manages the logistics-related operations of the trains. How much food is available on the train, for instance? Train Logistics™ will be an amazing application which monitors all of this.

Before writing a single line of test code, take a moment to understand what you are dealing with. The Train Logistics API lives in its own remote repository and is published as a Docker image at `ghcr.io/equinor/train-logistics`, exactly as the Tickets API was in chapter 6.

Two files have been pre-provided for you this chapter. Read through both before continuing:

- [custom_containers/azurite.py](integration_tests_ch7/custom_containers/azurite.py) — helper functions for spinning up an Azurite container and seeding blob storage.
- [custom_containers/train_logistics.py](integration_tests_ch7/custom_containers/train_logistics.py) — the `TrainLogisticsAPI` wrapper class and factory/wait functions for the container.

The Train Logistics API is not a standalone service — it depends on both a postgres database **and** an Azure Blob Storage account. The conductors insisted on JSON files in blob storage. We already told you this was insane.

Your task for now is simply to understand what the application needs and how the two pre-provided files address those needs. Answer the following questions before moving on:

- What environment variables does the Train Logistics container require?
- What port does it expose?
- Why are there two different connection strings built in `azurite.py` — a docker one and a host one? When is each used?
- What does `ensure_blob_containers()` do, and why must it be called before the container starts?

## Task 2: The storage solution

While your manager said "brand-new", it turns out you need to work with some legacy systems. The current system uses JSON files stored in an Azure storage account to monitor the amount of available resources on the train. The conductors have been very explicit in that they like this approach: it's so easy to just edit the files. Why would you do anything else?

While clearly insane, you conclude that you need to support this JSON data input style for now. The application will have to talk to Azure storage blobs. How can you keep going with the integration tests while having to interact with a Microsoft service?

Fortunately, Microsoft has provided us with a gift: [Azurite](https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azurite), a local emulator for Azure Blob Storage. It runs as a Docker container, which is exactly what we need.

The Azurite helper functions are already prepared for you in [custom_containers/azurite.py](integration_tests_ch7/custom_containers/azurite.py). Take a moment to read through the file and understand the three helpers:

- `create_azurite_container()` — spins up the Azurite image bound on `0.0.0.0` so Docker can expose its ports to the host. It accepts a `name` which doubles as the network alias.
- `azurite_connection_string_for_containers()` — builds the connection string. You will call this twice: once with the container alias (for container-to-container traffic on the Docker network) and once with `localhost` (for the host machine to seed data into the storage from within the fixture).
- `ensure_blob_containers()` — uses the host-side connection string to pre-create the blob containers before the Train Logistics API starts looking for them.
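
To make the docker/host split concrete, here is an illustrative sketch of what such a connection-string builder looks like. This is not the actual code in `azurite.py`: the helper name and the mapped port `55001` are made up for the example, but the account name, key, and string format are the publicly documented Azurite defaults.

```python
# Illustrative sketch of the docker-vs-host connection strings, using the
# publicly documented Azurite default account. Not the actual azurite.py code.
AZURITE_ACCOUNT = "devstoreaccount1"
AZURITE_KEY = (
    "Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq"
    "/K1SZFPTOtr/KBHBeksoGMGw=="
)

def connection_string(host: str, port: int) -> str:
    """Build an Azurite blob connection string pointing at the given endpoint."""
    return (
        "DefaultEndpointsProtocol=http;"
        f"AccountName={AZURITE_ACCOUNT};"
        f"AccountKey={AZURITE_KEY};"
        f"BlobEndpoint=http://{host}:{port}/{AZURITE_ACCOUNT};"
    )

# Container-to-container traffic resolves the network alias on port 10000.
docker_side = connection_string("azurite", 10000)
# The host seeds data through whatever port Docker mapped 10000 to (e.g. 55001).
host_side = connection_string("localhost", 55001)
```

Only the `BlobEndpoint` host and port differ between the two strings; everything else is identical.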

Your task is to implement a `train_logistics_storage` fixture in [conftest.py](integration_tests_ch7/conftest.py). The skeleton below shows where to start.

```python
# conftest.py
@pytest.fixture
def train_logistics_storage(network: Network) -> Generator[TrainLogisticsStorage, None, None]:
    azurite_container_name = "train-logistics"
    AZURITE_ACCOUNT: str = "devstoreaccount1"
    AZURITE_KEY: str = "Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="

    with create_azurite_container(network=network, name=azurite_container_name) as container:
        # 1. Wait for the port mapping to be available (port 10000)
        # 2. Build docker_connection_string using azurite_container_name as host
        # 3. Build host_connection_string using "localhost" and the exposed port
        # 4. Call ensure_blob_containers with the host connection string
        # 5. Yield a TrainLogisticsStorage with an AzuriteStorageContainer inside
        ...
```

> **Note on the Azurite credentials:** the account name and key above are the publicly documented defaults shipped with every Azurite installation. They are not secrets — you will find them in the official Microsoft documentation and in every tutorial on the internet. Do not lose sleep over them.

### The most important thing you will read in this chapter

Keep `yield` **inside** the `with` block. If you place it outside, the context manager exits before your test runs, the Azurite container is destroyed, the Train Logistics API cannot connect to storage, and you will spend a very pleasant afternoon watching a retry loop print the same warning over and over again. You have been warned.

## Task 3: Proper integration testing

With Azurite in place you now have all four pillars ready to assemble into a proper multi-container integration test.

| Container             | Purpose                                        |
|-----------------------|------------------------------------------------|
| `postgres`            | Shared database for both APIs                  |
| `tickets_api`         | The original ticketing application             |
| `train_logistics_api` | The new logistics application                  |
| `azurite`             | Azure Blob Storage emulator for JSON manifests |



### Task 3a: The Train Logistics™ fixture

The Train Logistics application lives in a remote repository and is published as a Docker image at `ghcr.io/equinor/train-logistics:latest`, exactly like the Tickets API in chapter 6.

The helper functions `create_train_logistics_api_container()` and `wait_for_train_logistics_api_to_be_ready()` are already implemented for you in [custom_containers/train_logistics.py](integration_tests_ch7/custom_containers/train_logistics.py). Read through them.

The container exposes port `3001` and requires two environment variables which the factory function already wires up for you:
- `TRAIN_LOGISTICS_DATABASE_URL` — the postgres connection string using the `postgres` network alias.
- `AZURE_STORAGE_CONNECTION_STRING` — the Azurite connection string, **docker-side** (using the container alias, not `localhost`).
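
Concretely, the environment the factory builds looks roughly like this. The postgres credentials and database name below are placeholders for illustration; the real values come from your fixtures:

```python
# Rough sketch of the environment the factory passes to the container.
# The postgres credentials here are placeholders, not the real fixture values.
docker_connection_string = (
    "DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;"
    "AccountKey=<documented default key>;"
    "BlobEndpoint=http://azurite:10000/devstoreaccount1;"
)

environment = {
    # The API reaches postgres through its network alias, never localhost
    "TRAIN_LOGISTICS_DATABASE_URL": "postgresql://app:app@postgres:5432/trains",
    # Docker-side Azurite string: the host part is the azurite container alias
    "AZURE_STORAGE_CONNECTION_STRING": docker_connection_string,
}
```

Both values use network aliases because the Train Logistics container talks to its dependencies over the Docker network, not through the host's mapped ports.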

Your task is to add the `train_logistics_api` fixture in [conftest.py](integration_tests_ch7/conftest.py):

```python
# conftest.py
@pytest.fixture
def train_logistics_api(
    network: Network,
    postgres_database: PostgresDatabase,
    train_logistics_storage: TrainLogisticsStorage,
) -> Generator[TrainLogisticsAPI, None, None]:
    # 1. Get the docker-side connection string for Azurite from train_logistics_storage
    # 2. Start the container using create_train_logistics_api_container()
    # 3. Wait for port 3001 mapping, build backend_url
    # 4. Wait for the API to be ready using wait_for_train_logistics_api_to_be_ready()
    # 5. Yield a TrainLogisticsAPI object
    ...
```

The `TrainLogisticsAPI` wrapper class (already defined in `train_logistics.py`) expects: `container`, `backend_url`, `name`, `port`, and `alias`.

### Task 3b: Write the integration tests

Two tests need to be added to [test_integration.py](integration_tests_ch7/test_integration.py).

**`test_buy_ticket`** — a familiar face from chapter 5. Make a `POST` request to `/tickets/buy` on the `tickets_api` container and verify the response body contains the expected fields.

```python
@pytest.mark.parametrize(
    "train_code,passenger_name,seat_number",
    [
        ("The Orient Express", "Leonardo DaVinci", 14),
        ("Bergensbanen", "Jonas Gahr Støre", 1),
        ("Raumabanen", "Kong Harald", None),
    ],
)
def test_buy_ticket(
    tickets_api: TicketsAPI,
    train_code: str,
    passenger_name: str,
    seat_number: int | None,
) -> None:
    ...
```

**`test_check_stock`** — the first test written against the Train Logistics™ API. Make a `POST` request to `/logistics/check-stock` with a `train_code` and `product` and assert the response contains the correct stock level. This endpoint returns `200 OK` — it queries stock, it does not create anything. Do not assert `201 CREATED` unless you enjoy failing tests.

```python
@pytest.mark.parametrize(
    "train_code,product,expected_in_stock",
    [
        ("The Orient Express", "banana", 10),
        ("Bergensbanen", "apple", 5),
        ("Raumabanen", "orange", 0),
    ],
)
def test_check_stock(
    train_logistics_api: TrainLogisticsAPI,
    train_code: str,
    product: str,
    expected_in_stock: int,
) -> None:
    ...
```
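
The assertion half of the test can be sketched as follows. Only the endpoint path and the `200 OK` status come from the task description; the `in_stock` response field name is an assumption you should verify against the actual Train Logistics API schema.

```python
# Hedged sketch for test_check_stock. The "in_stock" field name is an
# assumption; confirm it against the Train Logistics API schema.
def assert_stock_response(status: int, body: dict, expected_in_stock: int) -> None:
    """The endpoint queries stock, so expect 200 OK, never 201 CREATED."""
    assert status == 200
    assert body["in_stock"] == expected_in_stock

# Inside test_check_stock, after POSTing
#     {"train_code": train_code, "product": product}
# to f"{train_logistics_api.backend_url}/logistics/check-stock":
#     assert_stock_response(status, body, expected_in_stock)
```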

Run all tests with:

```bash
pytest -s .
```

If all four containers start, Azurite seeds correctly, both APIs accept requests and all assertions pass — congratulations. You have just run a four-container integration test from a single `pytest` command. No manual setup, no shared cloud bill, no angry conductor breathing down your neck.