# Hyperlocal Bed Demand Forecasting (User Interface)
User interface for the HYLODE project. Much of the project structure is based on the [`govcookiecutter` template project][govcookiecutter] but is adapted … code including the backend is in `./src/api/app1`, and the application itself is kept in `./src/apps/app1`. An annotated figure of the directory structure is shown below.
## Quick start
The documentation is available [here](docs/index.md).
### Deployment (not development)
From the command line of the GAE:
```sh
git clone https://github.com/HYLODE/HyUi.git
cd HyUi
cp .env.example .env
# Now hand edit the .env file with usernames/passwords
# Set ENV=prod (rather than dev)
pytest # OPTIONAL
docker-compose up -d --build && docker-compose logs -f
```
Go to <http://my-host-name:8094/docs> for the API.

Go to <http://my-host-name:8095> for the dashboard landing page.
### Development (local)
### Development (GAE)
## First run
### Installation
You will need to do this twice:
- on your local development machine
- on the UCLH generic application environment
Regardless, open the terminal and **git clone**.
```sh
git clone https://github.com/HYLODE/HyUi.git
cd HyUi
```
### Development (Local)
We assume that the majority of the development work will happen on your own machine, so you will need to set up your own dev environment. I have been using [conda miniforge](https://github.com/conda-forge/miniforge) because it has the best compatibility with the ARM processor on the Mac. It should also work on other machines, but I have not tested this.

…

then navigate to <http://localhost:8094/docs> to view the API documentation
... `app/main.py` hosts the various routes for the different apps
#### Frontend (per app)
```sh
cd ./src
python apps/app1/index.py
```
#### Local development with docker
### Development (Hospital)
There are two tasks that _must_ happen within the hospital environment: (1) preparing realistic mock data, and (2) deployment. The installation steps differ because here we do not have **sudo** (root) access (i.e. admin privileges), so work must be performed using a combination of the default command line tools and docker.
### Preparing mock (synthetic) data
We will use the tooling provided by the [Synthetic Data Lab](https://sdv.dev) from a JupyterLab in a browser on a hospital machine. You will need access to the GAE to run this.
#### Scenario 1 (data via SQL)
Ensure that your database credentials are stored in the `./.secrets` file.
From the GAE command line, navigate to the `synth` directory (`cd ./synth`), then use the Makefile commands as follows:
1. `make mock1build` to build a docker image with JupyterLab and sdv pre-installed.
2. `make mock2run` to spin up JupyterLab (e.g. working from GAE07 this will be at <http://UCLVLDDPRAGAE07:8091>, but the URL will depend on the GAE).
3. `make mock3copyin` to copy the example notebook `synth_test_data.ipynb` into the `./synth/portal` directory that is attached to the JupyterLab instance. Now open `synth_test_data.ipynb` using JupyterLab and work through the steps. Create your SQL query and save it as `query_live.sql`; the file must contain a *SELECT* statement. Save just the synthesised mock data to `mock.h5`, along with the query (`query_live.sql`). Be **careful** to specify 'fakers' for all direct identifiers. We recommend the four-eyes approach, wherein a second person reviews your work before an export.
4. `make mock4copyout` to copy out just the query and the synthetic data. Do not copy the notebook out of the `portal` directory unless you are sure you have stripped all personally identifiable data (i.e. clear all cells before saving).
5. `make mock5stop` to stop the JupyterLab instance and clean up.
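For orientation only, the Makefile targets above might be implemented roughly as in the sketch below. The image name, port mapping, and paths are all assumptions for illustration, not the contents of the real `./synth/Makefile`:

```make
# Hypothetical sketch of the synth Makefile targets; names and flags
# are illustrative assumptions, not the project's actual file.
mock1build:
	docker build -t hyui-synth .

mock2run:
	docker run --rm -d -p 8091:8888 -v $(PWD)/portal:/home/jovyan/portal --name hyui-synth hyui-synth

mock3copyin:
	cp synth_test_data.ipynb portal/

mock4copyout:
	cp portal/mock.h5 portal/query_live.sql .

mock5stop:
	docker stop hyui-synth
```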
#### Scenario 2 (data via an http `get` request)
This is similar to the steps above but does not depend on the query or database credentials. You are likely to need to use the Python requests library to get the data that will be used by [sdv](https://sdv.dev).
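A minimal sketch of that fetch step, assuming the upstream endpoint returns a JSON list of rows (the URL and the helper name `fetch_results` are hypothetical):

```python
import pandas as pd
import requests


def fetch_results(url: str, session=requests) -> pd.DataFrame:
    """GET a JSON list of rows from `url` and return it as a DataFrame
    ready for sdv to fit against. `session` is injectable so the function
    can be exercised without network access."""
    response = session.get(url, timeout=10)
    response.raise_for_status()  # fail loudly on 4xx/5xx
    return pd.DataFrame(response.json())
```

Injecting the session also makes it easy to swap in an authenticated `requests.Session` later.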
**YOU MUST NOW PREPARE YOUR DATA MODEL IN `./src/api/models.py`**
This is a quality control step that ensures the data you handle is of the correct type.
Let's generalise the naming so that a *query* is matched to *results*, which has rows, and *results* is a Pydantic / SQLModel class. Specify the model by hand, as per the [SQLModel tutorial](https://sqlmodel.tiangolo.com/tutorial/create-db-and-table/).
A simple Pandas dataframe with two string columns and a timestamp:
```python
>>> df.dtypes
firstname                     object
lastname                      object
admission_datetime    datetime64[ns]
```
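For reference, a frame with those dtypes can be built as follows (the row values are illustrative, not real data):

```python
import pandas as pd

df = pd.DataFrame(
    {
        "firstname": ["Ada"],
        "lastname": ["Lovelace"],
        # parse the timestamp so the column is datetime64[ns], not object
        "admission_datetime": pd.to_datetime(["2022-01-01 09:30"]),
    }
)
print(df.dtypes)
```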
The equivalent SQLModel. Note that `firstname` is optional but that `lastname` and `admission_datetime` are not.
```python
from datetime import datetime
from typing import Optional

from sqlmodel import SQLModel


class ResultsBase(SQLModel):
    """
    Generic results class to hold data returned from
    the SQL query or the API
    """

    firstname: Optional[str]
    lastname: str
    admission_datetime: datetime
```
You can also use the [`@validator`](https://pydantic-docs.helpmanual.io/usage/validators/) decorator function to add additional validation.
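A minimal sketch of such a validator, written here against plain Pydantic's v1-style API (which SQLModel builds on); the blank-surname rule is invented purely for illustration:

```python
from datetime import datetime
from typing import Optional

from pydantic import BaseModel, validator


class ResultsBase(BaseModel):
    firstname: Optional[str]
    lastname: str
    admission_datetime: datetime

    @validator("lastname")
    def strip_and_require_lastname(cls, value: str) -> str:
        # Hypothetical rule: reject blank surnames and normalise whitespace
        if not value.strip():
            raise ValueError("lastname must not be blank")
        return value.strip()
```

The validator runs whenever a row is parsed, so bad data fails at the model boundary rather than deep inside the app.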
### Deployment
Set the environment variable to *prod*, then run *docker-compose*.
```sh
export ENV=prod
pytest
docker-compose up -d --build && docker-compose logs -f
```
You will need to create a local `./.secrets` file with database credentials, so preferably set *ENV* to `prod` there as well.
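For orientation, the compose file that wires the services together might look roughly like this sketch. The service names, build contexts, and internal ports are assumptions; only the host ports (8094 for the API, 8095 for the dashboard) are taken from this document:

```yaml
# Illustrative sketch only, not the project's actual docker-compose.yml
version: "3.8"
services:
  api:
    build: ./src/api          # FastAPI backend (assumed context)
    env_file: .env
    ports:
      - "8094:8000"           # internal port is an assumption
  dash:
    build: ./src/apps         # Plotly Dash frontend (assumed context)
    env_file: .env
    ports:
      - "8095:8000"           # internal port is an assumption
    depends_on:
      - api
```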
## Frontend vs Backend
### Backend
This is a Python FastAPI server that is exposed on port 8094 when running `docker-compose up -d` from the project root, or `uvicorn main:app` when running locally from `src/api/`.
### Frontend
This is a Plotly Dash app served on port 8095.
### Orchestrating front and back end
**IMPORTANT**: ensure you have created a `./.secrets` file with at least the same info as the `./.secrets.example` version.
```bash
docker-compose down
docker-compose up -d --build && docker-compose logs -f
```