
Commit 1ecdaa5

Merge pull request #12 from The-Strategy-Unit/5_add-codeowners-and-readme

Add codeowners and readme

2 parents 6a92abb + d733531

File tree

2 files changed: +34 −0 lines changed


CODEOWNERS

Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
# These owners will be the default owners for everything in
# the repo. Unless a later match takes precedence,
# @tomjemmett and @The-Strategy-Unit/nhp_model_devs will be
# requested for review when someone opens a pull request.
* @tomjemmett @The-Strategy-Unit/nhp_model_devs

README.md

Lines changed: 29 additions & 0 deletions
@@ -0,0 +1,29 @@

# The New Hospital Programme Demand Model - Databricks implementation

This repository contains code for the Databricks implementation of the NHP model.

The notebooks in this repository can be used to run provider, ICB, and national level runs of the NHP model.

## Explanation of the NHP model Databricks implementation

nhp_model is a package on GitHub, and specific versions of it are tagged, e.g. v4.2.1.

This repository declares a tagged version of nhp_model, installed from GitHub, as its package dependency.
So for example, you can specify v4.2.1 or v4.1.0 of nhp_model. This is currently specified in the `pyproject.toml` file.
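
A dependency pinned to a tagged GitHub release can be declared roughly like this. This is an illustrative sketch only: the repository URL, project metadata, and tag shown here are assumptions, and the actual `pyproject.toml` in this repo may differ.

```toml
[project]
name = "nhp_model_databricks"
version = "0.1.0"
dependencies = [
    # Hypothetical pin: install nhp_model from a tagged release on GitHub.
    # The real URL and tag in this repo's pyproject.toml may differ.
    "nhp_model @ git+https://github.com/The-Strategy-Unit/nhp_model@v4.2.1",
]
```

Bumping the model version is then a one-line change to the tag in this file.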

This repository contains specific data loading classes for ICB, national, and provider level model runs.
These tell the nhp_model code to load the data for these runs from Databricks.
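
The shape of such a class can be sketched as below. All names here are hypothetical and for illustration only; the real nhp_model data-loading interface is not shown in this README, so this is not the actual API.

```python
# Hypothetical sketch of a Databricks-backed data loader.
# Class and method names are illustrative, not the real nhp_model API.
class DatabricksDataLoader:
    """Resolves dataset names to fully qualified Databricks table identifiers,
    so the model reads its inputs from Databricks rather than local parquet."""

    def __init__(self, catalog: str, schema: str):
        self.catalog = catalog
        self.schema = schema

    def table_for(self, dataset: str) -> str:
        # e.g. "nhp.default.ip" for a hypothetical inpatients extract
        return f"{self.catalog}.{self.schema}.{dataset}"


loader = DatabricksDataLoader("nhp", "default")
print(loader.table_for("ip"))  # → nhp.default.ip
```

Separate subclasses per run level (provider, ICB, national) could then point the same model code at different tables or sampling logic.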

This repository also has workflow YAML files in the `resources` folder that tell Databricks what to do.
The workflow is:

1. Look on Azure for params files in a specific folder, then
2. Run a specific notebook using those params.
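
The two steps above might look roughly like the following in a Databricks job resource definition. Every name and path here (job key, notebook path, storage account, folder) is an assumption for illustration, not the repo's actual `resources` YAML.

```yaml
# Illustrative sketch of a Databricks workflow resource file.
# All identifiers and paths are hypothetical.
resources:
  jobs:
    run_nhp_model:
      name: run-nhp-model
      tasks:
        - task_key: run_model
          notebook_task:
            # The notebook looks for params files in an Azure storage folder
            # and runs the model with each set of params it finds.
            notebook_path: ../notebooks/run_model.py
            base_parameters:
              params_path: "abfss://params@storageaccount.dfs.core.windows.net/queue/"
```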

nhp_model_databricks uses the same extracted parquet data as the provider level model; it just samples from it differently.
It reads some of the reference tables directly from Databricks.

## How to use the NHP model Databricks implementation

This implementation is still a work in progress.
We're keeping notes for internal use only in the [nhp_products Wiki](https://github.com/The-Strategy-Unit/nhp_products/wiki/How-to-run-the-ICB-and-national-level-models).
