API for hosting nowcasting solar predictions. This is for GSP and National forecasts in the UK.
We use FastAPI.
Pull the docker image from Docker Hub:

```bash
docker pull openclimatefix/nowcasting_api:latest
```
You will need to set the following environment variables:

- `AUTH0_DOMAIN` - The Auth0 domain, which can be found in the Applications/Applications tab. It should look something like `XXXXXXX.eu.auth0.com`.
- `AUTH0_API_AUDIENCE` - The Auth0 API audience, which can be found in the Applications/APIs tab. It should look something like `https://XXXXXXXXXX.eu.auth0.com/api/v2/`.
- `DB_URL` - The forecast database URL used to get GSP forecast data.
- `ORIGINS` - Endpoints that are valid CORS origins. See the FastAPI documentation.
- `N_HISTORY_DAYS` - By default only data from today and yesterday is loaded; set this to e.g. 5 if the API should always return 5 days of data.
- `ADJUST_MW_LIMIT` - The maximum amount the API is allowed to adjust the national forecast by.
- `FAKE` - Allows fake data to be used, rather than connecting to a database.
- `QUERY_WAIT_SECONDS` - The number of seconds to wait for an ongoing query.
- `CACHE_TIME_SECONDS` - The time in seconds for which cached data is used.
- `DELETE_CACHE_TIME_SECONDS` - The time in seconds after which the cache is deleted.
- `LOGLEVEL` - The log level for the application.
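As a rough sketch of how the variables above can be consumed, the snippet below reads each one from the environment. The default values and the `load_settings` helper are illustrative assumptions, not the API's real configuration code.

```python
import os

def load_settings() -> dict:
    """Read the documented environment variables (defaults are illustrative)."""
    return {
        "auth0_domain": os.environ.get("AUTH0_DOMAIN", ""),
        "auth0_api_audience": os.environ.get("AUTH0_API_AUDIENCE", ""),
        "db_url": os.environ.get("DB_URL", ""),
        # CORS origins as a comma-separated list.
        "origins": os.environ.get("ORIGINS", "*").split(","),
        # Default of 2 mirrors "today and yesterday" (an assumption).
        "n_history_days": int(os.environ.get("N_HISTORY_DAYS", "2")),
        "adjust_mw_limit": float(os.environ.get("ADJUST_MW_LIMIT", "0")),
        "fake": os.environ.get("FAKE", "0") == "1",
        "query_wait_seconds": int(os.environ.get("QUERY_WAIT_SECONDS", "10")),
        "cache_time_seconds": int(os.environ.get("CACHE_TIME_SECONDS", "120")),
        "delete_cache_time_seconds": int(os.environ.get("DELETE_CACHE_TIME_SECONDS", "240")),
        "loglevel": os.environ.get("LOGLEVEL", "INFO"),
    }

os.environ["FAKE"] = "1"  # e.g. run against fake data, no database needed
settings = load_settings()
```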
Note you will need a database set up at `DB_URL`. This should use the datamodel in `nowcasting_datamodel`.
There are several optional environment variables:

- `N_CALLS_PER_HOUR` - API rate limit for most endpoints. Defaults to 3600 (1 per second).
- `N_SLOW_CALLS_PER_MINUTE` - API rate limit for slow endpoints. Defaults to 1 (1 per minute).
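To illustrate what these limits enforce, here is a minimal fixed-window rate limiter. The real API uses its own middleware; this class and its names are a sketch only.

```python
import time

class FixedWindowLimiter:
    """Allow at most `max_calls` calls per `window_seconds` window."""

    def __init__(self, max_calls: int, window_seconds: float):
        self.max_calls = max_calls
        self.window_seconds = window_seconds
        self.window_start = time.monotonic()
        self.calls = 0

    def allow(self) -> bool:
        now = time.monotonic()
        if now - self.window_start >= self.window_seconds:
            # The window has elapsed: start a new one and reset the counter.
            self.window_start = now
            self.calls = 0
        if self.calls < self.max_calls:
            self.calls += 1
            return True
        return False

# A small window for demonstration; N_CALLS_PER_HOUR=3600 corresponds to
# FixedWindowLimiter(max_calls=3600, window_seconds=3600), i.e. ~1/second.
limiter = FixedWindowLimiter(max_calls=3, window_seconds=3600)
results = [limiter.allow() for _ in range(5)]  # → [True, True, True, False, False]
```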
Live documentation can be viewed at https://api.quartz.solar/docs or https://api.quartz.solar/swagger. This is automatically generated from the code.
This can be done in two different ways: with Python or with Docker. The Docker method is preferred, because:
- a) this should be more replicable and less prone to odd behaviors;
- b) it also sets up a CRON service that generates new data periodically, to resemble the "real" forecast service.
Create a virtual environment:

```bash
python3 -m venv ./venv
source venv/bin/activate
```
Option 1: Preferred method
- Make sure Docker is installed on your system.
- Run `docker compose up` in the main directory, with the optional `--build` flag the first time, to start up the application. This builds the image, sets up the database, seeds some fake data and starts the API.
- You will now be able to access it at `http://localhost:8000`.
- The API should restart automatically when you make changes to the code, and the CRON job will periodically seed new fake data, currently set to every 15 minutes.
Option 2: Running Docker with a local version of `nowcasting_datamodel`

- Clone the `nowcasting_datamodel` repository.
- Comment out the `nowcasting_datamodel` line in the `requirements.txt` file.
- Run `docker compose -f docker-compose-local-datamodel.yml up` in the main directory, with the optional `--build` flag the first time; this will start up the application and seed the initial fake data in the database.
- You will now be able to access it at `http://localhost:8000`. Changes you make to the API code will be automatically reflected in the running API, but changes to the datamodel will require either a change of any kind in the API code (which reloads the server) or a manual restart of the API.
- Data will reseed every 15 minutes.
Option 3: Running the API with a local database (deprecated, but possible if you are unable to use Docker; may require some troubleshooting)

To set up the API with a local database, you will need to:

- Start your own local postgres instance on your machine.
- Set `DB_URL` to your local postgres instance in the `.env` file.
- Run the following commands to install required packages, create the tables in your local postgres instance, populate them with fake data, and start the API:

```bash
pip install -r requirements.txt
python script/fake_data.py
cd nowcasting_api
uvicorn main:app --reload
```
When running locally:

- You will now be able to access it at `http://localhost:8000`.
- The API should restart automatically when you make changes to the code, but the fake data is currently static. To seed new fake data, manually restart the API.
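To show what "seeding fake data" can look like, here is an illustrative generator of half-hourly solar-yield records. The real seeding logic lives in `script/fake_data.py`; the function and field names below are assumptions for the sketch.

```python
import random
from datetime import datetime, timedelta, timezone

def make_fake_gsp_yields(gsp_id: int, n_values: int = 48) -> list:
    """Generate one fake solar-yield record per half hour (one day's worth)."""
    start = datetime.now(timezone.utc).replace(minute=0, second=0, microsecond=0)
    yields = []
    for i in range(n_values):
        t = start + timedelta(minutes=30 * i)
        # Crude day/night shape: zero generation outside 06:00-18:00 UTC.
        solar_mw = random.uniform(0, 100) if 6 <= t.hour < 18 else 0.0
        yields.append(
            {"gsp_id": gsp_id, "datetime_utc": t, "solar_generation_mw": solar_mw}
        )
    return yields

data = make_fake_gsp_yields(gsp_id=1)
```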
To run the tests, use the following commands:

```bash
docker stop $(docker ps -a -q)
docker-compose -f test-docker-compose.yml build
docker-compose -f test-docker-compose.yml run api
```
```mermaid
graph TD;
    N1(national/forecast) --> Q1;
    Q1{Include metadata?} -->|no| Q2;
    Q1 -->|yes| N2[NationalForecast];
    N4[ForecastValueLatest];
    Q2{forecast horizon <br> minutes not None};
    Q2 -->|yes| N5[ForecastValueSevenDays];
    Q2 -->|no| N4;
    NP1(national/pvlive) --> NP2;
    NP2[GSPYield];
```
```mermaid
graph TD;
    G1(gsp/forecast/all);
    G1 --> N3[ManyForecasts];
    G3(gsp/gsp_id/forecast) --> Q4;
    Q4{forecast horizon <br> minutes not None};
    Q4 -->|yes| G7[ForecastValueSevenDays];
    Q4 -->|no| G6[ForecastValueLatest];
    GP1(gsp/pvlive/all) --> GP2;
    GP2[LocationWithGSPYields];
    GP3(gsp/gsp_id/pvlive) --> GP4;
    GP4[GSPYield];
```
```mermaid
graph TD;
    G1(status) --> G2;
    G2[Status];
    G3(gsp) --> G4;
    G4[Location];
```
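The `ForecastValue*` nodes in the diagrams above all carry the same basic payload: a target time plus an expected generation value. A rough sketch of that shape is below; the field names are illustrative assumptions, so check the live `/docs` page for the real schemas.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ForecastValue:
    """Sketch of a single forecast value returned by the API."""
    target_time: datetime                        # when the generation is expected
    expected_power_generation_megawatts: float   # forecast solar generation in MW

fv = ForecastValue(
    target_time=datetime(2025, 1, 1, 12, 0),
    expected_power_generation_megawatts=1500.0,
)
```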
Some users want to know what the forecast was like N hours ago. We can do this by setting `forecast_horizon_minutes` in the API.
Because the API provides forecasts for the future as well as historic values, it is useful to define this behaviour for the N-hour forecast:

- future: A forecast that was made N hours ago for the future. For example, if it is now 2025-01-01 12:00 and N is 4, this will show a forecast made at 2025-01-01 08:00, running from now to 2025-01-02 20:00 (a 36-hour forecast).
- past: Forecast values that were made N hours before the target time. For example, a target_time of 2025-01-01 11:00 will show a forecast value made at 2025-01-01 07:00.
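The "past" selection rule above can be sketched as a filter: keep only forecast values created at least the horizon before their target time. This is not the API's actual database query, just an illustration of the rule; the dict keys are assumptions.

```python
from datetime import datetime, timedelta

def select_past_values(values: list, horizon_minutes: int) -> list:
    """Keep values whose creation time is >= horizon before the target time."""
    horizon = timedelta(minutes=horizon_minutes)
    return [v for v in values if v["target_time"] - v["created_utc"] >= horizon]

values = [
    # Made 4 hours before the target time -> kept for a 240-minute horizon.
    {"created_utc": datetime(2025, 1, 1, 7, 0), "target_time": datetime(2025, 1, 1, 11, 0)},
    # Made only 1 hour before the target time -> dropped.
    {"created_utc": datetime(2025, 1, 1, 10, 0), "target_time": datetime(2025, 1, 1, 11, 0)},
]
past = select_past_values(values, horizon_minutes=240)  # keeps only the first entry
```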
- PRs are welcome! See the Organisation Profile for details on contributing.
- Find out about our other projects here.
- Check out the OCF blog for updates.
- Follow OCF on LinkedIn.
Thanks goes to these wonderful people (emoji key):
Part of the Open Climate Fix community.