This application aggregates chargepoint data from different sources and provides it without authentication via an OCPI-style JSON API and a vector tile server.
| name | uid | realtime | credentials | comment |
|---|---|---|---|---|
| Bundesnetzagentur: API | bnetza_api | false | false | The additional config option ignore_operators: list[str] is supported; the listed operators are ignored during import. Set to a weekly download, as the data does not change often. |
| Bundesnetzagentur: Excel | bnetza_excel | false | false | |
| chargecloud: Stadtwerke Pforzheim | chargecloud_pforzheim | true | false | |
| chargecloud: Stadtwerke Stuttgart | chargecloud_stuttgart | true | false | |
| chargecloud: Stadtwerke Tübingen | chargecloud_tuebingen | true | false | |
| chargecloud: Stadtwerke Ludwigsburg | chargecloud_ludwigsburg | true | true | |
| PBW | eaaze_pbw | true | true | |
| Giro-e | giroe | true | true | |
| Heilbronn Neckarbogen | heilbronn_neckarbogen | true | true | |
| Lichtblick | lichtblick | true | true | Currently dysfunctional |
| OCHP: Albwerk | ochp_albwerk | true | true | |
| OCHP: Ladenetz | ochp_ladenetz | true | true | |
| OCPI: Stadtnavi | ocpi_stadtnavi | true | false | |
| OpenData Swiss | opendata_swiss | true | false | |
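The ignore_operators option mentioned for bnetza_api might be configured like this in config.yml (the surrounding key layout is an assumption for illustration; check the actual config schema):

```yaml
# Hypothetical config.yml fragment: skip the listed operators
# during bnetza_api imports (key layout is an assumption).
SOURCES:
  bnetza_api:
    ignore_operators:
      - 'Some Operator GmbH'
      - 'Another Operator AG'
```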
A matching algorithm links live data sources with Bundesnetzagentur (bnetza) sources. You can find details in our matching docs.
At api.ocpdb.de you will find OpenAPI documentation of the public endpoints you can use.
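As a sketch of how a client might consume the OCPI-style JSON, the snippet below parses a minimal Location payload. The field names follow the OCPI Locations module plus the official_regional_code extension described below, but the exact endpoint path and payload shape are assumptions; consult the OpenAPI documentation at api.ocpdb.de for the real schema.

```python
import json

# A minimal OCPI-style Location payload as it might be returned by the
# public API (structure is an assumption based on OCPI Locations;
# check the OpenAPI documentation for the real schema).
payload = """
{
  "data": [
    {
      "id": "LOC1",
      "name": "Example Charging Station",
      "coordinates": {"latitude": "48.7758", "longitude": "9.1829"},
      "official_regional_code": "081110000000",
      "evses": [{"uid": "EVSE1", "status": "AVAILABLE"}]
    }
  ]
}
"""

locations = json.loads(payload)["data"]
for location in locations:
    available = [e for e in location["evses"] if e["status"] == "AVAILABLE"]
    print(f"{location['name']}: {len(available)} EVSE(s) available")
```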
The application provides a simple command line interface. You can access any CLI command from within the container. The Makefile provides a shortcut to run the CLI:
```
make docker-run CMD="flask db upgrade"
```

Other available commands include:

```
flask import all
flask import static example_source
flask import realtime example_source
flask import images example_source
flask source list
flask source delete example_source
flask match run
```

OCPDB extends the Location data model with a new field official_regional_code in non-strict mode. This field provides the official regional code of a location, if available. The following regional codes are used:
- DEU: Regionalschlüssel
In Germany, we use the dataset Verwaltungsgebiete 1:25 000 (VG25) by the Bundesamt für Kartographie und Geodäsie (BKG) for assigning official regional codes to locations, licensed under Creative Commons Attribution 4.0 International (CC BY 4.0). We download the data using wget, transform it using ogr2ogr, and store it in our PostGIS database.
The script assumes that a one-time import is sufficient: to trigger a re-import, delete the marker file data/regionalschluessel/.vg25-imported and re-run the script. It also assumes that the data at the VG25 URL is immutable, so the data is downloaded only once: to trigger a re-download, delete data/regionalschluessel/vg25.gpkg and re-run the script. You can use this mechanism for Ansible automation, too: if you drop the geopackage at data/regionalschluessel/vg25.gpkg via Ansible, you won't need to download the file at runtime.
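The download-once / import-once behaviour described above can be sketched as follows. The file names come from the docs; the download and import callables are placeholders, and the real script's internals may differ.

```python
from pathlib import Path


def ensure_vg25_imported(data_dir, download, import_into_postgis):
    """Download and import VG25 only once.

    Deleting vg25.gpkg re-triggers the download on the next run;
    deleting the .vg25-imported marker re-triggers the import.
    (Sketch only: 'download' and 'import_into_postgis' are placeholders.)
    """
    data_dir = Path(data_dir)
    geopackage = data_dir / 'vg25.gpkg'
    marker = data_dir / '.vg25-imported'
    data_dir.mkdir(parents=True, exist_ok=True)
    if not geopackage.exists():          # immutable source: download once
        download(geopackage)
    if not marker.exists():              # one-time import into PostGIS
        import_into_postgis(geopackage)
        marker.touch()                   # marker file records success
```

Dropping a pre-fetched geopackage into the data directory (e.g. via Ansible) makes the first `exists()` check succeed, so no download happens at runtime.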
You can run

```
flask location assign-regionalschluessel
```

to assign regional codes to all locations already in the database. You can limit it to specific locations by providing the source id:

```
flask location assign-regionalschluessel --source-id 1
```

In case of a Regionalschlüssel file update, make sure that the new geopackage has the same format as the old one. Afterwards, you can run

```
flask location assign-regionalschluessel --re-assign
```

to re-assign regional codes to all locations.
The installation process is documented in INSTALL.md.
The application uses the logging module with some optional extensions. Logging can be configured in config.yml.
The application can add context and/or special output formats to log entries with custom formatters:
Most requests or tasks have context that can be used in log entries. The simplest setup looks like this:
```yaml
LOGGING:
  formatters:
    human_readable:
      (): webapp.common.logging.formatter.flask_attributes_formatter.FlaskAttributesFormatter
      format: '%(asctime)s %(levelname)s %(source)s: %(message)s'
      defaults: {'source': '-'}
  handlers:
    my_handler:
      formatter: human_readable
```

With this example, you add the source to every log entry (if available). Please keep in mind that you need to add a default, because not every log entry has a source context.
The following additional log context variables are available:

- source
- initiator
- location
- evse
- image
You can output log entries in OpenTelemetry format, too:
```yaml
LOGGING:
  formatters:
    open_telemetry:
      (): webapp.common.logging.formatter.flask_open_telemetry_formatter.FlaskOpenTelemetryFormatter
      prefix: ocpdb
      service_name: OCPDB
  handlers:
    my_handler:
      formatter: open_telemetry
```

Context is automatically injected into the log entry attributes.
OCPDB is licensed under the AGPL. You will find details in LICENCE.txt.
We appreciate bug reports and feature requests.