- Time to Complete: 30 minutes
- Programming Language: Python 3
- Run Docker as Non-Root: Follow the steps in Manage Docker as a non-root user.
- Configure Proxy (if required):
- Set up proxy settings for Docker client and containers as described in Docker Proxy Configuration.
- Example `~/.docker/config.json`:

  ```json
  {
    "proxies": {
      "default": {
        "httpProxy": "http://<proxy_server>:<proxy_port>",
        "httpsProxy": "http://<proxy_server>:<proxy_port>",
        "noProxy": "127.0.0.1,localhost"
      }
    }
  }
  ```

- Configure the Docker daemon proxy as per Systemd Unit File.
- Enable Log Rotation:
  - Add the following configuration to `/etc/docker/daemon.json`:

    ```json
    {
      "log-driver": "json-file",
      "log-opts": {
        "max-size": "10m",
        "max-file": "5"
      }
    }
    ```

  - Reload and restart Docker:

    ```bash
    sudo systemctl daemon-reload
    sudo systemctl restart docker
    ```
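Because `daemon.json` must be valid JSON and a malformed edit can prevent the Docker daemon from starting, it can be worth sanity-checking the file before restarting. A minimal sketch using only Python's standard `json` module (the path and expected values match the snippet above; adjust if yours differ):

```python
import json

def check_log_rotation(path="/etc/docker/daemon.json"):
    """Return True if the log-rotation settings match the values documented above."""
    with open(path) as f:
        cfg = json.load(f)  # raises an exception on malformed JSON
    opts = cfg.get("log-opts", {})
    return (cfg.get("log-driver") == "json-file"
            and opts.get("max-size") == "10m"
            and opts.get("max-file") == "5")
```

Run it before `systemctl restart docker` so a typo is caught by `json.load` instead of by a failed daemon start.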
The data flow remains the same as explained in Overview.md. Here, we focus on the wind turbine anomaly detection use case: data is ingested using the OPC-UA simulator and anomaly alerts are published to the MQTT broker.
The simulator uses `edge-ai-suites/manufacturing-ai-suite/wind-turbine-anomaly-detection/simulator/simulation_data/windturbine_data.csv`, a normalized version of the open-source wind turbine dataset (`edge-ai-suites/manufacturing-ai-suite/wind-turbine-anomaly-detection/training/T1.csv`) from https://www.kaggle.com/datasets/berkerisen/wind-turbine-scada-dataset.
The OPC-UA data simulator ingests this data into Telegraf over the OPC-UA protocol.
Telegraf, through its input plugins (OPC-UA or MQTT), gathers the data and sends it to both InfluxDB and the Time Series Analytics Microservice.
InfluxDB stores the incoming data from Telegraf.
The Time Series Analytics Microservice uses the User-Defined Function (UDF) deployment package (TICK scripts, UDFs, models), which is built into the container image. The UDF deployment package is available at `edge-ai-suites/manufacturing-ai-suite/wind-turbine-anomaly-detection/time_series_analytics_microservice`. The directory details are as below:

The `task` section defines the settings for the Kapacitor task and User-Defined Functions (UDFs).
| Key | Description | Example Value |
|---|---|---|
| `fetch_from_model_registry` | Boolean flag to enable fetching UDFs and models from the Model Registry. | `true` or `false` |
| `version` | Specifies the version of the task or model to use. | `"1.0"` |
| `tick_script` | The name of the TICK script file used for data processing and analytics. | `"windturbine_anomaly_detector.tick"` |
| `task_name` | The name of the Kapacitor task. | `"windturbine_anomaly_detector"` |
| `udfs` | Configuration for the User-Defined Functions (UDFs). | See below for details. |
UDFs Configuration:

The `udfs` section specifies the details of the UDFs used in the task.

| Key | Description | Example Value |
|---|---|---|
| `type` | The type of UDF. Currently, only Python is supported. | `"python"` |
| `name` | The name of the UDF script. | `"windturbine_anomaly_detector"` |
| `models` | The name of the model file used by the UDF. | `"windturbine_anomaly_detector.pkl"` |
Alerts Configuration:

The alerts section defines the settings for alerting mechanisms, such as the MQTT protocol.
For OPC-UA configuration, please refer to Publishing OPC-UA alerts.

MQTT Configuration:

The `mqtt` section specifies the MQTT broker details for sending alerts.
| Key | Description | Example Value |
|---|---|---|
| `mqtt_broker_host` | The hostname or IP address of the MQTT broker. | `"ia-mqtt-broker"` |
| `mqtt_broker_port` | The port number of the MQTT broker. | `1883` |
| `name` | The name of the MQTT broker configuration. | `"my_mqtt_broker"` |
`kapacitor_devmode.conf` is updated at runtime based on the above `config.json`.
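Putting the keys from the tables above together, a `config.json` might look like the following. This is a sketch assembled from the documented example values; the nesting is inferred from the section names, so consult the microservice's shipped `config.json` for the authoritative schema:

```json
{
  "task": {
    "fetch_from_model_registry": false,
    "version": "1.0",
    "tick_script": "windturbine_anomaly_detector.tick",
    "task_name": "windturbine_anomaly_detector",
    "udfs": {
      "type": "python",
      "name": "windturbine_anomaly_detector",
      "models": "windturbine_anomaly_detector.pkl"
    }
  },
  "alerts": {
    "mqtt": {
      "mqtt_broker_host": "ia-mqtt-broker",
      "mqtt_broker_port": 1883,
      "name": "my_mqtt_broker"
    }
  }
}
```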
- Contains the Python script that processes the incoming data. It uses the Random Forest Regressor and Linear Regression machine learning algorithms, accelerated with Intel® Extension for Scikit-learn* to run on CPU, to detect anomalous power generation data points relative to wind speed.
- The TICKScript `windturbine_anomaly_detector.tick` determines how the incoming input data is processed. It mainly contains the details on executing the UDF file, storing the processed data, and publishing alerts. By default, it is configured to publish alerts to MQTT.
- The `windturbine_anomaly_detector.pkl` model is built using the RandomForestRegressor algorithm. More details on how it is built are available at `edge-ai-suites/manufacturing-ai-suite/wind-turbine-anomaly-detection/training/windturbine/README.md`.
```bash
git clone https://github.com/open-edge-platform/edge-ai-suites.git
cd edge-ai-suites/manufacturing-ai-suite/wind-turbine-anomaly-detection
```

- Update the following fields in `.env`:
  - `INFLUXDB_USERNAME`
  - `INFLUXDB_PASSWORD`
  - `VISUALIZER_GRAFANA_USER`
  - `VISUALIZER_GRAFANA_PASSWORD`
  - `MR_PSQL_PASSWORD`
  - `MR_MINIO_ACCESS_KEY`
  - `MR_MINIO_SECRET_KEY`
- Deploy the sample app using only one of the options below:

  NOTE: The sample app is deployed by pulling the pre-built container images of the sample app from Docker Hub OR from the internal container registry (log in to the Docker registry from the CLI and configure the `DOCKER_REGISTRY` environment variable in the `.env` file at `edge-ai-suites/manufacturing-ai-suite/wind-turbine-anomaly-detection`).

  - Using OPC-UA ingestion:

    ```bash
    make up_opcua_ingestion
    ```

  - Using MQTT ingestion:

    ```bash
    make up_mqtt_ingestion
    ```
Use the following command to verify that all containers are active and error-free.
Note: The command `make status` may show errors in containers like `ia-grafana` when the user has not yet logged in for the first time OR due to a session timeout. Just log in to Grafana again; if things are working functionality-wise, please ignore `user token not found` errors along with other minor errors that may show up in the Grafana logs.

```bash
make status
```
- Get into the InfluxDB* container:

  Note: Use `kubectl exec -it <influxdb-pod-name> -- /bin/bash` for the Helm deployment.

  ```bash
  docker exec -it ia-influxdb bash
  ```

- Run the below commands to see the data in InfluxDB*:

  NOTE: Please ignore the error message `There was an error writing history file: open /.influx_history: read-only file system` in the InfluxDB shell. It does not affect any functionality while working with the InfluxDB commands.

  ```bash
  # For the below command, fetch INFLUXDB_USERNAME and INFLUXDB_PASSWORD from the `.env` file
  # for docker compose deployment and `values.yml` for helm deployment
  influx -username <username> -password <passwd>
  use datain # database access
  show measurements
  # Run the below query to check the measurement output processed
  # by the Time Series Analytics microservice
  select * from wind_turbine_anomaly_data
  ```
- To check the output in Grafana, follow the below steps:

  - Use the link `http://<host_ip>:3000` to launch Grafana from a browser (preferably, the Chrome browser).

    Note: Use the link `http://<host_ip>:30001` to launch Grafana for the Helm deployment.

  - Log in to Grafana with the values set for `VISUALIZER_GRAFANA_USER` and `VISUALIZER_GRAFANA_PASSWORD` in the `.env` file and select the Wind Turbine Dashboard.

  - One will see the below output.
- Bring down the sample app:

  ```bash
  make down
  ```

- Check container logs to catch any failures:

  ```bash
  docker ps
  docker logs -f <container_name>
  docker logs -f <container_name> | grep -i error
  ```
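If you are post-processing saved container logs in Python instead of piping through `grep`, the same case-insensitive error filter is a one-liner (the sample log text here is hypothetical):

```python
def error_lines(log_text):
    """Return log lines containing 'error' in any case, like `grep -i error`."""
    return [line for line in log_text.splitlines() if "error" in line.lower()]

# Hypothetical captured log text:
logs = "INFO started\nERROR connection refused\nWarning: retrying\nerror: timeout"
print(error_lines(logs))  # -> ['ERROR connection refused', 'error: timeout']
```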
- How to Deploy with Helm: Guide for deploying the sample application on a k8s cluster using Helm.
- How to Deploy with Edge Orchestrator: Guide for deploying the sample application using the Edge Manageability Framework.
- How to build from source and deploy: Guide for building from source and deploying with Docker Compose.
- How to configure OPC-UA/MQTT alerts: Guide for configuring the OPC-UA/MQTT alerts in the Time Series Analytics microservice.
- How to configure a custom UDF deployment package: Guide for deploying a customized UDF deployment package (UDFs/models/TICK scripts).



