This PR updates the documentation to include DLStreamer Pipeline Server references and modernizes the multimodal weld defect detection system requirements. The changes reflect a shift in the architecture to explicitly document the DLStreamer component's role in video processing and add detailed pipeline configuration documentation.
Signed-off-by: Vellaisamy, Sathyendran <sathyendran.vellaisamy@intel.com>
**File:** `manufacturing-ai-suite/industrial-edge-insights-multimodal/docs/user-guide/weld-defect-detection/get-started.md` (+38 −3)
@@ -57,21 +57,56 @@ RTSP stream and csv data over mqtt using simulator and publishing the anomaly re
The simulator uses the `edge-ai-suites/manufacturing-ai-suite/industrial-edge-insights-multimodal/weld-data-simulator/simulation-data/` directory, which is a normalized version of the open-source welding dataset from <https://huggingface.co/datasets/amr-lopezjos/Intel_Robotic_Welding_Multimodal_Dataset>.
The simulator reads `.avi` video files from the dataset and streams them over RTSP for vision data using the **mediamtx** server. This enables real-time video ingestion, simulating camera feeds for weld defect detection. The **dlstreamer-pipeline-server** connects to the RTSP stream and processes the video frames using a Geti model for automated defect analysis.
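As a rough illustration of what the simulator's RTSP leg amounts to, a dataset clip could be published to mediamtx with a command along these lines. This is a sketch only: `sample_weld.avi` is a placeholder file name, and the real simulator drives this internally; the `live.stream` path matches the pipeline configuration documented further below.

```shell
# Illustrative only: loop one dataset clip and publish it to mediamtx over RTSP.
# "sample_weld.avi" is a placeholder; mediamtx is assumed to listen on its default port 8554.
ffmpeg -re -stream_loop -1 -i sample_weld.avi \
  -c:v libx264 -preset ultrafast -tune zerolatency \
  -f rtsp rtsp://mediamtx:8554/live.stream
```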
Time-series data is ingested into **Telegraf** over the **MQTT** protocol by the **weld-data-simulator**.
Vision data is ingested into **dlstreamer-pipeline-server** over the **RTSP** protocol by the **weld-data-simulator**.
### **Data Ingestion**
- **Telegraf** through its input plugins (**MQTT**) gathers the data and sends this input data to both **InfluxDB** and **Time Series Analytics Microservice**.
- **dlstreamer-pipeline-server** gathers the data through RTSP Stream using **mediamxt** as the **RTSP Server**.
+ Vision Data: **dlstreamer-pipeline-server** gathers the data through the RTSP stream using **mediamtx** as the **RTSP server**.
+ Time-series Data: **Telegraf**, through its input plugin (**MQTT**), gathers the data and sends it to both **InfluxDB** and the **Time Series Analytics Microservice**.
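The Telegraf leg described above can be sketched as a minimal `telegraf.conf` fragment. The broker address, topic name, and database name below are illustrative assumptions, not values quoted from this PR; only the plugin names (`mqtt_consumer`, `influxdb`) are standard Telegraf identifiers.

```toml
# Hypothetical mqtt_consumer input: subscribes to the simulator's weld telemetry.
[[inputs.mqtt_consumer]]
  servers = ["tcp://mqtt-broker:1883"]   # assumed broker address
  topics = ["weld/sensor-data"]          # assumed topic name
  data_format = "json"

# Forward the gathered points to InfluxDB.
[[outputs.influxdb]]
  urls = ["http://influxdb:8086"]        # assumed InfluxDB endpoint
  database = "datain"                    # assumed database name
```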
### **Data Storage**
**InfluxDB** stores the incoming data from **Telegraf**, the **Time Series Analytics Microservice**, and **Fusion Analytics**.
### **Data Processing**
**DL Streamer Pipeline Server** sends the images with overlaid bounding boxes over the WebRTC protocol to a WebRTC browser client. This is done via the MediaMTX server, which is used for signaling. A Coturn server facilitates NAT traversal, ensuring that the WebRTC stream is accessible on a non-native browser client, and helps in cases where a firewall is enabled.
#### **`DL Streamer Pipeline Server config.json`**

| Parameter | Description | Example |
|---|---|---|
|`name`| The name of the pipeline configuration. |`"weld_defect_classification"`|
|`source`| The source type for video ingestion. |`"gstreamer"`|
|`queue_maxsize`| Maximum size of the queue for processing frames. |`50`|
|`pipeline`| GStreamer pipeline string defining the video processing flow from RTSP source through classification to output. |`"rtspsrc location=\"rtsp://mediamtx:8554/live.stream\" latency=100 name=source ! rtph264depay ! h264parse ! decodebin ! videoconvert ! gvaclassify inference-region=full-frame name=classification ! gvametaconvert add-empty-results=true name=metaconvert ! queue ! gvafpscounter ! appsink name=destination"`|
|`parameters`| Configuration parameters for pipeline elements, specifically for the classification element properties. | See below for nested structure |
|`destination`| Configuration for output destinations of the pipeline. | Object containing metadata and frame settings |
|`metadata.type`| The protocol type for sending metadata information. |`"mqtt"`|
|`metadata.topic`| The MQTT topic where vision classification results are published. |`"vision_weld_defect_classification"`|
|`frame.type`| The protocol type for streaming video frames. |`"webrtc"`|
|`frame.peer-id`| Unique identifier for the WebRTC peer connection. |`"samplestream"`|

---
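Taken together, the parameters above describe a nested document that might look like the following sketch. The outer `config`/`pipelines` wrapper and the `auto_start` flag are assumptions based on the typical DL Streamer Pipeline Server layout, and the `parameters` contents (elided in this PR) are left empty here; only the values shown in the table are quoted from the change.

```json
{
  "config": {
    "pipelines": [
      {
        "name": "weld_defect_classification",
        "source": "gstreamer",
        "queue_maxsize": 50,
        "pipeline": "rtspsrc location=\"rtsp://mediamtx:8554/live.stream\" latency=100 name=source ! rtph264depay ! h264parse ! decodebin ! videoconvert ! gvaclassify inference-region=full-frame name=classification ! gvametaconvert add-empty-results=true name=metaconvert ! queue ! gvafpscounter ! appsink name=destination",
        "parameters": {},
        "destination": {
          "metadata": { "type": "mqtt", "topic": "vision_weld_defect_classification" },
          "frame": { "type": "webrtc", "peer-id": "samplestream" }
        },
        "auto_start": false
      }
    ]
  }
}
```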
**Time Series Analytics Microservice** uses the User Defined Function (UDF) deployment package (TICK scripts, UDFs, models), which is already built into the container image. The UDF deployment package is available
at `edge-ai-suites/manufacturing-ai-suite/industrial-edge-insights-multimodal/config/time-series-analytics-microservice`. Directory details are as below:
@@ -240,5 +275,5 @@ Use the following command to verify that all containers are active and error-fre
## Advanced setup
- [How to build from source and deploy](./how-to-build-from-source.md): Guide to build from source and docker compose deployment
- - [How to configure OPC-UA/MQTT alerts](./how-to-configure-alerts.md): Guide for configuring the OPC-UA/MQTT alerts in the Time Series Analytics microservice
+ - [How to configure MQTT alerts](./how-to-configure-alerts.md): Guide for configuring the MQTT alerts in the Time Series Analytics microservice
- [How to configure custom UDF deployment package](./how-to-configure-custom-udf.md): Guide for deploying a customized UDF deployment package (udfs/models/tick scripts)
**File:** `manufacturing-ai-suite/industrial-edge-insights-multimodal/docs/user-guide/weld-defect-detection/how-to-build-from-source.md` (+13 −3)
@@ -7,7 +7,17 @@ before proceeding with the below steps.
## Steps to Build from Source
- 1. **Clone the source and build the `Time Series Analytics` microservice**:
+ 1. **Clone the source and build the `DLStreamer Pipeline Server` microservice**:
**File:** `manufacturing-ai-suite/industrial-edge-insights-multimodal/docs/user-guide/weld-defect-detection/system-requirements.md` (+1 −1)
@@ -6,7 +6,7 @@ This page provides detailed hardware, software, and platform requirements to hel