diff --git a/manufacturing-ai-suite/industrial-edge-insights-multimodal/helm/README.md b/manufacturing-ai-suite/industrial-edge-insights-multimodal/helm/README.md
index 39eea59e49..8246e80ab9 100644
--- a/manufacturing-ai-suite/industrial-edge-insights-multimodal/helm/README.md
+++ b/manufacturing-ai-suite/industrial-edge-insights-multimodal/helm/README.md
@@ -1 +1 @@
-Please refer [link](../docs/user-guide/how-to-guides/how-to-deploy-with-helm.md) for the helm deployment
\ No newline at end of file
+Please refer to [link](../docs/user-guide/get-started/deploy-with-helm.md) for the Helm deployment
\ No newline at end of file
diff --git a/manufacturing-ai-suite/industrial-edge-insights-time-series/docs/user-guide/how-to-guides/write-user-defined-function.md b/manufacturing-ai-suite/industrial-edge-insights-time-series/docs/user-guide/how-to-guides/write-user-defined-function.md
index 9139cb1bb6..e58be71154 100644
--- a/manufacturing-ai-suite/industrial-edge-insights-time-series/docs/user-guide/how-to-guides/write-user-defined-function.md
+++ b/manufacturing-ai-suite/industrial-edge-insights-time-series/docs/user-guide/how-to-guides/write-user-defined-function.md
@@ -268,8 +268,8 @@ var data = stream
 - [Kapacitor UDF Documentation](https://docs.influxdata.com/kapacitor/v1/guides/anomaly_detection/#writing-a-user-defined-function-udf)
 - [Example UDFs in Repository](https://github.com/open-edge-platform/edge-ai-suites/blob/main/manufacturing-ai-suite/industrial-edge-insights-time-series/apps/)
   - [Wind Turbine Anomaly Detection](https://github.com/open-edge-platform/edge-ai-suites/blob/main/manufacturing-ai-suite/industrial-edge-insights-time-series/apps/wind-turbine-anomaly-detection/time-series-analytics-config/udfs/windturbine_anomaly_detector.py)
-  - [Weld Anomaly Detection](https://github.com/open-edge-platform/edge-ai-suites/blob/main/manufacturing-ai-suite/industrial-edge-insights-time-series/apps/wind-turbine-anomaly-detection/time-series-analytics-config/udfs/weld_anomaly_detector.py)
+  - [Weld Anomaly Detection](https://github.com/open-edge-platform/edge-ai-suites/blob/main/manufacturing-ai-suite/industrial-edge-insights-time-series/apps/weld-anomaly-detection/time-series-analytics-config/udfs/weld_anomaly_detector.py)
 - [Kapacitor TICKscript Reference](https://docs.influxdata.com/kapacitor/v1/reference/tick/introduction/)
   - [Wind Turbine Anomaly Detection](https://github.com/open-edge-platform/edge-ai-suites/blob/main/manufacturing-ai-suite/industrial-edge-insights-time-series/apps/wind-turbine-anomaly-detection/time-series-analytics-config/tick_scripts/windturbine_anomaly_detector.tick)
-  - [Weld Anomaly Detection](https://github.com/open-edge-platform/edge-ai-suites/blob/main/manufacturing-ai-suite/industrial-edge-insights-time-series/apps/wind-turbine-anomaly-detection/time-series-analytics-config/tick_scripts/weld_anomaly_detector.tick)
+  - [Weld Anomaly Detection](https://github.com/open-edge-platform/edge-ai-suites/blob/main/manufacturing-ai-suite/industrial-edge-insights-time-series/apps/weld-anomaly-detection/time-series-analytics-config/tick_scripts/weld_anomaly_detector.tick)
 - [Configure Custom UDF Deployment](./configure-custom-udf.md)
diff --git a/manufacturing-ai-suite/industrial-edge-insights-time-series/helm/README.md b/manufacturing-ai-suite/industrial-edge-insights-time-series/helm/README.md
index cc2728e789..0881c944c1 100644
--- a/manufacturing-ai-suite/industrial-edge-insights-time-series/helm/README.md
+++ b/manufacturing-ai-suite/industrial-edge-insights-time-series/helm/README.md
@@ -1,3 +1,3 @@
 # Deploy using Helm Charts
 
-Please refer to [Deploy with helm](https://docs.openedgeplatform.intel.com/2025.2/edge-ai-suites/ai-suite-manufacturing/industrial-edge-insights-time-series/get-started/deploy-with-helm.html).
+Please refer to [Deploy with Helm](https://docs.openedgeplatform.intel.com/dev/edge-ai-suites/ai-suite-manufacturing/industrial-edge-insights-time-series/get-started/deploy-with-helm.html).
diff --git a/manufacturing-ai-suite/industrial-edge-insights-vision/apps/pallet-defect-detection/README_dockerhub.md b/manufacturing-ai-suite/industrial-edge-insights-vision/apps/pallet-defect-detection/README_dockerhub.md
index 767a041786..026b945684 100644
--- a/manufacturing-ai-suite/industrial-edge-insights-vision/apps/pallet-defect-detection/README_dockerhub.md
+++ b/manufacturing-ai-suite/industrial-edge-insights-vision/apps/pallet-defect-detection/README_dockerhub.md
@@ -10,7 +10,7 @@ For more details on deployment, refer to the [documentation](https://docs.opened
 ## Deploy using Kubernetes Charts
 ---
-For more details on deployment, refer to the [documentation](https://docs.openedgeplatform.intel.com/dev/edge-ai-suites/ai-suite-manufacturing/industrial-edge-insights-vision/pallet-defect-detection/how-to-guides/deploy-with-helm.html).
+For more details on deployment, refer to the [documentation](https://docs.openedgeplatform.intel.com/dev/edge-ai-suites/ai-suite-manufacturing/industrial-edge-insights-vision/pallet-defect-detection/get-started/deploy-with-helm.html).
 
 ## Supported versions
diff --git a/manufacturing-ai-suite/industrial-edge-insights-vision/apps/weld-porosity/README_dockerhub.md b/manufacturing-ai-suite/industrial-edge-insights-vision/apps/weld-porosity/README_dockerhub.md
index 843102667a..e2b118b83d 100644
--- a/manufacturing-ai-suite/industrial-edge-insights-vision/apps/weld-porosity/README_dockerhub.md
+++ b/manufacturing-ai-suite/industrial-edge-insights-vision/apps/weld-porosity/README_dockerhub.md
@@ -10,7 +10,7 @@ For more details on deployment, refer to the [documentation](https://docs.opened
 ## Deploy using Kubernetes Charts
 ---
-For more details on deployment, refer to the [documentation](https://docs.openedgeplatform.intel.com/dev/edge-ai-suites/ai-suite-manufacturing/industrial-edge-insights-vision/weld-porosity/how-to-guides/deploy-with-helm.html).
+For more details on deployment, refer to the [documentation](https://docs.openedgeplatform.intel.com/dev/edge-ai-suites/ai-suite-manufacturing/industrial-edge-insights-vision/weld-porosity/get-started/deploy-with-helm.html).
 
 ## Supported versions
diff --git a/manufacturing-ai-suite/industrial-edge-insights-vision/apps/worker-safety-gear-detection/README_dockerhub.md b/manufacturing-ai-suite/industrial-edge-insights-vision/apps/worker-safety-gear-detection/README_dockerhub.md
index bdd767e01d..12dec7f5de 100644
--- a/manufacturing-ai-suite/industrial-edge-insights-vision/apps/worker-safety-gear-detection/README_dockerhub.md
+++ b/manufacturing-ai-suite/industrial-edge-insights-vision/apps/worker-safety-gear-detection/README_dockerhub.md
@@ -10,7 +10,7 @@ For more details on deployment, refer to the [documentation](https://docs.opened
 ## Deploy using Kubernetes Charts
 ---
-For more details on deployment, refer to the [documentation](https://docs.openedgeplatform.intel.com/dev/edge-ai-suites/ai-suite-manufacturing/industrial-edge-insights-vision/worker-safety-gear-detection/how-to-deploy-using-helm-charts.html).
+For more details on deployment, refer to the [documentation](https://docs.openedgeplatform.intel.com/dev/edge-ai-suites/ai-suite-manufacturing/industrial-edge-insights-vision/worker-safety-gear-detection/get-started/deploy-with-helm.html).
 
 ## Supported versions
diff --git a/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/pallet-defect-detection/get-started/environment-variables.md b/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/pallet-defect-detection/get-started/environment-variables.md
index ab74607209..896933ef0c 100644
--- a/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/pallet-defect-detection/get-started/environment-variables.md
+++ b/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/pallet-defect-detection/get-started/environment-variables.md
@@ -9,4 +9,4 @@ This reference application's configuration has the following environment variabl
 
 In addition to the ones above, the application also uses environment variables of following Microservice:
 
-- [DL Streamer Pipeline Server](https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dlstreamer-pipeline-server/environment-variables.html)
+- [DL Streamer Pipeline Server](https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dlstreamer-pipeline-server/get-started/environment-variables.html)
diff --git a/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/pallet-defect-detection/how-to-guides/enable-mlops.md b/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/pallet-defect-detection/how-to-guides/enable-mlops.md
index de2b7ebd5b..f82411b777 100644
--- a/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/pallet-defect-detection/how-to-guides/enable-mlops.md
+++ b/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/pallet-defect-detection/how-to-guides/enable-mlops.md
@@ -95,9 +95,9 @@ With this feature, during runtime, you can download a new model using the micros
    ![WebRTC streaming](../_assets/webrtc-streaming.png)
 
-   ### Downloading model with Model Download 
+   ### Downloading model with Model Download
 
-   At this point, user would like to restart the pipeline with a newer model. The new model can bea retrained version of the existing model or a different model altogether. We use [Model Download](https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/model-download/docs/user-guide/Overview.md) microservice to help download the model. It supports downloading public models as well as geti models from a running Geti server. To learn more about it, see [here](https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/model-download/docs/user-guide/get-started.md).
+   At this point, the user may want to restart the pipeline with a newer model. The new model can be a retrained version of the existing model or a different model altogether. We use the [Model Download](https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/model-download/docs/user-guide/index.md) microservice to help download the model. It supports downloading public models as well as Geti models from a running Geti server. To learn more about it, see [here](https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/model-download/docs/user-guide/get-started.md).
 
   For our demonstration, we will assume the pallet defect detection model has been retrained and is available for downloaded from a Geti server using the Model Download service. Also, the downloaded location is accessible by the dlstreamer pipeline server. In our example, it is `/tmp/tmp-models`. The `/tmp`dir is already accessible by the sample application. If not, please add it to the `volumes` section of `dlstreamer-pipeline-server service in docker-compose file.
@@ -107,7 +107,7 @@ With this feature, during runtime, you can download a new model using the micros
    curl -k --location -X DELETE https://<HOST_IP>/api/pipelines/{instance_id}
    ```
 10. Start a new pipeline with this new model. Before that modify the payload.json to use this new model in `apps/pallet-defect-detection/payload.json`. Notice the model path in the payload has changed to the new model.
-    
+
    ```json
    [
        {
diff --git a/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/pallet-defect-detection/how-to-guides/manage-pipelines.md b/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/pallet-defect-detection/how-to-guides/manage-pipelines.md
index 1b98f33574..32402fce6c 100644
--- a/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/pallet-defect-detection/how-to-guides/manage-pipelines.md
+++ b/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/pallet-defect-detection/how-to-guides/manage-pipelines.md
@@ -33,8 +33,8 @@ The following is an example of the pallet defect detection pipeline, which is in
 Customize the pipeline according to your needs.
 For details, see the following DL Streamer Pipeline Server documentation:
 
-- [Launch configurable pipelines](https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dlstreamer-pipeline-server/how-to-launch-configurable-pipelines.html)
-- [Autostart pipelines](https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dlstreamer-pipeline-server/how-to-autostart-pipelines.html)
+- [Launch configurable pipelines](https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dlstreamer-pipeline-server/how-to-guides/launch-configurable-pipelines.html)
+- [Autostart pipelines](https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dlstreamer-pipeline-server/how-to-guides/autostart-pipelines.html)
 
 ## Start the Pipeline
 >Note: If you're running multiple instances of app, ensure to provide `NGINX_HTTPS_PORT` number in the url for the app instance i.e. replace `<HOST_IP>` with `<HOST_IP>:<NGINX_HTTPS_PORT>`
diff --git a/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/pcb-anomaly-detection/get-started/environment-variables.md b/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/pcb-anomaly-detection/get-started/environment-variables.md
index ab74607209..896933ef0c 100644
--- a/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/pcb-anomaly-detection/get-started/environment-variables.md
+++ b/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/pcb-anomaly-detection/get-started/environment-variables.md
@@ -9,4 +9,4 @@ This reference application's configuration has the following environment variabl
 
 In addition to the ones above, the application also uses environment variables of following Microservice:
 
-- [DL Streamer Pipeline Server](https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dlstreamer-pipeline-server/environment-variables.html)
+- [DL Streamer Pipeline Server](https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dlstreamer-pipeline-server/get-started/environment-variables.html)
diff --git a/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/pcb-anomaly-detection/how-to-guides/enable-mlops.md b/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/pcb-anomaly-detection/how-to-guides/enable-mlops.md
index 8afb690728..f93cbd5789 100644
--- a/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/pcb-anomaly-detection/how-to-guides/enable-mlops.md
+++ b/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/pcb-anomaly-detection/how-to-guides/enable-mlops.md
@@ -95,9 +95,9 @@ With this feature, during runtime, you can download a new model using the micros
    ![WebRTC streaming](../_assets/webrtc-streaming.png)
 
-   ### Downloading model with Model Download 
+   ### Downloading model with Model Download
 
-   At this point, user would like to restart the pipeline with a newer model. The new model can bea retrained version of the existing model or a different model altogether. We use [Model Download](https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/model-download/docs/user-guide/Overview.md) microservice to help download the model. It supports downloading public models as well as geti models from a running Geti server. To learn more about it, see [here](https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/model-download/docs/user-guide/get-started.md).
+   At this point, the user may want to restart the pipeline with a newer model. The new model can be a retrained version of the existing model or a different model altogether. We use the [Model Download](https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/model-download/docs/user-guide/index.md) microservice to help download the model. It supports downloading public models as well as Geti models from a running Geti server. To learn more about it, see [here](https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/model-download/docs/user-guide/get-started.md).
 
  For our demonstration, we will assume the pcb anomaly model has been retrained and is available for downloaded from a Geti server using the Model Download service. Also, the downloaded location is accessible by the dlstreamer pipeline server. In our example, it is `/tmp/tmp-models`. The `/tmp`dir is already accessible by the sample application. If not, please add it to the `volumes` section of `dlstreamer-pipeline-server service in docker-compose file.
@@ -107,7 +107,7 @@ With this feature, during runtime, you can download a new model using the micros
    curl -k --location -X DELETE https://<HOST_IP>/api/pipelines/{instance_id}
    ```
 10. Start a new pipeline with this new model. Before that modify the payload.json to use this new model in `apps/pcb-anomaly-detection/payload.json`. Notice the model path in the payload has changed to the new model.
-    
+
    ```json
    [
        {
diff --git a/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/pcb-anomaly-detection/how-to-guides/manage-pipelines.md b/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/pcb-anomaly-detection/how-to-guides/manage-pipelines.md
index 8f5492caca..f7856b411f 100644
--- a/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/pcb-anomaly-detection/how-to-guides/manage-pipelines.md
+++ b/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/pcb-anomaly-detection/how-to-guides/manage-pipelines.md
@@ -33,8 +33,8 @@ The following is an example of the PCB anomaly detection pipeline, which is incl
 Customize the pipeline according to your needs.
 For details, see the following DL Streamer Pipeline Server documentation:
 
-- [Launch configurable pipelines](https://docs.openedgeplatform.intel.com/edge-ai-libraries/dlstreamer-pipeline-server/main/user-guide/how-to-launch-configurable-pipelines.html)
-- [Autostart pipelines](https://docs.openedgeplatform.intel.com/edge-ai-libraries/dlstreamer-pipeline-server/main/user-guide/how-to-autostart-pipelines.html)
+- [Launch configurable pipelines](https://docs.openedgeplatform.intel.com/edge-ai-libraries/dlstreamer-pipeline-server/main/user-guide/how-to-guides/launch-configurable-pipelines.html)
+- [Autostart pipelines](https://docs.openedgeplatform.intel.com/edge-ai-libraries/dlstreamer-pipeline-server/main/user-guide/how-to-guides/autostart-pipelines.html)
 
 ## Start the Pipeline
diff --git a/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/weld-porosity/get-started/environment-variables.md b/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/weld-porosity/get-started/environment-variables.md
index ab74607209..896933ef0c 100644
--- a/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/weld-porosity/get-started/environment-variables.md
+++ b/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/weld-porosity/get-started/environment-variables.md
@@ -9,4 +9,4 @@ This reference application's configuration has the following environment variabl
 
 In addition to the ones above, the application also uses environment variables of following Microservice:
 
-- [DL Streamer Pipeline Server](https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dlstreamer-pipeline-server/environment-variables.html)
+- [DL Streamer Pipeline Server](https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dlstreamer-pipeline-server/get-started/environment-variables.html)
diff --git a/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/weld-porosity/how-to-guides/enable-mlops.md b/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/weld-porosity/how-to-guides/enable-mlops.md
index 8b9933024c..1443185cf4 100644
--- a/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/weld-porosity/how-to-guides/enable-mlops.md
+++ b/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/weld-porosity/how-to-guides/enable-mlops.md
@@ -90,9 +90,9 @@ With this feature, during runtime, you can download a new model using the micros
    ![WebRTC streaming](../_assets/webrtc-streaming.png)
 
-   ### Downloading model with Model Download 
+   ### Downloading model with Model Download
 
-   At this point, user would like to restart the pipeline with a newer model. The new model can bea retrained version of the existing model or a different model altogether. We use [Model Download](https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/model-download/docs/user-guide/Overview.md) microservice to help download the model. It supports downloading public models as well as geti models from a running Geti server. To learn more about it, see [here](https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/model-download/docs/user-guide/get-started.md).
+   At this point, the user may want to restart the pipeline with a newer model. The new model can be a retrained version of the existing model or a different model altogether. We use the [Model Download](https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/model-download/docs/user-guide/index.md) microservice to help download the model. It supports downloading public models as well as Geti models from a running Geti server. To learn more about it, see [here](https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/model-download/docs/user-guide/get-started.md).
 
  For our demonstration, we will assume the weld porosity model has been retrained and is available for downloaded from a Geti server using the Model Download service. Also, the downloaded location is accessible by the dlstreamer pipeline server. In our example, it is `/tmp/tmp-models`. The `/tmp`dir is already accessible by the sample application. If not, please add it to the `volumes` section of `dlstreamer-pipeline-server service in docker-compose file.
@@ -102,7 +102,7 @@ With this feature, during runtime, you can download a new model using the micros
    curl -k --location -X DELETE https://<HOST_IP>/api/pipelines/{instance_id}
    ```
 10. Start a new pipeline with this new model. Before that modify the payload.json to use this new model in `apps/weld-porosity/payload.json`. Notice the model path in the payload has changed to the new model.
-    
+
    ```json
    [
        {
diff --git a/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/weld-porosity/how-to-guides/manage-pipelines.md b/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/weld-porosity/how-to-guides/manage-pipelines.md
index 2ebf5735e4..a253a27519 100644
--- a/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/weld-porosity/how-to-guides/manage-pipelines.md
+++ b/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/weld-porosity/how-to-guides/manage-pipelines.md
@@ -34,8 +34,8 @@ The following is an example of the weld porosity classification pipeline, which
 Customize the pipeline according to your needs.
 For details, see the following DL Streamer Pipeline Server documentation:
 
-- [Launch configurable pipelines](https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dlstreamer-pipeline-server/how-to-launch-configurable-pipelines.html)
-- [Autostart pipelines](https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dlstreamer-pipeline-server/how-to-autostart-pipelines.html)
+- [Launch configurable pipelines](https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dlstreamer-pipeline-server/how-to-guides/launch-configurable-pipelines.html)
+- [Autostart pipelines](https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dlstreamer-pipeline-server/how-to-guides/autostart-pipelines.html)
 
 ## Start the Pipeline
diff --git a/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/worker-safety-gear-detection/get-started/environment-variables.md b/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/worker-safety-gear-detection/get-started/environment-variables.md
index ebcad4ac0b..4be530eb1f 100644
--- a/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/worker-safety-gear-detection/get-started/environment-variables.md
+++ b/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/worker-safety-gear-detection/get-started/environment-variables.md
@@ -9,5 +9,5 @@ This reference application's configuration has the following environment variabl
 
 In addition to the ones above, the application also uses environment variables of following Microservice:
 
-- [DL Streamer Pipeline Server](https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dlstreamer-pipeline-server/environment-variables.html)
+- [DL Streamer Pipeline Server](https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dlstreamer-pipeline-server/get-started/environment-variables.html)
diff --git a/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/worker-safety-gear-detection/how-to-guides/enable-mlops.md b/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/worker-safety-gear-detection/how-to-guides/enable-mlops.md
index 39b7fae398..ee591acf98 100644
--- a/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/worker-safety-gear-detection/how-to-guides/enable-mlops.md
+++ b/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/worker-safety-gear-detection/how-to-guides/enable-mlops.md
@@ -82,9 +82,9 @@ With this feature, during runtime, you can download a new model using the micros
    ![WebRTC streaming](../_assets/webrtc-streaming.png)
 
-   ### Downloading model with Model Download 
+   ### Downloading model with Model Download
 
-   At this point, user would like to restart the pipeline with a newer model. The new model can bea retrained version of the existing model or a different model altogether. We use [Model Download](https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/model-download/docs/user-guide/Overview.md) microservice to help download the model. It supports downloading public models as well as geti models from a running Geti server. To learn more about it, see [here](https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/model-download/docs/user-guide/get-started.md).
+   At this point, the user may want to restart the pipeline with a newer model. The new model can be a retrained version of the existing model or a different model altogether. We use the [Model Download](https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/model-download/docs/user-guide/index.md) microservice to help download the model. It supports downloading public models as well as Geti models from a running Geti server. To learn more about it, see [here](https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/model-download/docs/user-guide/get-started.md).
 
  For our demonstration, we will assume the worker safety gear detection model has been retrained and is available for downloaded from a Geti server using the Model Download service. Also, the downloaded location is accessible by the dlstreamer pipeline server. In our example, it is `/tmp/tmp-models`. The `/tmp`dir is already accessible by the sample application. If not, please add it to the `volumes` section of `dlstreamer-pipeline-server service in docker-compose file.
@@ -94,7 +94,7 @@ With this feature, during runtime, you can download a new model using the micros
    curl -k --location -X DELETE https://<HOST_IP>/api/pipelines/{instance_id}
    ```
 10. Start a new pipeline with this new model. Before that modify the payload.json to use this new model in `apps/worker-safety-gear-detection/payload.json`. Notice the model path in the payload has changed to the new model.
-    
+
    ```json
    [
        {
diff --git a/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/worker-safety-gear-detection/how-to-guides/manage-pipelines.md b/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/worker-safety-gear-detection/how-to-guides/manage-pipelines.md
index 8f2ed4f3ff..c27f797224 100644
--- a/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/worker-safety-gear-detection/how-to-guides/manage-pipelines.md
+++ b/manufacturing-ai-suite/industrial-edge-insights-vision/docs/user-guide/worker-safety-gear-detection/how-to-guides/manage-pipelines.md
@@ -33,8 +33,8 @@ The following is an example of the Worker Safety Gear Detection pipeline, which
 Customize the pipeline according to your needs.
 For details, see the following DL Streamer Pipeline Server documentation:
 
-- [Launch configurable pipelines](https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dlstreamer-pipeline-server/how-to-launch-configurable-pipelines.html)
-- [Autostart pipelines](https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dlstreamer-pipeline-server/how-to-autostart-pipelines.html)
+- [Launch configurable pipelines](https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dlstreamer-pipeline-server/how-to-guides/launch-configurable-pipelines.html)
+- [Autostart pipelines](https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/dlstreamer-pipeline-server/how-to-guides/autostart-pipelines.html)
 
 ## Start the Pipeline
diff --git a/manufacturing-ai-suite/industrial-edge-insights-vision/helm/apps/pallet-defect-detection/README.md b/manufacturing-ai-suite/industrial-edge-insights-vision/helm/apps/pallet-defect-detection/README.md
index e4b5235263..bcbd099d8b 100644
--- a/manufacturing-ai-suite/industrial-edge-insights-vision/helm/apps/pallet-defect-detection/README.md
+++ b/manufacturing-ai-suite/industrial-edge-insights-vision/helm/apps/pallet-defect-detection/README.md
@@ -2,7 +2,7 @@
 
 ## Prerequisites
 
-- [System Requirements](../get-started/system-requirements.md)
+- [System Requirements](../../../docs/user-guide/pallet-defect-detection/get-started/system-requirements.md)
 - K8s installation on single or multi node must be done as pre-requisite to continue the following deployment.
   Note: The kubernetes cluster is set up with `kubeadm`, `kubectl` and `kubelet` packages on single and multi nodes with `v1.30.2`. Refer to tutorials online to setup kubernetes cluster on the web with host OS as ubuntu 22.04 and/or ubuntu 24.04.
 - For helm installation, refer to [helm website](https://helm.sh/docs/intro/install/)
diff --git a/manufacturing-ai-suite/industrial-edge-insights-vision/helm/apps/pcb-anomaly-detection/README.md b/manufacturing-ai-suite/industrial-edge-insights-vision/helm/apps/pcb-anomaly-detection/README.md
index 04ea04f02c..41339f7416 100644
--- a/manufacturing-ai-suite/industrial-edge-insights-vision/helm/apps/pcb-anomaly-detection/README.md
+++ b/manufacturing-ai-suite/industrial-edge-insights-vision/helm/apps/pcb-anomaly-detection/README.md
@@ -2,7 +2,7 @@
 
 ## Prerequisites
 
-- [System Requirements](./system-requirements.md)
+- [System Requirements](../../../docs/user-guide/pcb-anomaly-detection/get-started/system-requirements.md)
 - K8s installation on single or multi node must be done as pre-requisite to continue the following deployment.
   Note: The kubernetes cluster is set up with `kubeadm`, `kubectl` and `kubelet` packages on single and multi nodes with `v1.30.2`. Refer to tutorials online to setup kubernetes cluster on the web with host OS as ubuntu 22.04 and/or ubuntu 24.04.
 - For helm installation, refer to [helm website](https://helm.sh/docs/intro/install/)
diff --git a/manufacturing-ai-suite/industrial-edge-insights-vision/helm/apps/weld-porosity/README.md b/manufacturing-ai-suite/industrial-edge-insights-vision/helm/apps/weld-porosity/README.md
index 9b771f88de..f6c4831a62 100644
--- a/manufacturing-ai-suite/industrial-edge-insights-vision/helm/apps/weld-porosity/README.md
+++ b/manufacturing-ai-suite/industrial-edge-insights-vision/helm/apps/weld-porosity/README.md
@@ -2,7 +2,7 @@
 
 ## Prerequisites
 
-- [System Requirements](../get-started/system-requirements.md)
+- [System Requirements](../../../docs/user-guide/weld-porosity/get-started/system-requirements.md)
 - K8s installation on single or multi node must be done as pre-requisite to continue the following deployment.
Note: The kubernetes cluster is set up with `kubeadm`, `kubectl` and `kubelet` packages on single and multi nodes with `v1.30.2`. Refer to tutorials online to setup kubernetes cluster on the web with host OS as ubuntu 22.04 and/or ubuntu 24.04. - For helm installation, refer to [helm website](https://helm.sh/docs/intro/install/) diff --git a/manufacturing-ai-suite/industrial-edge-insights-vision/helm/apps/worker-safety-gear-detection/README.md b/manufacturing-ai-suite/industrial-edge-insights-vision/helm/apps/worker-safety-gear-detection/README.md index 2c875a77fb..4f6f23cb8c 100644 --- a/manufacturing-ai-suite/industrial-edge-insights-vision/helm/apps/worker-safety-gear-detection/README.md +++ b/manufacturing-ai-suite/industrial-edge-insights-vision/helm/apps/worker-safety-gear-detection/README.md @@ -2,7 +2,7 @@ ## Prerequisites -- [System Requirements](../get-started/system-requirements.md) +- [System Requirements](../../../docs/user-guide/worker-safety-gear-detection/get-started/system-requirements.md) - K8s installation on single or multi node must be done as pre-requisite to continue the following deployment. Note: The kubernetes cluster is set up with `kubeadm`, `kubectl` and `kubelet` packages on single and multi nodes with `v1.30.2`. Refer to tutorials online to setup kubernetes cluster on the web with host OS as ubuntu 22.04 and/or ubuntu 24.04. 
 - For helm installation, refer to [helm website](https://helm.sh/docs/intro/install/)
diff --git a/metro-ai-suite/metro-sdk-manager/docs/user-guide/metro-gen-ai-sdk/get-started.md b/metro-ai-suite/metro-sdk-manager/docs/user-guide/metro-gen-ai-sdk/get-started.md
index cd7239a210..b0b68f57a6 100644
--- a/metro-ai-suite/metro-sdk-manager/docs/user-guide/metro-gen-ai-sdk/get-started.md
+++ b/metro-ai-suite/metro-sdk-manager/docs/user-guide/metro-gen-ai-sdk/get-started.md
@@ -91,7 +91,7 @@ http://localhost:8101
 - [Chat Q&A](https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/chat-question-and-answer/index.html)
 - [Audio Analyzer](https://docs.openedgeplatform.intel.com/dev/edge-ai-libraries/audio-analyzer/index.html)
   \- Comprehensive documentation for multimodal audio processing capabilities
-- [Document Ingestion - pgvector](https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/document-ingestion/pgvector/docs/get-started.md)
+- [Document Ingestion - pgvector](https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/document-ingestion/pgvector/docs/user-guide/get-started.md)
   \- Vector database integration and document processing workflows
 - [Multimodal Embedding Serving](https://github.com/open-edge-platform/edge-ai-libraries/blob/main/microservices/multimodal-embedding-serving/docs/user-guide/Overview.md)
   \- Embedding generation service architecture and API documentation
diff --git a/metro-ai-suite/metro-sdk-manager/docs/user-guide/visual-ai-demo-kit/tutorial-3.md b/metro-ai-suite/metro-sdk-manager/docs/user-guide/visual-ai-demo-kit/tutorial-3.md
index ee355bd48c..7533ac2c2f 100644
--- a/metro-ai-suite/metro-sdk-manager/docs/user-guide/visual-ai-demo-kit/tutorial-3.md
+++ b/metro-ai-suite/metro-sdk-manager/docs/user-guide/visual-ai-demo-kit/tutorial-3.md
@@ -119,6 +119,6 @@ After completing this tutorial, you should have:

 ## Supporting Resources

-- [Grafana HTML Panel Documentation](https://grafana.com/docs/grafana/latest/panels/visualizations/text/)
+- [Grafana HTML Panel Documentation](https://grafana.com/docs/grafana/latest/visualizations/panels-visualizations/visualizations/)
 - [MQTT Data Source Configuration](https://grafana.com/docs/grafana/latest/datasources/)
 - [Dashboard Best Practices](https://grafana.com/docs/grafana/latest/best-practices/)
diff --git a/metro-ai-suite/smart-nvr/docs/user-guide/get-started/system-requirements.md b/metro-ai-suite/smart-nvr/docs/user-guide/get-started/system-requirements.md
index ee3c366350..154a3ee8eb 100644
--- a/metro-ai-suite/smart-nvr/docs/user-guide/get-started/system-requirements.md
+++ b/metro-ai-suite/smart-nvr/docs/user-guide/get-started/system-requirements.md
@@ -9,7 +9,7 @@ The base requirements for Smart NVR is dependent on the respective video analyti
 - Frigate NVR: Refer to Frigate documentation, specifically the section that maps to [OpenVINO](https://docs.frigate.video/frigate/hardware#openvino).
 - Video Search and Summary: Refer to the
-  [system requirements](https://github.com/open-edge-platform/edge-ai-libraries/blob/main/sample-applications/video-search-and-summarization/docs/user-guide/system-requirements.md)
+  [system requirements](https://github.com/open-edge-platform/edge-ai-libraries/blob/main/sample-applications/video-search-and-summarization/docs/user-guide/get-started/system-requirements.md)
   page of the sample application.

 ## Compatibility Notes
diff --git a/robotics-ai-suite/docs/robotics/dev_guide/tutorials_amr/developer_kit/clearpath-jackal-robot.rst b/robotics-ai-suite/docs/robotics/dev_guide/tutorials_amr/developer_kit/clearpath-jackal-robot.rst
index 9769fa3835..02345f86b9 100644
--- a/robotics-ai-suite/docs/robotics/dev_guide/tutorials_amr/developer_kit/clearpath-jackal-robot.rst
+++ b/robotics-ai-suite/docs/robotics/dev_guide/tutorials_amr/developer_kit/clearpath-jackal-robot.rst
@@ -6,7 +6,7 @@ developed and distributed by Clearpath Robotics, a Rockwell Automation company.
 Detailed information about this robot is provided by Clearpath Robotics:

 * `Jackal Unmanned Ground Vehicle `_ product page
-* `Jackal User Manual `_
+* `Jackal User Manual `_

 The following pages describe how the Autonomous Mobile Robot can be used with a Clearpath Robotics Jackal robot.
diff --git a/robotics-ai-suite/docs/robotics/dev_guide/tutorials_amr/navigation/adbscan/adbscan_aaeon_robot.md b/robotics-ai-suite/docs/robotics/dev_guide/tutorials_amr/navigation/adbscan/adbscan_aaeon_robot.md
index b0200b4fd5..11024d7f19 100644
--- a/robotics-ai-suite/docs/robotics/dev_guide/tutorials_amr/navigation/adbscan/adbscan_aaeon_robot.md
+++ b/robotics-ai-suite/docs/robotics/dev_guide/tutorials_amr/navigation/adbscan/adbscan_aaeon_robot.md
@@ -1,7 +1,7 @@
 # ADBSCAN on AAEON Robot Kit

 This tutorial describes how to run the ADBSCAN algorithm on the real robot
-[UP Xtreme i11 AAEON Robot Kit](https://up-shop.org/up-xtreme-i11-robotic-kit.html) using the Intel® RealSense™ camera input.
+[AAEON UP Xtreme i11 Robotic Kit](https://up-board.org/up-ai-dev-kit/) using the Intel® RealSense™ camera input.
 During the execution of the program the ADBSCAN algorithm detects objects, and draws them in rviz.
 Then, the FastMapping algorithm uses data from the ADBSCAN to generate a 2D Map of the environment around.
 User can use the default setup to move robot via gamepad or keyboard, so the 3D-camera on the robot can scan surroundings around.
@@ -198,4 +198,4 @@ User can use the default setup to move robot via gamepad or keyboard, so the 3D-

 ## Troubleshooting

-For general robot issues, refer to [Troubleshooting](../robot-tutorials-troubleshooting.md).
+For general robot issues, refer to [Troubleshooting](../../robot-tutorials-troubleshooting.md).
diff --git a/robotics-ai-suite/docs/robotics/dev_guide/tutorials_amr/robot-tutorials-troubleshooting.md b/robotics-ai-suite/docs/robotics/dev_guide/tutorials_amr/robot-tutorials-troubleshooting.md
index 394d9a148d..26862ab4a5 100644
--- a/robotics-ai-suite/docs/robotics/dev_guide/tutorials_amr/robot-tutorials-troubleshooting.md
+++ b/robotics-ai-suite/docs/robotics/dev_guide/tutorials_amr/robot-tutorials-troubleshooting.md
@@ -20,7 +20,7 @@ export ROS_DOMAIN_ID=
 The `ROS_DOMAIN_ID` should be an integer between 0 and 101 and it should be
 the same for all the nodes launched for a particular use case.
 If you run only one use case at a time, you can set this variable in your `.bashrc` file,
-as described in the [prepare-ros-environment](../../gsg_robot/prepare-system.md#prepare-your-ros-2-environment)
+as described in the [prepare-ros-environment](../../../robotics/gsg_robot/index.md#21-prepare-your-ros-2-environment)
 section.

 ## Troubleshooting AAEON Motor Control Board Issues
@@ -108,7 +108,7 @@ To resolve this issue, set the display scale mode to 100%:
 > in the same network and have the same ROS_DOMAIN_ID set.

 To prepare the development system follow the instructions to
-[Prepare the Target System](../../gsg_robot/prepare-system.md).
+[Prepare the Target System](../../../../robot-vision-control/docs/source/getstarted/prepare_system.md).

 **Jazzy**

@@ -183,7 +183,7 @@ kernel mode driver in Linux Kernel 6.7.5 or later.
 For Intel® Core™ Ultra Processors, the recommended operating system for the
 Autonomous Mobile Robot is the
 [Ubuntu OS version 22.04 LTS (Jammy Jellyfish)](https://releases.ubuntu.com/22.04)
 Desktop image, as described in
-[Prepare the Target System](../../gsg_robot/prepare-system.md).
+[Prepare the Target System](../../../../robot-vision-control/docs/source/getstarted/prepare_system.md).
 Since this version of the Canonical Ubuntu operating system uses a Linux Kernel 6.8,
 this incompatibility will have an impact if you use the Autonomous Mobile Robot
 on an Intel® Core™ Ultra Processor.
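The troubleshooting hunk above notes that `ROS_DOMAIN_ID` must be an integer between 0 and 101 and identical for every node of a use case. A minimal sketch of that setup follows; the value `42` is an arbitrary example, not taken from the patch:

```shell
# Pick any integer in 0..101; every node of one use case must use the same value.
export ROS_DOMAIN_ID=42
echo "ROS_DOMAIN_ID=${ROS_DOMAIN_ID}"

# To persist it across shells (as the guide suggests), append it to ~/.bashrc:
# echo 'export ROS_DOMAIN_ID=42' >> ~/.bashrc
```

All ROS 2 nodes launched from shells sharing this environment variable will then discover each other on the same DDS domain.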