
Commit 6d505a4

update docs

1 parent 9c5949a · commit 6d505a4
2 files changed: 2 additions & 17 deletions


microservices/dlstreamer-pipeline-server/docs/user-guide/how-to-deploy-with-helm.md

Lines changed: 1 addition & 9 deletions
````diff
@@ -56,15 +56,7 @@ Update the below fields in `values.yaml` file in the helm chart
 ### Run default sample
 
 Once the pods are up, we will send a pipeline request to DL Streamer Pipeline Server to run a detection model on a warehouse video.
-
-Copy the resources such as video and model from the local directory to the `dlstreamer-pipeline-server` pod to make them available for launching the pipeline.
-```sh
-POD_NAME=$(kubectl get pods -n apps -o jsonpath='{.items[*].metadata.name}' | tr ' ' '\n' | grep dlstreamer-pipeline-server | head -n 1)
-
-kubectl cp <edge-ai-libraries/microservices/dlstreamer-pipeline-server>/resources/models/geti/ $POD_NAME:/home/pipeline-server/resources/models/geti/ -c dlstreamer-pipeline-server -n apps
-
-kubectl cp <edge-ai-libraries/microservices/dlstreamer-pipeline-server>/resources/videos/warehouse.avi $POD_NAME:/home/pipeline-server/resources/videos/ -c dlstreamer-pipeline-server -n apps
-```
+The resources such as video and model are copied into the `dlstreamer-pipeline-server` pod by `initContainers`.
 
 We will send the below curl request to run the inference.
 It comprises a source file path (`warehouse.avi`); a destination, with metadata written to a JSON file at `/tmp/resuts.jsonl` and frames streamed over RTSP with id `pallet_defect_detection`; and the Geti model path used for detecting defective boxes in the video file.
````
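The chart's actual `initContainers` definition is not shown in this diff, so here is a minimal, purely illustrative sketch of how such a resource-staging stanza could look in a deployment template. The container name, helper image, source paths, and volume names are all assumptions, not the chart's real contents:

```yaml
# Hypothetical sketch only -- NOT the actual helm chart contents.
# An initContainer that stages the model and video into a volume
# shared with the dlstreamer-pipeline-server container.
initContainers:
  - name: copy-resources            # assumed name
    image: busybox:1.36             # assumed helper image
    command: ["sh", "-c"]
    args:
      - >-
        cp -r /staged/models/geti /resources/models/ &&
        cp /staged/videos/warehouse.avi /resources/videos/
    volumeMounts:
      - name: resources             # assumed shared volume (e.g. an emptyDir)
        mountPath: /resources
```

In a setup like this, the main container would mount the same `resources` volume under `/home/pipeline-server/resources`, which is where the pipeline request expects to find `warehouse.avi` and the Geti model.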

microservices/dlstreamer-pipeline-server/helm/README.md

Lines changed: 1 addition & 8 deletions
````diff
@@ -14,14 +14,7 @@
 `helm install dlsps . -n apps --create-namespace`
 - Check if Deep Learning Streamer Pipeline Server is running fine:
 `kubectl get pods --namespace apps` and monitor its logs using `kubectl logs -f <pod_name> -n apps`
-- Copy the resources such as video and model from the local directory to the `dlstreamer-pipeline-server` pod to make them available for launching the pipeline.
-```sh
-POD_NAME=$(kubectl get pods -n apps -o jsonpath='{.items[*].metadata.name}' | tr ' ' '\n' | grep dlstreamer-pipeline-server | head -n 1)
-
-kubectl cp ../resources/models/geti/ $POD_NAME:/home/pipeline-server/resources/models/geti/ -c dlstreamer-pipeline-server -n apps
-
-kubectl cp ../resources/videos/warehouse.avi $POD_NAME:/home/pipeline-server/resources/videos/ -c dlstreamer-pipeline-server -n apps
-```
+- The resources such as video and model are copied into the `dlstreamer-pipeline-server` pod by `initContainers`.
 - Send the curl command to start the pallet defect detection pipeline
 ```sh
 curl http://localhost:30007/pipelines/user_defined_pipelines/pallet_defect_detection -X POST -H 'Content-Type: application/json' -d '{
````
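Although the commit removes the manual copy step, the `POD_NAME` lookup it used remains handy for the log-monitoring step (`kubectl logs -f <pod_name> -n apps`). Its text-processing part can be sketched in isolation on sample `kubectl get pods -o jsonpath` output; the pod names below are made up for illustration:

```shell
# The jsonpath query emits one space-separated line of pod names; the pipeline
# splits it into lines, keeps the matching name, and takes the first hit.
sample='mqtt-broker-abc123 dlstreamer-pipeline-server-7f9d4 other-svc-xyz'
POD_NAME=$(printf '%s' "$sample" | tr ' ' '\n' | grep dlstreamer-pipeline-server | head -n 1)
echo "$POD_NAME"   # → dlstreamer-pipeline-server-7f9d4
```

In the real commands, `$sample` is replaced by `kubectl get pods -n apps -o jsonpath='{.items[*].metadata.name}'`, so the result is the first matching pod in the `apps` namespace.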
