@@ -74,6 +74,16 @@ sudo apt install -y ros-humble-librealsense2-tools
:::
::::

+ ### Install uv
+ ```bash
+ curl -LsSf https://astral.sh/uv/install.sh | sh
+ ```
+
+ ### Source uv
+ ```bash
+ source $HOME/.local/bin/env
+ ```
+
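Before continuing, it can help to confirm that uv is actually on the PATH. This is a quick sanity check, assuming the installer's default `$HOME/.local/bin` location:

```bash
# Print the uv version if uv is on the PATH, otherwise print a hint
if command -v uv >/dev/null 2>&1; then
  uv --version
else
  echo "uv not found on PATH; re-run: source \$HOME/.local/bin/env"
fi
```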
### Load the Intel IPU Driver
::::{tab-set}
:::{tab-item} **IPU7**
@@ -119,12 +129,22 @@ sudo apt install -y ros-humble-pyrealsense2-ai-demo
:::
::::

- > **Note:** The `pyrealsense2-ai-demo` installation will also do the following:
- >
- > - installs all the run-time python dependency packages,
- > - downloads Ultralytics YOLOv8 model files and generate the models.
- >
- > The installation will run for 25-30 minutes and consumes approximately 2GB of the disk space.
+ ### Set up the uv venv
+ Go to `/opt/ros/<ros-distro>/share/pyrealsense2-ai-demo` and sync the environment:
+
+ ```bash
+ uv sync
+ ```
+
+ Once the virtual environment is set up, download the YOLO model:
+
+ ```bash
+ source .venv/bin/activate
+ ./scripts/generate_ai_models.sh
+ ```
+
+ This will take a couple of minutes.
+

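Before launching the demo, it is worth verifying that the generation step produced model files. The `models/` directory name below is an assumption and may differ between versions of `generate_ai_models.sh`:

```bash
# List generated model files if present; the models/ path is an assumption
ls -lh models/ 2>/dev/null || echo "no models/ directory under $(pwd); re-run generate_ai_models.sh"
```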
### Run the tutorial

@@ -135,9 +155,7 @@ Run the below commands to start the tutorial.
:sync: jazzy

```bash
- # Activate the pyrealsense2-ai-demo python environment
- . /opt/ros/jazzy/share/pyrealsense2-ai-demo/venv/bin/activate
-
+ cd /opt/ros/jazzy/share/pyrealsense2-ai-demo
# Source the ros2 jazzy
source /opt/ros/jazzy/setup.bash
```
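A quick way to confirm the ROS 2 environment is actually active before launching:

```bash
# Prints the active distro (e.g. "jazzy"); "unset" means setup.bash was not sourced
echo "ROS_DISTRO=${ROS_DISTRO:-unset}"
```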
@@ -146,29 +164,26 @@ source /opt/ros/jazzy/setup.bash

```bash
# Run the pyrealsense2-ai-demo tutorial for four camera input streams
- python3 /opt/ros/jazzy/bin/pyrealsense2_ai_demo_launcher.py --config=/opt/ros/jazzy/share/pyrealsense2-ai-demo/config/config_ros2_v4l2_rs-color-0_3.js
+ uv run src/pyrealsense2_ai_demo_launcher.py --config=config/config_ros2_v4l2_rs-color-0_3.js
```

**D3CMCXXX-115-084:**

```bash
# Run the pyrealsense2-ai-demo tutorial for four camera input streams
- python3 /opt/ros/jazzy/bin/pyrealsense2_ai_demo_launcher.py --config=/opt/ros/jazzy/share/pyrealsense2-ai-demo/config/config_isx031_4cameras.js
+ uv run src/pyrealsense2_ai_demo_launcher.py --config=config/config_isx031_4cameras.js
```

:::
:::{tab-item} **Humble**
:sync: humble

```bash
- # Activate the pyrealsense2-ai-demo python environment
- . /opt/ros/humble/share/pyrealsense2-ai-demo/venv/bin/activate
-
# Source the ros2 humble
source /opt/ros/humble/setup.bash

- # Run the pyrealsense2-ai-demo tutorial for four camera input streams
- python3 /opt/ros/humble/bin/pyrealsense2_ai_demo_launcher.py --config=/opt/ros/humble/share/pyrealsense2-ai-demo/config/config_ros2_v4l2_rs-color-0_3.js
+ # Run the pyrealsense2-ai-demo tutorial for four camera input streams (you may need to edit the config to match the correct /dev/video*)
+ uv run src/pyrealsense2_ai_demo_launcher.py --config=config/config_ros2_v4l2_rs-color-0_3.js
```

:::