robotics-ai-suite/docs/robotics/dev_guide/tutorials_amr/navigation/follow_me/Tutorials/followme-on-aaeon.rst
28 additions & 6 deletions
@@ -1,8 +1,19 @@
  Follow-me with ADBSCAN on Aaeon Robot
  ================================================

- This tutorial provides instructions for running the ADBSCAN-based Follow-me algorithm from |p_amr| using |realsense| camera input. Validation of the the algorithm was performed on a custom Aaeon robot.
- The |realsense| camera publishes to ``/camera/depth/color/points`` topic. The `adbscan_sub_node` subscribes to the corresponding topic, detects the obstacle array, computes the robot's velocity and publishes to the ``/cmd_vel`` topic of type `geometry_msg/msg/Twist`. This ``twist`` message consists of the updated angular and linear velocity of the robot to follow the target, which can be subsequently subscribed by a robot-driver.
+ This tutorial provides instructions for running the ADBSCAN-based Follow-me algorithm from |p_amr| using |realsense| camera input.
+ Validation of the algorithm was performed on a custom Aaeon robot.
+ The |realsense| camera publishes to the ``/camera/depth/color/points`` topic. The `adbscan_sub_node` subscribes to the corresponding topic,
+ detects the obstacle array, computes the robot's velocity and publishes to the ``/cmd_vel`` topic of type `geometry_msgs/msg/Twist`.
+ This ``twist`` message consists of the updated angular and linear velocity of the robot to follow the target, which a robot driver can subsequently subscribe to.
+
+ Prerequisites:
+
+ - Assemble your robotic kit following the instructions provided by AAEON.
+
+ - Ensure the :doc:`system is set up correctly <../../../../../gsg_robot/prepare-system>`.

  Getting Started
  ----------------
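To make the data flow described above concrete, here is a minimal rclpy sketch of the pattern the tutorial describes: subscribe to ``/camera/depth/color/points`` and publish a `geometry_msgs/msg/Twist` on ``/cmd_vel``. It is illustrative only, not the packaged `adbscan_sub_node`; the class name and the zero velocities are placeholders for the real ADBSCAN-based computation.

.. code-block:: python

   import rclpy
   from rclpy.node import Node
   from sensor_msgs.msg import PointCloud2
   from geometry_msgs.msg import Twist

   class FollowerSketch(Node):
       """Placeholder follower; the real target detection (ADBSCAN) is omitted."""

       def __init__(self):
           super().__init__('follower_sketch')
           # Same topics as described in the tutorial text.
           self.create_subscription(
               PointCloud2, '/camera/depth/color/points', self.cloud_callback, 10)
           self.cmd_pub = self.create_publisher(Twist, '/cmd_vel', 10)

       def cloud_callback(self, msg: PointCloud2):
           cmd = Twist()
           # A real node would run ADBSCAN on `msg`, locate the target person
           # and derive these velocities; dummy values keep the robot still.
           cmd.linear.x = 0.0
           cmd.angular.z = 0.0
           self.cmd_pub.publish(cmd)

   def main():
       rclpy.init()
       rclpy.spin(FollowerSketch())
       rclpy.shutdown()

   if __name__ == '__main__':
       main()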
@@ -18,21 +29,32 @@ Install the ``ros-humble-follow-me-tutorial`` |deb_pack| from the |intel| |p_amr
  sudo apt update
  sudo apt install ros-humble-follow-me-tutorial

+ Calibrate the robot
+ ^^^^^^^^^^^^^^^^^^^^^^^
+ Please perform IMU calibration of the robot by launching the script below:

- After executing the above command, you can observe that the robot detecting the target within a tracking radius (~0.5 - 0.7 m) and subsequently following the moving target person.
+ After executing the above command, you can observe that the robot detects the target within a tracking radius
+ (~0.5 - 1.5 m; `min_dist` and `max_dist` are set in `/opt/ros/humble/share/tutorial_follow_me/params/followme_adbscan_RS_params.yaml`)
+ and subsequently follows the moving target person.

  .. note::

-   There are reconfigurable parameters in `/opt/ros/humble/share/tutorial-follow-me/params/followme_adbscan_RS_params.yaml`
+   There are reconfigurable parameters in `/opt/ros/humble/share/tutorial_follow_me/params/followme_adbscan_RS_params.yaml`
    file. The user can modify the parameters depending on the respective robot, sensor configuration and environments (if required) before running the tutorial.
    Find a brief description of the parameters in the following table.
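Because the tracking radius is bounded by `min_dist` and `max_dist` in that parameter file, it can help to read the values back on the robot before launching. The snippet below is only a convenience sketch: the file path comes from the note above, but the YAML nesting is not documented here, so the keys are searched recursively rather than assumed.

.. code-block:: python

   import yaml

   PARAMS = '/opt/ros/humble/share/tutorial_follow_me/params/followme_adbscan_RS_params.yaml'

   def find_key(data, key):
       """Return the first value found for `key` anywhere in a nested mapping."""
       if isinstance(data, dict):
           if key in data:
               return data[key]
           for value in data.values():
               found = find_key(value, key)
               if found is not None:
                   return found
       return None

   with open(PARAMS) as f:
       params = yaml.safe_load(f)

   for name in ('min_dist', 'max_dist'):
       print(f'{name} = {find_key(params, name)}')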
robotics-ai-suite/docs/robotics/dev_guide/tutorials_amr/navigation/follow_me/Tutorials/followme-on-clearpathjackal.rst
6 additions & 4 deletions
@@ -2,7 +2,9 @@ Follow-me with ADBSCAN on |clearpath_robotics| |jackal| Robot
  This tutorial provides instructions for running the ADBSCAN-based Follow-me algorithm from |p_amr| using |realsense| camera input when using a |clearpath_robotics| |jackal| robot.

- The |realsense| camera publishes to ``/camera/depth/color/points`` topic. The ``adbscan_sub_node`` subscribes to the corresponding topic, detects the obstacle array, computes the robot's velocity and publishes to the ``/cmd_vel`` topic of type `geometry_msg/msg/Twist`. This ``twist`` message consists of the updated angular and linear velocity of the robot to follow the target, which can be subsequently subscribed by a robot-driver.
+ The |realsense| camera publishes to the ``/camera/depth/color/points`` topic. The ``adbscan_sub_node`` subscribes to the corresponding topic,
+ detects the obstacle array, computes the robot's velocity and publishes to the ``/cmd_vel`` topic of type `geometry_msgs/msg/Twist`.
+ This ``twist`` message consists of the updated angular and linear velocity of the robot to follow the target, which a robot driver can subsequently subscribe to.

  Getting Started
  ----------------
@@ -21,18 +23,18 @@ Install the ``ros-humble-follow-me-tutorial`` |deb_pack| from the |lp_amr| APT r
  Run Demo
  ----------------

- Run the following script to launch the Follow-me application tutorial on the |jackal| robot.
+ To launch the Follow-me application tutorial on the |jackal| robot, use the following ROS 2 launch file.

  After starting the script, the robot should begin searching for trackable objects in its initial detection radius (defaulting to around 0.5m), and then following acquired targets as they move from the initial target location.

  .. note::

-   There are reconfigurable parameters in ``/opt/ros/humble/share/tutorial-follow-me/params/followme_adbscan_RS_params.yaml``.
+   There are reconfigurable parameters in ``/opt/ros/humble/share/tutorial_follow_me/params/followme_adbscan_RS_params.yaml``.
    You can modify the parameters depending on the respective robot, sensor configuration and environments (if required) before running the tutorial.
    Find a brief description of the parameters in the following table.
robotics-ai-suite/docs/robotics/dev_guide/tutorials_amr/navigation/follow_me/Tutorials/followme-with-gesture-on-aaeon.rst
35 additions & 10 deletions
@@ -6,23 +6,34 @@ Follow-me with ADBSCAN and Gesture-based Control on Aaeon Robot
  This tutorial demonstrates the Follow-me algorithm with gesture, where the robot follows a target person in real time.
  The movement of the robot can be controlled by the person's position (relative to the robot) as well as the hand gestures.
  This tutorial is demonstrated on Aaeon robot using 2 front-mounted |realsense| cameras: camera 1 and camera 2.
- Camera 1 takes the point cloud data as inputs and passes it through |intel|-patented object detection algorithm, namely Adaptive DBSCAN, to detect the position of the target person.
+ Camera 1 takes the point cloud data as input and passes it through the |intel|-patented object detection algorithm, namely Adaptive DBSCAN,
+ to detect the position of the target person.
  Camera 2 is positioned at a certain height for capturing the RGB images of the target's hand gestures.
- This RGB image is passed through a deep learning-based gesture recognition pipeline, called `Mediapipe Hands Framework <https://mediapipe.readthedocs.io/en/latest/solutions/hands.html>`__, to detect the gesture category.
+ This RGB image is passed through a deep learning-based gesture recognition pipeline,
+ called `Mediapipe Hands Framework <https://mediapipe.readthedocs.io/en/latest/solutions/hands.html>`__, to detect the gesture category.
  The motion commands for the robot are published to ``twist`` topic based on these two outputs: person's position and gesture category.

+ Prerequisites:
+
+ - Assemble your robotic kit following the instructions provided by AAEON.
+
+ - Ensure the :doc:`system is set up correctly <../../../../../gsg_robot/prepare-system>`.
+
  The two conditions required to start the robot's movement are as follows:

- - The target person will be within the tracking radius (a reconfigurable parameter in the parameter file in `/opt/ros/humble/share/tutorial-follow-me-w-gesture/params/followme_adbscan_RS_params.yaml`) of the robot.
+ - The target person must be within the tracking radius of the robot
+   (a reconfigurable parameter in the parameter file `/opt/ros/humble/share/tutorial_follow_me_w_gesture/params/followme_adbscan_RS_params.yaml`).

  - The detected gesture of the target is ``thumbs up``.

  Once the starting criteria are met, the robot keeps following the target unless one of the below stopping conditions holds true:

- - The target moves to a distance greater than the tracking radius (a reconfigurable parameter in the parameter file in `/opt/ros/humble/share/tutorial-follow-me-w-gesture/params/followme_adbscan_RS_params.yaml`).
+ - The target moves to a distance greater than the tracking radius
+   (a reconfigurable parameter in the parameter file `/opt/ros/humble/share/tutorial_follow_me_w_gesture/params/followme_adbscan_RS_params.yaml`).

  - The detected gesture is ``thumbs down``.

  Getting Started
  ----------------
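The gesture gate described above (start on ``thumbs up``, stop on ``thumbs down``) can be pictured with the short MediaPipe Hands sketch below. It is not the tutorial's recognition pipeline: the thumb-orientation test is a deliberately crude heuristic and the camera index is a placeholder for the gesture camera (camera 2).

.. code-block:: python

   import cv2
   import mediapipe as mp

   mp_hands = mp.solutions.hands

   def classify_thumb(frame_bgr, hands):
       """Return 'thumbs_up', 'thumbs_down' or None for one camera frame."""
       rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
       result = hands.process(rgb)
       if not result.multi_hand_landmarks:
           return None
       lm = result.multi_hand_landmarks[0].landmark
       tip = lm[mp_hands.HandLandmark.THUMB_TIP]
       mcp = lm[mp_hands.HandLandmark.THUMB_MCP]
       # Image y grows downward, so a raised thumb tip sits above its base joint.
       if tip.y < mcp.y:
           return 'thumbs_up'
       if tip.y > mcp.y:
           return 'thumbs_down'
       return None

   if __name__ == '__main__':
       cap = cv2.VideoCapture(0)  # placeholder index for the gesture camera
       with mp_hands.Hands(max_num_hands=1) as hands:
           while cap.isOpened():
               ok, frame = cap.read()
               if not ok:
                   break
               gesture = classify_thumb(frame, hands)
               if gesture:
                   print(gesture)  # a real node would gate the ``twist`` commands here
       cap.release()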
@@ -69,28 +80,42 @@ Check the Serial number
  Serial Number is the one which has to be used while launching the demo in below step

+ Calibrate the robot
+ ^^^^^^^^^^^^^^^^^^^^^^^
+ Please perform IMU calibration of the robot by launching the script below:

- Execute the following script to launch the Follow-me application tutorial with gesture on the Aaeon robot:
+ To launch the Follow-me application tutorial with gesture on the Aaeon robot, use the following ROS 2 launch file.

  .. code-block::

     source /opt/ros/humble/setup.bash
-    /opt/ros/humble/share/tutorial-follow-me-w-gesture/scripts/aaeon-follow-me-w-gesture.sh <Camera1 Serial number> < Camera2 Serial Number>
+    ros2 launch tutorial_follow_me_w_gesture aaeon_gesture_launch.py <Camera1 Serial number> <Camera2 Serial Number>

  Camera1 serial number : Camera which is mounted to the bottom (used for tracking the target).
  Camera2 serial Number : Camera mounted on the top (used for gesture recognition).

- After executing the above command, you can observe that the robot is locating the target within a tracking radius (~0.5 - 0.7 m) and subsequently, following the moving target person as soon as he/she shows ``thumbs up``. The robot will stop as soon as ``thumbs down`` is showed or the target person moves away from the tracking radius.
+ After executing the above command, you can observe that the robot locates the target within a tracking radius
+ (~0.5 - 1.5 m; `min_dist` and `max_dist` are set in `/opt/ros/humble/share/tutorial_follow_me/params/followme_adbscan_RS_params.yaml`) and subsequently
+ follows the moving target person as soon as he/she shows ``thumbs up``.
+ The robot will stop as soon as ``thumbs down`` is shown or the target person moves away from the tracking radius.

  .. note::

-   There are reconfigurable parameters in `/opt/ros/humble/share/tutorial-follow-me-w-gesture/params` directory for |realsense| camera (`followme_adbscan_RS_params.yaml`). The user can modify parameters depending on the respective robot, sensor configuration and environments (if required) before running the tutorial.
+   There are reconfigurable parameters in the `/opt/ros/humble/share/tutorial_follow_me_w_gesture/params` directory for the |realsense| camera (`followme_adbscan_RS_params.yaml`).
+   The user can modify parameters depending on the respective robot, sensor configuration and environments (if required) before running the tutorial.
    Find a brief description of the parameters in the following table:
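For readers new to ROS 2 launch files, the sketch below shows the general shape a two-camera launch file like this could take. It is an assumption-laden illustration, not the contents of `aaeon_gesture_launch.py`: the argument names, the use of `realsense2_camera_node`, and the follower/gesture executables are placeholders.

.. code-block:: python

   # Hypothetical structure only -- not the packaged aaeon_gesture_launch.py.
   from launch import LaunchDescription
   from launch.actions import DeclareLaunchArgument
   from launch.substitutions import LaunchConfiguration
   from launch_ros.actions import Node

   def generate_launch_description():
       return LaunchDescription([
           # The serial numbers select which RealSense device each driver opens.
           DeclareLaunchArgument('camera1_serial', description='Bottom camera (target tracking)'),
           DeclareLaunchArgument('camera2_serial', description='Top camera (gesture recognition)'),
           Node(
               package='realsense2_camera',
               executable='realsense2_camera_node',
               name='camera1',
               parameters=[{'serial_no': LaunchConfiguration('camera1_serial')}],
           ),
           Node(
               package='realsense2_camera',
               executable='realsense2_camera_node',
               name='camera2',
               parameters=[{'serial_no': LaunchConfiguration('camera2_serial')}],
           ),
           # Placeholder executables for the follower and gesture nodes described above.
           Node(package='tutorial_follow_me_w_gesture', executable='adbscan_sub_node'),
           Node(package='tutorial_follow_me_w_gesture', executable='gesture_recognition_node'),
       ])

With this shape the demo would be started as ``ros2 launch tutorial_follow_me_w_gesture aaeon_gesture_launch.py camera1_serial:=<serial 1> camera2_serial:=<serial 2>``; running ``ros2 launch`` with ``--show-args`` lists the arguments the installed launch file actually accepts.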
robotics-ai-suite/docs/robotics/dev_guide/tutorials_amr/navigation/follow_me/Tutorials/followme-with-gesture-on-clearpathjackal.rst
2 additions & 3 deletions
@@ -77,7 +77,7 @@ Execute the following script to launch Follow-Me with Gesture on the |clearpath_
  .. code-block:: bash

     source /opt/ros/humble/setup.bash
-    /opt/ros/humble/share/tutorial-follow-me-w-gesture/scripts/jackal-follow-me-w-gesture.sh<Camera Serial Number>
+    ros2 launch tutorial_follow_me_w_gesture jackal_gesture_launch.py <Camera Serial Number>

  <Camera Serial Number>: Use the serial number returned when using `rs-enumerate-devices`. Note that the output of other programs like `lsusb` might return an incorrect serial number.
@@ -86,7 +86,7 @@ After starting the script, the robot should begin searching for trackable object
  .. note::

-   There are reconfigurable parameters in ``/opt/ros/humble/share/tutorial-follow-me-w-gesture/params`` directory for the |realsense| camera (`followme_adbscan_RS_params.yaml`). You can modify parameters depending on the respective robot, sensor configuration and environments (if required) before running the tutorial.
+   There are reconfigurable parameters in ``/opt/ros/humble/share/tutorial_follow_me_w_gesture/params`` directory for the |realsense| camera (`followme_adbscan_RS_params.yaml`). You can modify parameters depending on the respective robot, sensor configuration and environments (if required) before running the tutorial.
    Find a brief description of the parameters in the following table:

  .. list-table:: Configurable Parameters
@@ -145,4 +145,3 @@ Troubleshooting
  - If the motor controller board does not start, restart the robot.

  - For general robot issues, go to: :doc:`../../../../../dev_guide/tutorials_amr/robot-tutorials-troubleshooting`.