This is the repository for MRAC 24/25 Workshop 2.2, which focuses on advanced 3D scanning and spatial analysis of building environments using a combination of photogrammetry, 3D LiDAR, a ground robot platform, a robot dog, and a drone. The project emphasizes the use of robotic systems to gather and analyze spatial data in different environments, either autonomously or via teleoperation. Each system contributes to a separate part of the scanning and analysis process, providing a comprehensive approach to spatial data collection and 3D modeling.
- Photogrammetry: High-resolution 3D model generation from photographs.
- 3D LiDAR: Laser-based scanning for precise spatial mapping.
- Robot Platforms (Husky A200): Ground-based robotic platforms equipped for 3D scanning and data collection.
- Robot Dog (Unitree Go2): An agile quadruped robot designed for scanning in challenging environments.
- Drone: Aerial data collection to complement ground-based scans.
- ROS (Robot Operating System): Used for communication and control of the rover and robot dog.
- Rover and Robot Dog: Teleoperated robotic platforms, using ROS for communication and control, designed for navigating and scanning different environments.
- Photogrammetry & LiDAR Systems: These technologies operate independently to gather detailed spatial data from different areas.
- Data Analysis: Spatial data is analyzed separately for each scanning method, enabling a comprehensive understanding of the environment.
Required Operating Systems: Ubuntu 22.04, Windows 10/11
Foxglove Studio Installation
`sudo snap install foxglove-studio`

SSH Server
`sudo apt install openssh-server`

PCL Tools
`sudo apt install pcl-tools`
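As a quick sanity check after installing PCL Tools, any `.pcd` file can be opened with `pcl_viewer`. The file name below is only a placeholder; for example, it could be the voxel map exported at the end of the Go2 workflow.

```bash
# Open a point cloud in the PCL viewer (export.pcd is a placeholder file name)
pcl_viewer export.pcd
```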
Connect to the Husky robot:
- Connect to the `iaac_husky` hotspot (WIFI password: `EnterIaac22@`).
- Open a terminal on your computer and SSH into the Husky robot (user password: `iaac`).
Using tmux for terminal management:
- Start a new tmux session:
  `tmux new -s husky`
  This command starts a new tmux session named `husky`.
Split the tmux terminal:
- Once inside tmux, you can split the terminal window to launch multiple processes concurrently.
- Press `Ctrl + B`, then release both keys and press `%` to split the terminal vertically.
- To split horizontally, press `Ctrl + B`, then release both keys and press `"` (double quote).
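A few more keybindings are handy once the panes are running. These are standard tmux defaults, not something specific to this repository.

```bash
# Standard tmux defaults (not specific to this repo):
# Ctrl + B, then an arrow key   -> move between panes
# Ctrl + B, then d              -> detach from the session (processes keep running)
tmux attach -t husky            # reattach to the session later
tmux kill-session -t husky      # stop the session and everything running in it
```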
Launch the ROS2 nodes in separate tmux panes (a scripted alternative is sketched after this list):

Terminal 1 (Mobile Base):
- In the first tmux pane, launch the mobile base:
  `ros2 launch /etc/clearpath/platform/launch/platform-service.launch.py`

Terminal 2 (LiDAR):
- In the second tmux pane, launch the LiDAR:
  `ros2 launch livox_ros_driver2 msg_MID360_launch.py`

Terminal 3 (Mapping):
- In the third tmux pane, launch the mapping node:
  `ros2 launch fast_lio mapping.launch.py`

Terminal 4 (Foxglove Bridge):
- In the fourth tmux pane, launch the Foxglove bridge:
  `ros2 launch foxglove_bridge foxglove_bridge_launch.xml`
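If you prefer not to split the panes and type each launch by hand, the same four commands can be scripted with `tmux send-keys`. This is a convenience sketch, not part of the repository; it assumes the session name `husky` and the default tmux indexing (window 0, panes 0-3).

```bash
#!/usr/bin/env bash
# Convenience sketch: start one tmux session with four panes and
# send the four launch commands from this README into them.
set -e

tmux new-session -d -s husky
tmux split-window -v -t husky
tmux split-window -v -t husky
tmux split-window -v -t husky
tmux select-layout -t husky tiled

tmux send-keys -t husky:0.0 'ros2 launch /etc/clearpath/platform/launch/platform-service.launch.py' C-m
tmux send-keys -t husky:0.1 'ros2 launch livox_ros_driver2 msg_MID360_launch.py' C-m
tmux send-keys -t husky:0.2 'ros2 launch fast_lio mapping.launch.py' C-m
tmux send-keys -t husky:0.3 'ros2 launch foxglove_bridge foxglove_bridge_launch.xml' C-m

tmux attach -t husky
```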
Open Foxglove Studio:
- Open Foxglove Studio and set the web address to `10.42.0.1`.
- Load the panel from this repository to visualize the data.
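Before connecting Foxglove, it can help to confirm that the nodes above are actually publishing. These are standard ROS 2 CLI commands, run in a spare tmux pane on the Husky.

```bash
# Standard ROS 2 CLI checks, run in a spare tmux pane on the robot
ros2 node list               # the platform, LiDAR, mapping and bridge nodes should be listed
ros2 topic list              # the LiDAR and mapping topics should be publishing
ros2 topic hz /some_topic    # replace /some_topic with a topic from the list above
```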
Clone the repository:
- Clone the MRAC-robot-spatial-analysis repository to your system.
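The repository URL is not given here; the command below is a sketch with a placeholder organization name. Replace it with the actual GitHub path hosting MRAC-robot-spatial-analysis.

```bash
# YOUR_ORG is a placeholder: substitute the real organization/user hosting the repo
git clone https://github.com/YOUR_ORG/MRAC-robot-spatial-analysis.git
cd MRAC-robot-spatial-analysis
```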
Build the Docker image:
- Note: You only need to build the image when using the Unitree Go2 robot.
- Navigate to the `go2_robot` package directory and build the Docker image:
  `cd MRAC-robot-spatial-analysis/go2_robot`
  `.docker/build_image.sh`
Run the Docker image:
- To run the image, use the following command:
  `.docker/run_user.sh`
- Or, if you're using an Nvidia graphics card, run the image with the Nvidia runtime:
  `.docker/run_user_nvidia.sh`
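The contents of the run scripts are not shown in this README. As a rough, hedged sketch of what a GUI-enabled, GPU-enabled `docker run` of this kind typically wraps (the image name and mounts below are placeholders, not the repository's actual script):

```bash
# Rough sketch only: .docker/run_user_nvidia.sh in the repo is the source of truth.
# IMAGE_NAME is a placeholder for the image built in the previous step.
docker run -it --rm \
    --gpus all \
    --net host \
    -e DISPLAY=$DISPLAY \
    -v /tmp/.X11-unix:/tmp/.X11-unix \
    IMAGE_NAME
```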
Change the folder ownership:
- To avoid permission issues, change the ownership of the workspace folder:
  `sudo chown -R YOUR_USER_NAME /dev_ws`
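The Go2 workflow below uses several terminals; if they all need to run inside the same container, one common way to open additional shells is `docker exec`. The container name below is a placeholder, check `docker ps` for the name assigned by the run script.

```bash
# Find the running container started by the run script
docker ps
# Open an additional shell inside it (CONTAINER_NAME is a placeholder from `docker ps`)
docker exec -it CONTAINER_NAME bash
```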
Terminal Setup for Go2 Robot:

Terminal 1 (Go2 Bringup):
- In the first terminal, bring up the robot dog:
  `ros2 launch go2_bringup go2.launch.py`
Terminal 2 (RQT Visualization):
- In the second terminal, use RQT to visualize the camera image or other sensor data:
  `rqt`
  You can use `rqt_image_view` to see the camera feed, or other visualization plugins for other sensors.
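If only the camera feed is needed, the image view plugin can also be started on its own instead of the full RQT GUI. The topic name below is just an example; pick the actual image topic from `ros2 topic list`.

```bash
# Launch only the image view plugin instead of the full rqt GUI
ros2 run rqt_image_view rqt_image_view
# Optionally pass an image topic directly (the topic name here is an example)
ros2 run rqt_image_view rqt_image_view /camera/compressed
```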
Terminal 3 (Teleoperation):
- In the third terminal, use keyboard teleoperation to control the robot:
  `ros2 run teleop_twist_keyboard teleop_twist_keyboard`
  This will allow you to control the Unitree Go2 robot using the keyboard.
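`teleop_twist_keyboard` publishes `geometry_msgs/Twist` messages on `cmd_vel` by default; if the Go2 bringup listens on a different velocity topic, a remap is needed. The remapped topic below is an assumed example, shown only to illustrate the syntax.

```bash
# Default: publishes Twist messages on /cmd_vel (keys: i forward, , backward, j/l rotate, k stop)
ros2 run teleop_twist_keyboard teleop_twist_keyboard
# If the robot expects velocity commands on another topic, remap it
# (/go2/cmd_vel is an assumed example topic, not from the repo)
ros2 run teleop_twist_keyboard teleop_twist_keyboard --ros-args -r cmd_vel:=/go2/cmd_vel
```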
Terminal 4 (Map Saver Service):
- In the fourth terminal, start the map saver service used to save the generated voxel map:
  `ros2 run go2_interfaces voxel_map_saver`
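Once this node is running, the save service called in Terminal 6 should be advertised; these standard ROS 2 CLI calls can confirm it before you finish scanning.

```bash
# Confirm the save service used in Terminal 6 is available
ros2 service list | grep save_voxel_cloud
# Show its request/response type
ros2 service type /save_voxel_cloud
```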
OPTIONAL Terminal 5 (Rosbag Recording):
- If you want to record the data to a rosbag, use the following command to record the selected topics:
  `ros2 bag record /tf /pointcloud /utlidar/robot_odom /camera/compressed`
- Warning: Make sure your computer has enough space for recording, and do not record for too long, to avoid filling up the disk. You can stop the recording by pressing `Ctrl + C`. The bag is written inside the container, so you need to download it from the container afterwards.
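To get the recording out of the container and inspect it, `docker cp` and the ROS 2 bag tools can be used. The container name and bag directory below are placeholders; check `docker ps` and the output directory printed by `ros2 bag record`.

```bash
# Copy the bag directory out of the container
# (CONTAINER_NAME and rosbag2_folder are placeholders)
docker cp CONTAINER_NAME:/dev_ws/rosbag2_folder ./rosbag2_folder

# Inspect and replay it on the host (requires a ROS 2 installation)
ros2 bag info ./rosbag2_folder
ros2 bag play ./rosbag2_folder
```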
OPTIONAL Terminal 6 (Save the Map):
- Important: You must wait for the scanning to finish before saving the map, and make sure to change the file path in the command below.
- Once the scanning is complete, you can save the voxel map using the following service call:
  `ros2 service call /save_voxel_cloud go2_interfaces/srv/SaveVoxelCloud "{filename: '/YOUR FILE PATH/export.pcd'}"`