Author: Huanyu Li, DeepSeek
This repository contains the necessary tools and configurations for depth camera perception using UR10e robots. It supports the ZED Camera and Azure Kinect, along with MoveIt configurations, the Industrial Reconstruction package, and a Commander Node for easy manipulation of UR robots.
Before using this repository, ensure you have the following installed and configured:

- Docker: Install Docker to containerize the environment.
- NVIDIA Container Toolkit: Required for GPU acceleration with Docker.
- Visual Studio Code (VSCode): Recommended for development and debugging.
- UR10e Robot: Ensure your UR10e robot is properly configured and connected. Set your machine's IP address to 192.168.56.1.
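Before launching anything, it helps to confirm that the robot controller is reachable from your machine. A minimal sketch using only the standard library (the robot IP below is a placeholder for your own network setup, not a value from this repository; 30002 is the UR controller's secondary client interface port):

```python
import socket

def reachable(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Placeholder robot IP; substitute the address of your UR10e.
    print(reachable("192.168.56.101", 30002, timeout=0.5))
```

If this prints False, check the cabling and the static IP configuration before debugging the ROS side.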
The repository includes the following components:

- Camera Drivers:
  - ZED Camera driver for depth perception.
  - Azure Kinect driver for depth perception.
- MoveIt Configurations:
  - Pre-configured MoveIt setup for UR10e robots with the end-effector.
- Industrial Reconstruction Package:
  - Tools for 3D reconstruction and environment mapping.
- Commander Node:
  - A user-friendly node for easy manipulation and control of UR robots.
Clone this repository to your local machine (remember to fork it first if you want to develop your own application):
git clone https://github.com/your-username/MRAC-UR-Perception.git
cd MRAC-UR-Perception

To build the image:
.docker/build_image.sh
To run the image:
.docker/run_user.sh
You may need to change the owner of the dev_ws directory; copy the line shown in the terminal:
sudo chown -R [YOUR USER NAME] /dev_ws
Start a terminal:
terminator
Inside the Docker container, launch the desired perception pipeline:
For ZED Camera:
roslaunch zed_wrapper zedm.launch

For Azure Kinect:

roslaunch azure_kinect_ros_driver driver.launch

If you want to simulate the robot with a fake controller:

roslaunch ur10e_moveit_config demo.launch

If you want to connect to the real robot with the ZED camera mounted on the end-effector:

roslaunch commander ur10e_zed_commander.launch

If you want to connect to the real robot with the Azure Kinect placed stationary:

roslaunch commander ur10e_ka_commander.launch

The tutorial for the hand-eye calibration application can be sourced from this link
- Choose the camera image topics from your camera
- Choose the frames according to your application
- Freedrive the robot to different poses, click Save Sample, and repeat this step until you have collected 10-15 samples
- Save the camera pose to a new launch file
- Integrate the camera pose to your project's launch node
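The saved camera pose is typically published as a static transform inside the new launch file. A minimal sketch of what that file could look like, assuming a stationary camera calibrated against the robot base (the frame names and the `x y z qx qy qz qw` values below are placeholders, not this repository's actual calibration output):

```xml
<launch>
  <!-- args: x y z qx qy qz qw parent_frame child_frame -->
  <node pkg="tf2_ros" type="static_transform_publisher"
        name="camera_pose_broadcaster"
        args="0.5 0.0 0.8 0.0 0.0 0.0 1.0 base_link camera_base" />
</launch>
```

Including this file from your project's launch file makes the calibrated camera frame available to every node in the TF tree.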
Example 1: 3D Reconstruction

Launch the Industrial Reconstruction package:

roslaunch commander reconstruction.launch

Use the Commander notebook to move the robot and capture data.
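Scanning for reconstruction usually means visiting a ring of viewpoints around the part and capturing a frame at each one. Generating those viewpoints is plain geometry; a minimal sketch in pure Python (this is not the repository's Commander API — the center, radius, and `(x, y, z, yaw)` pose format are assumptions for illustration):

```python
import math

def ring_viewpoints(cx, cy, cz, radius, n):
    """Return n (x, y, z, yaw) viewpoints evenly spaced on a horizontal
    circle of the given radius around (cx, cy, cz), each yawed to face
    the center point."""
    poses = []
    for i in range(n):
        a = 2.0 * math.pi * i / n
        x = cx + radius * math.cos(a)
        y = cy + radius * math.sin(a)
        yaw = math.atan2(cy - y, cx - x)  # point back at the center
        poses.append((x, y, cz, yaw))
    return poses

if __name__ == "__main__":
    # Placeholder scan: 8 viewpoints, 0.25 m from a point in front of the robot.
    for p in ring_viewpoints(0.6, 0.0, 0.4, 0.25, 8):
        print(["%.3f" % v for v in p])
```

Each tuple would then be converted into a target pose and sent to the robot through the Commander notebook between captures.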
- Docker GPU Issues: Ensure the NVIDIA Container Toolkit is installed and configured correctly.
- Camera Connectivity: Check the camera connections and ensure the drivers are properly installed.
- ROS Communication: Verify that the ROS master is running and all nodes are communicating.
Contributions are welcome! Please fork the repository and submit a pull request with your changes.
This project is licensed under the MIT License. See the LICENSE file for details.