Hyper-Active-Gaze-SLAM

A ROS2-based package for omni-directional robots equipped with a gimbal. It actively controls the robot's gaze to mitigate feature degradation and point cloud distortion induced by harsh movement, so the robot can safely navigate through unknown environments.

Dependencies

ROS2 (tested with Iron)
PCL
livox-ros-driver2
livox-interfaces (for the Livox MID-70)
grid_map (https://github.com/ANYbotics/grid_map)
An atomic SLAM algorithm for odometry (e.g. Cartographer, Point-LIO, LOAM, KISS-ICP)
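Most dependencies can be installed from binaries; the commands below are a hedged sketch for ROS2 Iron on Ubuntu (the apt package names assume the standard ros-&lt;distro&gt;-&lt;name&gt; convention, and the driver repository path inside your workspace is illustrative — verify both against your setup):

```shell
# grid_map and PCL from binary packages (names assumed; check availability for your distro)
sudo apt install ros-iron-grid-map libpcl-dev

# livox-ros-driver2 is typically built from source inside your ROS2 workspace
cd ~/ros2_ws/src
git clone https://github.com/Livox-SDK/livox_ros_driver2.git
cd ~/ros2_ws && colcon build --symlink-install
```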

How to use

First, run the driver nodes for all sensors in your system (IMU, 3D LiDAR, 2D LiDAR, camera, etc.), then run the interface nodes for the robot chassis and the gimbal motor. Finally, launch the active gaze control node with ros2 launch livox_gaze_control gaze_control_launch.py. The node runs and the grid map is displayed as long as Livox LiDAR messages are received; however, to enable gaze control and grid map updates, the interfaces to the IMU and gimbal motor, as well as odometry and/or TF2 queries from the atomic SLAM algorithm, are required. If you simply want to visualize the grid map, start the Livox driver node and then run ros2 launch livox_gaze_control gaze_control_launch.py. The feature point clouds are not displayed by default; add them in RViz by subscribing to the "edge_point" and "plane_point" topics.
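The visualization-only workflow above can be sketched as the following command sequence (the driver launch file name is a placeholder — use whichever launch file ships with your livox-ros-driver2 installation for your LiDAR model):

```shell
# 1. Start the Livox LiDAR driver (launch file name depends on your model; placeholder below)
ros2 launch livox_ros_driver2 <your_lidar_launch_file>.py

# 2. Start the active gaze control node (from this package)
ros2 launch livox_gaze_control gaze_control_launch.py

# 3. Confirm the feature point cloud topics exist before adding them in RViz
ros2 topic list | grep -E 'edge_point|plane_point'
```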

Debugging potential issues when using Livox LiDARs (MID-70) with ROS2

See https://blog.csdn.net/omnas/article/details/145163154 (blog post in Chinese).

Acknowledgement

This package builds on the following works:

LIO-Livox

grid_map
