
Patchwork



Video   •   Install by ROS   •   Paper   •   Project Wiki (for beginners)



IMPORTANT: (Aug. 18th, 2024) I now employ TBB, so the FPS has increased from 50 Hz to 100 Hz! If you want to use the paper version of Patchwork for SOTA comparison purposes, please use this ground seg. benchmark code.

Concept of our method (CZM & GLE)

It's an overall updated version of R-GPF of ERASOR [Code] [Paper].
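As a rough illustration of the concentric zone model (CZM), points are binned by range into zones, and each zone is divided into rings and sectors. The zone boundaries and ring/sector counts below are made-up example values for the sketch, not the repository's actual defaults:

```python
import math

# Illustrative CZM binning sketch. The boundaries and counts are
# example values only, not the parameters used by Patchwork itself.
ZONE_RANGES = [(2.7, 12.3), (12.3, 22.0), (22.0, 41.0), (41.0, 80.0)]
RINGS_PER_ZONE = [2, 4, 4, 4]
SECTORS_PER_ZONE = [16, 32, 54, 32]


def czm_bin(x, y):
    """Return (zone, ring, sector) for a point, or None if out of range."""
    r = math.hypot(x, y)
    theta = math.atan2(y, x) % (2 * math.pi)
    for zone, (r_min, r_max) in enumerate(ZONE_RANGES):
        if r_min <= r < r_max:
            # Inner zones use fewer, finer bins; outer zones coarser ones.
            ring = int((r - r_min) / (r_max - r_min) * RINGS_PER_ZONE[zone])
            sector = int(theta / (2 * math.pi) * SECTORS_PER_ZONE[zone])
            return zone, ring, sector
    return None  # closer than the innermost zone or beyond the outermost
```

Region-wise ground fitting is then run per (zone, ring, sector) bin, which is what allows the inner region to be modeled more finely than the far field.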


📂 Contents

  1. Test Env.
  2. Requirements
  3. How to Run Patchwork
  4. Citation

Test Env.

The code has been tested successfully on

  • Ubuntu 24.04 LTS
  • ROS2 Jazzy

The ROS Noetic version can be found here

📦 Prerequisite Installation

mkdir -p ~/colcon/src
cd ~/colcon/src
git clone https://github.com/LimHyungTae/patchwork.git
cd ..
colcon build --packages-up-to patchwork --cmake-args -DCMAKE_BUILD_TYPE=Release

⚙️ How to Run Patchwork

📈 Offline KITTI dataset

  1. Download the SemanticKITTI Odometry dataset (we also need the labels, since we also open-source the evaluation code! :)

  2. The dataset_path should contain a velodyne folder and a labels folder, as follows:

data_path (e.g. 00, 01, ..., or 10)
├── velodyne
│   ├── 000000.bin
│   ├── 000001.bin
│   ├── 000002.bin
│   └── ...
├── labels
│   ├── 000000.label
│   ├── 000001.label
│   ├── 000002.label
│   └── ...
└── ...
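As a quick sanity check of the layout above (a hypothetical helper, not part of the repository), you can verify that every scan has a matching label file:

```python
import os


def check_sequence_layout(seq_dir):
    """Return the scan IDs that are missing a matching .label file.

    Illustrative helper, not part of the repository; an empty result
    means the velodyne/labels layout is consistent.
    """
    scans = {f[:-len(".bin")]
             for f in os.listdir(os.path.join(seq_dir, "velodyne"))
             if f.endswith(".bin")}
    labels = {f[:-len(".label")]
              for f in os.listdir(os.path.join(seq_dir, "labels"))
              if f.endswith(".label")}
    return sorted(scans - labels)
```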

  3. Run the launch file:
ros2 launch patchwork evaluate.launch.yaml evaluate_semantickitti:=true dataset_path:=<YOUR_TARGET_SEQUENCE_DIR>
e.g.,
ros2 launch patchwork evaluate.launch.yaml evaluate_semantickitti:=true dataset_path:="/home/hyungtae_lim/semkitti/dataset/sequences/04"

🏃 Online Ground Segmentation

ros2 launch patchwork run_patchwork.launch.yaml scan_topic:=<YOUR_TOPIC_NAME> sensor_type:=<YOUR_SENSOR_TYPE>
e.g.,
ros2 launch patchwork run_patchwork.launch.yaml scan_topic:="/acl_jackal2/lidar_points" sensor_type:="velodyne16"

For a better understanding of the parameters of Patchwork, please read our wiki page, 4. IMPORTANT: Setting Parameters of Patchwork in Your Own Env.


Citation

If you use our code or method in your work, please consider citing the following:

@article{lim2021patchwork,
title={Patchwork: Concentric Zone-based Region-wise Ground Segmentation with Ground Likelihood Estimation Using a 3D LiDAR Sensor},
author={Lim, Hyungtae and Oh, Minho and Myung, Hyun},
journal={IEEE Robotics and Automation Letters},
year={2021}
}

Updates

NEWS (22.12.24)

  • Merry Christmas Eve XD! include/label_generator has been added to generate .label files following the SemanticKITTI format.
  • The .label files can be used directly in the 3DUIS benchmark.
  • Thanks to Lucas Nunes and Xieyuanli Chen for providing code snippets to save a .label file.

NEWS (22.07.25)

  • Python bindings and a more advanced version are now available in Patchwork++ as a preprocessing step for deep learning users (i.e., Python users can also use our robust ground segmentation)!

NEWS (22.07.13)

  • To make the code more convenient to use, the examples and code have been extensively revised to reflect issue #12.

NEWS (22.05.22)

  • The meaning of elevation_thresholds has been changed to improve usability; it is explained in the wiki.
  • A novel height estimator, called the All-Terrain Automatic heighT estimator (ATAT), has been added to the Patchwork code; it auto-calibrates the sensor height using the ground points in the vicinity of the vehicle/mobile robot.
    • Please refer to the function consensus_set_based_height_estimation().
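The idea behind consensus_set_based_height_estimation() can be sketched as follows. This is a simplified illustration under the assumption that the densest band of z-values among points near the robot corresponds to the ground; it is not the repository's actual implementation:

```python
from collections import Counter


def consensus_height(z_values, bin_width=0.1):
    """Sketch of consensus-set-based sensor height estimation:
    histogram the z-coordinates of points near the robot and take the
    densest bin (the consensus set) as the ground level. The sensor
    height is the negated ground z in the sensor frame."""
    bins = Counter(round(z / bin_width) for z in z_values)
    ground_bin, _ = bins.most_common(1)[0]
    return -(ground_bin * bin_width)
```

For example, with most nearby points at z ≈ -1.7 m and a few outliers from curbs or obstacles, the estimate stays at about 1.7 m because the outliers never form the densest bin.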

About

SOTA fast and robust ground segmentation using 3D point cloud (accepted in RA-L'21 w/ IROS'21)
