Peiyu Chen†, Fuling Lin†, Weipeng Guan, Peng Lu*
Adaptive Robotic Controls Lab (ArcLab), The University of Hong Kong.
SuperEIO is a novel event-based visual-inertial odometry framework that leverages self-supervised learning networks to enhance the accuracy and robustness of ego-motion estimation. Our event-only feature detection employs a convolutional neural network on continuous event streams, and our system adopts a graph neural network for event descriptor matching in loop closure. The system uses TensorRT to accelerate the inference of the deep networks, ensuring low-latency processing and robust real-time operation on resource-limited platforms. We evaluate our method extensively on multiple challenging public datasets, particularly in high-speed motion and high-dynamic-range scenarios, demonstrating superior accuracy and robustness compared to other state-of-the-art event-based methods.
We test SuperEIO on Ubuntu 20.04. Before building SuperEIO, you should install the following dependencies:
- Ceres 1.14.0
- OpenCV 4.2
- Eigen 3
- TensorRT 8.4.1.5
- CUDA 11.6
- ROS noetic
Other event camera drivers are stored in the `dependencies` folder.
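The exact installation steps depend on your machine, but as a rough sketch (package names assumed for Ubuntu 20.04, not taken from this repo), the apt-installable dependencies and a source build of Ceres can be set up as follows; CUDA 11.6 and TensorRT 8.4.1.5 must be installed separately from NVIDIA's developer site:

```bash
# ROS Noetic, OpenCV, Eigen, and Ceres build dependencies from Ubuntu 20.04 repos
sudo apt update
sudo apt install ros-noetic-desktop-full libopencv-dev libeigen3-dev \
    libgoogle-glog-dev libgflags-dev libsuitesparse-dev

# Ceres 1.14.0 built from source
wget http://ceres-solver.org/ceres-solver-1.14.0.tar.gz
tar xzf ceres-solver-1.14.0.tar.gz
cd ceres-solver-1.14.0 && mkdir build && cd build
cmake .. && make -j$(nproc)
sudo make install
```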
```bash
mkdir -p ~/catkin_ws_supereio/src
cd ~/catkin_ws_supereio
catkin config --init --mkdirs --extend /opt/ros/noetic --merge-devel --cmake-args -DCMAKE_BUILD_TYPE=Release
cd ~/catkin_ws_supereio/src
git clone [email protected]:your-repo/SuperEIO.git --recursive
```
After that, run `source ~/.bashrc` and then the `supereiobuild` command in your terminal.
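Note that `supereiobuild` is presumably a shell alias or function defined in your `~/.bashrc` (which is why it is sourced first). If your setup does not define one, a minimal sketch of such an alias, assuming a standard catkin workflow, would be:

```bash
# hypothetical alias; adjust the workspace path to your setup
alias supereiobuild='cd ~/catkin_ws_supereio && catkin build'
```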
You can test SuperEIO on the hku_agg_translation sequence. After downloading the bag file, just run the example:
```bash
roslaunch supereio_estimator supereio.launch
rosbag play YOUR_DOWNLOADED.bag
```
To run the system on your own dataset, you need to create a corresponding configuration folder and YAML file in the `config` directory. Then set your camera intrinsics, event/IMU topics, and the extrinsic transformation between the event camera and IMU in the YAML file. For the extrinsic calibration, we recommend following the link (DVS-IMU Calibration and Synchronization) to calibrate your sensors.
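For illustration only, a minimal configuration might look like the sketch below; all field names and values are hypothetical placeholders and must be adapted to whatever the estimator actually parses:

```yaml
# hypothetical example config -- field names may differ in the released code
event_topic: "/dvs/events"
imu_topic: "/dvs/imu"

# sensor resolution (e.g., DAVIS346)
image_width: 346
image_height: 260

# pinhole intrinsics: fx, fy, cx, cy
projection_parameters: [250.0, 250.0, 173.0, 130.0]
# radial-tangential distortion: k1, k2, p1, p2
distortion_parameters: [0.0, 0.0, 0.0, 0.0]

# 4x4 extrinsic transform from event camera frame to IMU frame (row-major)
T_imu_cam: [1.0, 0.0, 0.0, 0.0,
            0.0, 1.0, 0.0, 0.0,
            0.0, 0.0, 1.0, 0.0,
            0.0, 0.0, 0.0, 1.0]
```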
Following that, you can execute the provided command to run SuperEIO on your dataset:
```bash
roslaunch supereio_estimator supereio.launch
rosbag play YOUR_BAG.bag
```
We present the network architectures of our deep event feature detector and descriptor matcher, along with visualizations demonstrating their detection and matching performance.
SuperEIO is available on arXiv.
```bibtex
@article{SuperEIO,
  title={SuperEIO: Self-Supervised Event Feature Learning for Event Inertial Odometry},
  author={Chen, Peiyu and Lin, Fuling and Guan, Weipeng and Lu, Peng},
  journal={arXiv preprint arXiv:2503.22963},
  year={2025}
}
```
If SuperEIO has helped in your research or work, a simple star or a citation of our work would be the best affirmation for us. 😊
The full codebase will be released upon paper acceptance. For immediate inquiries, please contact the authors.
This work was supported by the General Research Fund under Grant 17204222, and in part by the Seed Fund for Collaborative Research and General Funding Scheme-HKU-TCL Joint Research Center for Artificial Intelligence. We gratefully acknowledge sair-lab/AirSLAM for providing the SuperPoint TensorRT acceleration template, which significantly enhanced the computational efficiency of our system.
The source code is released under the GPLv3 license. We are still working on improving the reliability of the code. If you are interested in our project for commercial use, please contact Dr. Peng LU for further communication.