This repository provides a Dockerized ROS 2 conversion pipeline that exports rosbag2 data to omega-prime .mcap files.
The conversion pipeline is based on the perception_interfaces. It scans rosbag2 recordings, reads EgoData and ObjectList topics, and resolves /tf + /tf_static transforms into a configurable fixed frame. The converter writes one .mcap per bag and supports optional OpenDRIVE map embedding and schema validation for downstream analytics workflows.
For further processing of resulting omega-prime files, see the main omega-prime repository.
You need Docker installed to convert ROS 2 bags to omega-prime MCAP files.
Use the Docker image to run the converter. It automatically discovers rosbag2 folders and writes omega-prime files to the output folder.
- Build or pull the `omega-prime-ros` image
- Mount your ROS bag folder to `/input`
- Mount an output directory to `/output`
- Optionally mount `/map/map.xodr` to embed OpenDRIVE map data
- Set at least one topic (`EGO_DATA_TOPIC` and/or `OBJECT_LIST_TOPIC`)
- Run the container
```bash
docker run --rm -it \
  -e EGO_DATA_TOPIC=</your/ego_data_topic> \
  -e OBJECT_LIST_TOPIC=</your/object_list_topic> \
  -v <path/to/bags>:/input \
  -v </path/to/map.xodr>:/map/map.xodr \
  -v "$PWD"/output:/output \
  ghcr.io/ika-rwth-aachen/omega-prime-ros:latest
```

We provide an exemplary ROS 2 bag file from simulation, which can be used to generate an omega-prime file.
```bash
docker run --rm \
  -e EGO_DATA_TOPIC=/simulation/ego_data \
  -e OBJECT_LIST_TOPIC=/simulation/object_list \
  -v "$PWD"/example:/input \
  -v "$PWD"/output:/output \
  ghcr.io/ika-rwth-aachen/omega-prime-ros:latest
```

The converter writes an `.omega-prime.mcap` file to the `output/` directory.
Environment variables and CLI flags:
- `BAG_DIR` / `--bag-dir` (default `/input`)
- `OP_DIR` / `--op-dir` (default `/output`)
- `EGO_DATA_TOPIC` / `--ego_data_topic`
- `OBJECT_LIST_TOPIC` / `--object_list_topic`
- `FIXED_FRAME` / `--fixed_frame` (default `utm_32N`)
- `MAP` / `--map` (default `/map/map.xodr`)
- `BAG` / `--bag`: process explicit bags (supports comma-separated paths)
- `VALIDATE` / `--validate`: enable omega-prime schema validation
- `WARN_GAP_SECONDS` / `--warn-gap-seconds`: warning threshold in seconds for time gaps when the same object ID appears multiple times
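Each setting above can be supplied either as an environment variable or as a CLI flag. A minimal sketch of how such a lookup could be resolved — the helper name and the precedence shown here (CLI flag over environment variable over built-in default) are assumptions for illustration, not taken from the converter's source:

```python
import os

# Hypothetical helper (not the converter's actual code): resolve a setting
# from a CLI value, an environment variable, or a built-in default.
def resolve_setting(env_name: str, cli_value=None, default=None):
    if cli_value is not None:
        # An explicitly passed CLI flag wins (assumed precedence).
        return cli_value
    # Otherwise fall back to the environment variable, then the default.
    return os.environ.get(env_name, default)

bag_dir = resolve_setting("BAG_DIR", default="/input")
```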
- The converter scans `/input` for rosbag2 directories containing a `metadata.yaml` and writes one omega-prime `.mcap` per bag into `/output` by default.
- For large bags, ensure sufficient RAM.
- The converter reads `/tf` and `/tf_static` and resolves each EgoData and ObjectList message frame against the configured `fixed_frame`.
- The `fixed_frame` should be the georeferenced top-level ROS coordinate frame (TF root), for example a global UTM/world frame.
- When `fixed_frame=map`, the map must be parsed and the map projection string is used.
- These transforms are stored in omega-prime as per-timestamp `ProjectionOffset` metadata.
- The fixed frame is converted to an EPSG projection string and written as `projections["proj_string"]`.
- Supported `fixed_frame` values: `utm_<zone: int>[N/S]` and `map` (e.g. `utm_30N`).
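The `utm_<zone>[N/S]` naming maps directly onto WGS 84 / UTM EPSG codes (326xx for northern zones, 327xx for southern). A small sketch of that conversion — illustrative only; the converter's actual parsing and function names may differ:

```python
import re

def fixed_frame_to_epsg(fixed_frame: str) -> str:
    """Map a `utm_<zone>[N/S]` frame name to a WGS 84 / UTM EPSG string.

    Illustrative sketch, not the converter's actual implementation.
    """
    m = re.fullmatch(r"utm_(\d{1,2})([NS])", fixed_frame)
    if not m:
        raise ValueError(f"unsupported fixed_frame: {fixed_frame!r}")
    zone, hemisphere = int(m.group(1)), m.group(2)
    if not 1 <= zone <= 60:
        raise ValueError(f"UTM zone out of range: {zone}")
    # WGS 84 / UTM: EPSG 32600 + zone (north), 32700 + zone (south)
    base = 32600 if hemisphere == "N" else 32700
    return f"EPSG:{base + zone}"
```

For example, the default `utm_32N` corresponds to `EPSG:32632`.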
The provided image bundles ROS 2 Jazzy, rosbag2 Python bindings, omega-prime, and builds perception_interfaces from GitHub so EgoData and ObjectList topics can be exported to omega-prime MCAP.
- `OMEGA_PRIME_VERSION` (default `latest`): PyPI version to install
- `PERCEPTION_INTERFACES_VERSION` (optional): commit/branch/tag; if unset, the repo default branch is used
```bash
docker build -t ghcr.io/ika-rwth-aachen/omega-prime-ros:latest \
  --build-arg OMEGA_PRIME_VERSION=latest \
  --build-arg PERCEPTION_INTERFACES_VERSION=<commit-or-branch> \
  -f Dockerfile .
```

This package is developed as part of the SYNERGIES project.
Funded by the European Union. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or European Climate, Infrastructure and Environment Executive Agency (CINEA). Neither the European Union nor the granting authority can be held responsible for them.
Important
The project is open-sourced and maintained by the Institute for Automotive Engineering (ika) at RWTH Aachen University. We cover a wide variety of research topics within our Vehicle Intelligence & Automated Driving domain. If you would like to learn more about how we can support your automated driving or robotics efforts, feel free to reach out to us! Contact: opensource@ika.rwth-aachen.de