Description
Issue:
With Sync Mode:On, the framesets received by BaseRealSenseNode::frame_callback sometimes do not contain both depth and color frames, which results in no publication of the aligned_depth_to_color and pointcloud2 ROS messages.
The color frame sometimes arrives with a slightly later timestamp than the depth frame, and rs2::asynchronous_syncer then treats the two as separate framesets. This behaviour of asynchronous_syncer is corroborated by this issue: IntelRealSense/librealsense#2662
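For reference, the same frame flow can be reproduced outside the wrapper: each sensor callback pushes individual frames into an rs2::asynchronous_syncer, and the syncer's callback receives whatever it managed to match. Below is a minimal standalone sketch of that flow, not the wrapper's actual code; restricting the streams to the default depth and color profiles and the completeness check are my own illustration.

```cpp
#include <librealsense2/rs.hpp>
#include <chrono>
#include <iostream>
#include <thread>
#include <vector>

int main()
{
    rs2::context ctx;
    auto devices = ctx.query_devices();
    if (devices.size() == 0) return 1;
    rs2::device dev = devices.front();

    rs2::asynchronous_syncer syncer;

    // The syncer delivers whatever it could match into a frameset; with
    // Sync Mode:On the color frame is occasionally missing from it.
    syncer.start([](rs2::frame f) {
        auto fs = f.as<rs2::frameset>();
        if (!fs) return;
        bool has_depth = bool(fs.get_depth_frame());
        bool has_color = bool(fs.get_color_frame());
        if (!has_depth || !has_color)
            std::cout << "incomplete frameset: depth=" << has_depth
                      << " color=" << has_color << std::endl;
    });

    // Open the default depth and color profiles and push the raw frames
    // straight into the syncer, mirroring frame_callback_function = _syncer.
    std::vector<rs2::sensor> started;
    for (auto sensor : dev.query_sensors())
    {
        std::vector<rs2::stream_profile> wanted;
        for (auto& p : sensor.get_stream_profiles())
            if (p.is_default() && (p.stream_type() == RS2_STREAM_DEPTH ||
                                   p.stream_type() == RS2_STREAM_COLOR))
                wanted.push_back(p);
        if (wanted.empty()) continue;
        sensor.open(wanted);
        sensor.start([&syncer](rs2::frame frame) { syncer.invoke(std::move(frame)); });
        started.push_back(sensor);
    }

    std::this_thread::sleep_for(std::chrono::seconds(10));
    for (auto& sensor : started) { sensor.stop(); sensor.close(); }
    return 0;
}
```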
Within ROS, I use message_filters with an ExactTime policy (http://wiki.ros.org/message_filters) to retrieve synchronised pointcloud2, camera_info, color and depth messages, so whenever aligned_depth_to_color and pointcloud2 are not published the synchroniser delivers nothing at all. These dropped frames are currently causing issues in my application. I would prefer to receive full framesets, even if the color frame is slightly out of sync, rather than have the frames dropped.
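For completeness, the consumer side looks roughly like the following; the topic names are assumptions based on the default camera namespace, so treat this as a sketch of the ExactTime setup rather than the exact node:

```cpp
#include <ros/ros.h>
#include <boost/bind.hpp>
#include <message_filters/subscriber.h>
#include <message_filters/synchronizer.h>
#include <message_filters/sync_policies/exact_time.h>
#include <sensor_msgs/CameraInfo.h>
#include <sensor_msgs/Image.h>
#include <sensor_msgs/PointCloud2.h>

using ExactPolicy = message_filters::sync_policies::ExactTime<
    sensor_msgs::PointCloud2, sensor_msgs::CameraInfo,
    sensor_msgs::Image, sensor_msgs::Image>;

void callback(const sensor_msgs::PointCloud2ConstPtr& cloud,
              const sensor_msgs::CameraInfoConstPtr& info,
              const sensor_msgs::ImageConstPtr& color,
              const sensor_msgs::ImageConstPtr& depth)
{
    // Fires only when all four messages carry the exact same stamp; when
    // aligned_depth_to_color/pointcloud2 are not published, nothing arrives.
}

int main(int argc, char** argv)
{
    ros::init(argc, argv, "exact_time_consumer");
    ros::NodeHandle nh;

    // Assumed default topic names; adjust to the actual camera namespace.
    message_filters::Subscriber<sensor_msgs::PointCloud2> cloud_sub(nh, "/camera/depth/color/points", 1);
    message_filters::Subscriber<sensor_msgs::CameraInfo> info_sub(nh, "/camera/color/camera_info", 1);
    message_filters::Subscriber<sensor_msgs::Image> color_sub(nh, "/camera/color/image_raw", 1);
    message_filters::Subscriber<sensor_msgs::Image> depth_sub(nh, "/camera/aligned_depth_to_color/image_raw", 1);

    message_filters::Synchronizer<ExactPolicy> sync(ExactPolicy(10), cloud_sub, info_sub, color_sub, depth_sub);
    sync.registerCallback(boost::bind(&callback, _1, _2, _3, _4));

    ros::spin();
    return 0;
}
```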
Mitigation:
An aggregator filter (https://github.com/IntelRealSense/librealsense/blob/HEAD/src/pipeline/aggregator.h), which completes an incomplete frameset with the most recently received frame of each missing stream, has already been implemented for rs2::pipeline. Would it be possible to port it into the wrapper as another filter?
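For comparison, when the same streams are consumed through rs2::pipeline, that aggregator stage keeps the last received frame of each enabled stream, so wait_for_frames() keeps returning complete framesets even when the color frame lags behind. A minimal sketch of that consumer-side behaviour, which is what I would like the wrapper to offer as well (the stream selection below is just an example):

```cpp
#include <librealsense2/rs.hpp>
#include <iostream>

int main()
{
    rs2::pipeline pipe;
    rs2::config cfg;
    cfg.enable_stream(RS2_STREAM_DEPTH);
    cfg.enable_stream(RS2_STREAM_COLOR);
    pipe.start(cfg);

    for (int i = 0; i < 100; ++i)
    {
        // The pipeline's internal aggregator re-uses the last color frame when
        // a fresh one has not arrived yet, so both streams are present here.
        rs2::frameset fs = pipe.wait_for_frames();
        std::cout << "depth ts: " << fs.get_depth_frame().get_timestamp()
                  << ", color ts: " << fs.get_color_frame().get_timestamp() << std::endl;
    }
    return 0;
}
```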
I'm currently working around the issue with the patch below, but would love to see a proper solution merged into main:
diff --git a/realsense2_camera/src/base_realsense_node.cpp b/realsense2_camera/src/base_realsense_node.cpp
index b654fc3..a589bc7 100644
--- a/realsense2_camera/src/base_realsense_node.cpp
+++ b/realsense2_camera/src/base_realsense_node.cpp
@@ -873,8 +873,40 @@ void BaseRealSenseNode::setupDevice()
{
frame_callback_function = _syncer;
+ rs2::filter* aggregate_filter_ptr = new rs2::filter([this](rs2::frame f, rs2::frame_source& src) {
+ auto frameset = f.as<rs2::frameset>();
+ std::vector<rs2::frame> bundle;
+ bool has_color = false;
+ for (auto it = frameset.begin(); it != frameset.end(); ++it) {
+ auto member_frame = (*it);
+ bundle.push_back(member_frame);
+
+ if (member_frame.get_profile().stream_type() == RS2_STREAM_COLOR) {
+ has_color = true;
+ aggregate_filter_color_history.clear();
+ aggregate_filter_color_history.push_back(member_frame);
+ }
+ }
+
+ if (!has_color && !aggregate_filter_color_history.empty()) {
+ bundle.push_back(aggregate_filter_color_history[0]);
+ }
+
+ auto fs = src.allocate_composite_frame(bundle);
+ src.frame_ready(fs);
+ });
+ aggregate_filter = std::shared_ptr<rs2::filter>(aggregate_filter_ptr);
+
auto frame_callback_inner = [this](rs2::frame frame){
- frame_callback(frame);
+ if (frame.is<rs2::frameset>()) {
+ frame = aggregate_filter->process(frame);
+ auto frameset = frame.as<rs2::frameset>();
+ if (frameset.size() > 1) {
+ frame_callback(frame);
+ }
+ } else {
+ frame_callback(frame);
+ }
};
_syncer.start(frame_callback_inner);
}
diff --git a/realsense2_camera/include/base_realsense_node.h b/realsense2_camera/include/base_realsense_node.h
index 42b2d2c..2c2514f 100644
--- a/realsense2_camera/include/base_realsense_node.h
+++ b/realsense2_camera/include/base_realsense_node.h
@@ -302,6 +302,8 @@ namespace realsense2_camera
std::string _filters_str;
stream_index_pair _pointcloud_texture;
PipelineSyncer _syncer;
+ std::vector<rs2::frame> aggregate_filter_color_history;
+ std::shared_ptr<rs2::filter> aggregate_filter;
std::vector<NamedFilter> _filters;
std::shared_ptr<rs2::filter> _colorizer, _pointcloud_filter;
std::vector<rs2::sensor> _dev_sensors;