IncrementalFixedLagSmoother throws due to unused keys #2478
Closed
Description
Whenever a key becomes unused because its factors are removed in a call to
IncrementalFixedLagSmoother::update, an `Exception: map::at` is raised when the
smoother later tries to perform a marginalization that requires some work on the
unused key. This happens because IncrementalFixedLagSmoother does not remove
the unused key from its internal key-timestamp database.
Steps to reproduce
- Update IncrementalFixedLagSmoother, adding some values and factors.
- Update again, but now remove some factors using factorsToRemove so that
one or more values become "unused" (because they are unlinked from the rest
of the graph).
- Keep updating until a marginalization is triggered that would have involved
one of the unused keys had the factors not been removed.
Minimal (Non-)Working Example
```cpp
#include <gtsam/base/debug.h>
#include <gtsam/inference/Symbol.h>
#include <gtsam/nonlinear/IncrementalFixedLagSmoother.h>
#include <gtsam/slam/BetweenFactor.h>

int main() {
  using namespace gtsam;
  using symbol_shorthand::X;

  SETDEBUG("IncrementalFixedLagSmoother update", true);

  SharedDiagonal noise = noiseModel::Diagonal::Sigmas(Vector2(0.1, 0.1));
  typedef IncrementalFixedLagSmoother::KeyTimestampMap Timestamps;
  IncrementalFixedLagSmoother smoother(2.0, ISAM2Params());
  size_t i = 0;

  // Add a prior at time 0
  {
    Key key0 = X(0);
    NonlinearFactorGraph newFactors;
    Values newValues;
    Timestamps newTimestamps;
    newFactors.addPrior(key0, Point2(0.0, 0.0), noise);
    newValues.insert(key0, Point2(0.01, 0.01));
    newTimestamps[key0] = 0.0;
    smoother.update(newFactors, newValues, newTimestamps);
    ++i;
  }

  FactorIndices factorsToRemove;
  size_t ref_i = 0;

  // Helper lambda to set up a usage of the smoother where we add new values
  // and update at a certain frequency, but do not keep all added values in
  // the smoother long term. This is achieved by storing the indices of the
  // new factors being added and using them to remove the factors later. The
  // removal of factors may cause keys to become unused and be removed from
  // the smoother.
  const auto add_x_keep_every_y = [&](size_t n_steps, size_t keep_interval) {
    for (auto j = 0ul; j < n_steps; j++) {
      Key key1 = X(ref_i);
      Key key2 = X(i);
      NonlinearFactorGraph newFactors;
      Values newValues;
      Timestamps newTimestamps;
      newFactors.push_back(
          BetweenFactor<Point2>(key1, key2, Point2(1.0, 0.0), noise));
      newFactors.addPrior(key2, Point2(double(i), 0.0), noise);
      newValues.insert(key2, Point2(double(i) + 0.1, -0.1));
      newTimestamps[key2] = double(i);
      smoother.update(newFactors, newValues, newTimestamps, factorsToRemove);
      // Store indices of the new factors so we're able to remove them later,
      // if needed.
      factorsToRemove = smoother.getISAM2Result().newFactorsIndices;
      // Decide if we want to keep the new value in the smoother
      if ((j + 1) % keep_interval == 0) {
        // We want to keep it. Clear factorsToRemove so newValue stays
        // connected to the graph.
        factorsToRemove.clear();
        // Store new reference
        ref_i = i;
      }
      ++i;
    }
  };

  // Add odometry from time 0 to time 4, with a prior factor on each new
  // value. Keep every second new value.
  //
  // This is what is happening:
  //
  // - 1st iteration: Add x1
  //   x0 x1
  //   ^------------^
  //   (time window)
  //
  // - 2nd iteration: Add x2 and remove x1 (because unlinked)
  //   x0 (x1) x2
  //   ^------------^
  //
  // - 3rd iteration: Add x3 and remove x0 (due to marginalization). This
  //   will cause an exception because x1 will be sought but no longer
  //   exists in the smoother.
  //
  //   (x0) (x1) x2 x3
  //        ^------------^
  add_x_keep_every_y(4, 2);

  return 0;
}
```
Output
```
IncrementalFixedLagSmoother::update() Start
Bayes Tree Before Update:
{Empty Tree}
END
Current Timestamp: 0
Marginalizable Keys:
Constrained Keys:
Bayes Tree After Update, Before Marginalization:
P( x0 )
END
Final Bayes Tree:
P( x0 )
END
IncrementalFixedLagSmoother::update() Finish
IncrementalFixedLagSmoother::update() Start
Bayes Tree Before Update:
P( x0 )
END
Current Timestamp: 1
Marginalizable Keys:
Constrained Keys:
Bayes Tree After Update, Before Marginalization:
P( x0 x1 )
END
Final Bayes Tree:
P( x0 x1 )
END
IncrementalFixedLagSmoother::update() Finish
IncrementalFixedLagSmoother::update() Start
Bayes Tree Before Update:
P( x0 x1 )
END
Current Timestamp: 2
Marginalizable Keys:
Constrained Keys:
Bayes Tree After Update, Before Marginalization:
P( x0 x2 )
END
Final Bayes Tree:
P( x0 x2 )
END
IncrementalFixedLagSmoother::update() Finish
IncrementalFixedLagSmoother::update() Start
Bayes Tree Before Update:
P( x0 x2 )
END
Current Timestamp: 3
Marginalizable Keys: x0
Constrained Keys: x0(0) x1(1) x2(1) x3(1)
terminate called after throwing an instance of 'std::out_of_range'
  what():  map::at
Aborted (core dumped)
```
Expected behavior
Do not throw an exception: IncrementalFixedLagSmoother should be able to
handle values being removed from it (i.e. a value becoming "unlinked" from the
graph when factors are removed in an update).
Environment
Fedora 42, C++. Bug confirmed to be present at least on 4.2a9, 4.3a0, and the
current develop branch (commit 75b421980).
Additional information
Bug-fix proposed on #2474