Conversation
dellaert
left a comment
The CI fails. That's probably easy to fix, but a bigger problem is that I don't really understand what you're saying about "re-introducing information". The whole point of dead mode removal is getting rid of information, so I think this needs more explanation. Maybe you can give an example in the PR comment that is also implemented in a unit test.
gtsam/discrete/DiscreteBayesNet.cpp
Outdated
```diff
 /* ************************************************************************* */
 // The implementation is: build the entire joint into one factor and then prune.
-// TODO(Frank): This can be quite expensive *unless* the factors have already
+// NOTE: This can be quite expensive *unless* the factors have already
```
Please add my name back. For any notes/TODOs we should have a name.
```cpp
for (DiscreteKey dkey : newFactors.discreteKeys()) {
  Key key = dkey.first;
  if (fixedValues_.find(key) != fixedValues_.end()) {
    // Add corresponding discrete factor to reintroduce the information
```
I updated the PR comment. I can add a unit test later if that's okay? I'll put it in as a TODO.
> OK but then I guess you have to keep track of which variables were removed? That can be a pretty long list. And where do you keep this list? Or maybe I'm still not understanding.

We keep track of those variables in a `DiscreteValues` object called `fixedValues_`. This is why this PR was pretty straightforward once I figured out the issue. :)
- Updated `TableFactor` to only record probabilities above 1e-11. This helps keep the `TableFactor` smaller and improves sparsity.
- Added `DiscreteBayesNet::joint` as a new method since it is a common operation.

Here is an illustrative example. Say we have a HybridBayesNet with the conditionals

$BN = P(X(0) | M(0))P(M(0))$

where `X(0)` is a continuous variable and `M(0)` is a discrete variable. Assuming $P(M(0))$ was $\{0.1, 0.9\}$, then DMR (with threshold 0.85) would remove this conditional and assign `M(0)=1`.

In the next update step, we add the factor $\phi(X(1), M(0))$ and the discrete measurement $\phi(M(0))$. Note that we have reintroduced `M(0)` and there is new information. From our previous step, `M(0) = 1`, but the new factors might make `M(0)=0` the better assignment. The way I handle that is to reintroduce $P(M(0)) = \{0.15, 0.85\}$ so we get the graph

$FG = \phi(X(1), M(0)) \, \phi(M(0)) \, P(M(0))$.

This way, when we eliminate, the information from the previous and current steps is fused, giving us a better estimate of `M(0)`.