Conversation
@dehann dehann commented Dec 17, 2025

No description provided.

@dehann dehann added the enhancement, refactor, and design labels Dec 17, 2025
@dehann dehann added this to the v0.11.0 milestone Dec 17, 2025
@dehann dehann self-assigned this Dec 17, 2025

Statistics.mean(m::MvNormalKernel) = m.μ # equivalent to mean(m.p)
Statistics.cov(m::MvNormalKernel) = cov(m.p) # note also the cached m.sqrt_iΣ
Statistics.std(m::MvNormalKernel) = sqrt(cov(m)) # matrix square root of the covariance (not of the inverse)
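For context, here is a minimal self-contained sketch of a kernel type for which the three overloads above behave as commented. The concrete fields (μ, Σ) and the `DemoKernel` name are illustrative assumptions, not the actual ManellicTree definition:

```julia
using Statistics, LinearAlgebra

# Illustrative stand-in for the real MvNormalKernel; the fields are assumptions.
struct DemoKernel{T}
    μ::Vector{T}   # mean point
    Σ::Matrix{T}   # covariance matrix
end

Statistics.mean(m::DemoKernel) = m.μ
Statistics.cov(m::DemoKernel) = m.Σ
Statistics.std(m::DemoKernel) = sqrt(m.Σ)  # matrix square root (not of the inverse)

k = DemoKernel([0.0, 1.0], [4.0 0.0; 0.0 9.0])
```

For this diagonal example `std(k)` is the elementwise square root, `[2 0; 0 3]`, which matches the "regular sqrt (not of inverse)" comment in the snippet.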
Member Author
@dehann dehann Dec 17, 2025
Hi @Affie, I've been working through ManellicTree as a prototype for HomotopyBelief. I'm calling here to check for discussion points on:

Member

For factors, the common entry point is getMeasurementParametric, which calls mean and invcov.
For variables, .val and .bw are used directly.

Member Author
@dehann dehann Dec 17, 2025

Do you have a good recent example of using .val, please? I'm trying to figure out whether it should be renamed to getPoints or getModes, etc.

I'm looking for a non-dehann usage example if you have one, please.

Member Author
@dehann dehann Dec 17, 2025

Okay, thanks -- so I think the way to go here is something like modes = getMixture(_, nmodes=1), which returns a Vector{ConcentratedGaussian{M}} (the legacy name is MvNormalKernel). From there manipoint := getMean(modes[1]) and sqrt(1/Σ^2) := getInformationmat(modes[1]); also 1/Σ^2 := getPrecisionmat(modes[1]).

There is some nuance in how getMixture works. I think it should just return the kernels (i.e. the mixture) at depth = ceil(log2(nmodes)) in the belief tree.

This also means that convenience wrappers can exist (though we would miss the opportunity to promote the new solver), e.g. getMean(getBelief(getVariable(dfg,:w_X1))).

My sense at the moment is that we should promote getBelief and getMixture as the front-facing API. We can then find a way to document getPoints(belief) or cov(getMixture(_, 1)). This would also mean happenstance equivalents such as getPoints(getMixture(fg, lb, nmodes=1))[1] == getMean(getBelief(fg, lb)), where the second is the least desirable signature in terms of overall UX.
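The depth selection described above can be sketched as follows, under the assumption of a balanced binary belief tree stored in array-heap order (root at index 1); getMixture itself and this tree layout are proposals and assumptions from the thread, not existing API:

```julia
# Which heap level a getMixture(_, nmodes=n) call would read from,
# assuming a balanced binary tree of kernels in array-heap order (root at 1).
nmodes = 4
depth = ceil(Int, log2(nmodes))        # depth 2 holds up to 2^2 = 4 kernels
level = (2^depth):(2^(depth + 1) - 1)  # heap indices of the kernels at that depth
```

For nmodes=1 this degenerates to depth 0, i.e. the single root kernel, matching the modes = getMixture(_, nmodes=1) case above.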

@dehann dehann requested a review from Affie December 17, 2025 13:40
Member Author
dehann commented Dec 17, 2025

Adding assignee for project board filters.

- FIXME: use manifold mean and cov calculation instead
- TODO: use a recursive power series for the next largest eigenvector down the depth of the tree, for efficiency
"""
function splitPointsEigen(
Member Author

Note, I already did some previous work towards the homotopy concept by using eigen to find the next largest variance direction down the tree. Just adding the note here that a much more efficient approach would be to extract only the largest eigenvector via Krylov methods instead of recalculating the whole eigen decomposition each time.
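As an illustration of the suggestion (not the package's implementation), the dominant eigenvector of a covariance can be pulled out with a simple power iteration, the basic building block behind the Krylov methods mentioned, without forming the full eigendecomposition:

```julia
using LinearAlgebra

# Minimal power iteration: returns (λ, v) for the largest-eigenvalue pair of a
# symmetric positive semidefinite matrix. A real implementation would use a
# Krylov package (e.g. KrylovKit.jl); this is only a sketch of the idea.
function dominant_eigpair(Σ::AbstractMatrix; iters::Int=500, tol::Float64=1e-12)
    v = normalize(ones(size(Σ, 1)))  # deterministic start; fine unless orthogonal to the answer
    λ = 0.0
    for _ in 1:iters
        w = Σ * v
        λnew = norm(w)
        v = w / λnew
        abs(λnew - λ) < tol && return (λnew, v)
        λ = λnew
    end
    return (λ, v)
end

Σ = [4.0 1.0; 1.0 3.0]
λ, v = dominant_eigpair(Σ)  # largest-variance direction of Σ
```

Each iteration costs one matrix-vector product, so only the top eigenpair is ever computed, which is the efficiency gain the comment above is after.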



3 participants