4 changes: 2 additions & 2 deletions src/entities/KernelEval.jl
@@ -2,9 +2,9 @@
abstract type AbstractKernel end

@kwdef struct MvNormalKernel{P, T, M, iM} <: AbstractKernel
""" On-manifold point representing center (mean) of the MvNormal distrubtion """
""" On-manifold point representing center (mean) of the MvNormal distribution """
μ::P
""" Zero-mean normal distrubtion with covariance """
""" Zero-mean normal distribution with covariance """
p::MvNormal{T, M}
# TDB might already be covered in p.Σ.chol but having issues with SymPD (not particular to this AMP repo)
""" Manually maintained square root concentration matrix for faster compute, TODO likely duplicate of existing Distrubtions.jl functionality. """
70 changes: 42 additions & 28 deletions src/services/KernelEval.jl
@@ -1,8 +1,49 @@

## =============================================================================
## Function Overloads

# NOTES,
# - ManellicTree kernel types have mean and cov methods for easy access
# - ManellicTree currently only supports MvNormalKernel types

Statistics.mean(m::MvNormalKernel) = m.μ # mean(m.p)
Statistics.cov(m::MvNormalKernel) = cov(m.p) # note also about m.sqrt_iΣ
Statistics.std(m::MvNormalKernel) = sqrt(cov(m)) # regular sqrt (not of inverse)
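As a reference for how these overloads behave, here is a minimal standalone sketch; `DemoKernel` is a hypothetical stand-in for `MvNormalKernel`, with the zero-mean `MvNormal` field `p` replaced by a bare covariance matrix:

```julia
using Statistics, LinearAlgebra

# Hypothetical stand-in for MvNormalKernel, for illustration only
struct DemoKernel
    μ::Vector{Float64}   # on-manifold center point (the mean)
    Σ::Matrix{Float64}   # covariance of the zero-mean distribution
end
Statistics.mean(k::DemoKernel) = k.μ
Statistics.cov(k::DemoKernel) = k.Σ
Statistics.std(k::DemoKernel) = sqrt(cov(k))  # matrix square root of Σ, not elementwise

k = DemoKernel([0.0, 1.0], [0.25 0.0; 0.0 4.0])
mean(k)  # [0.0, 1.0]
cov(k)   # [0.25 0.0; 0.0 4.0]
std(k)   # ≈ [0.5 0.0; 0.0 2.0]
```

Note `std` here is the matrix square root of the covariance, consistent with the "regular sqrt (not of inverse)" comment above, rather than an elementwise operation.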
@dehann (Member, Author) commented on Dec 17, 2025:
Hi @Affie, I've been working through ManellicTree as a prototype for HomotopyBelief. I'm calling here to check for discussion points on:

Member commented:

For factors, the common entry point is getMeasurementParametric, which calls mean and invcov.
For variables, .val and .bw are used directly.

@dehann (Member, Author) commented on Dec 17, 2025:

Do you have a good recent example of using .val, please? I'm trying to figure out whether it should be renamed to getPoints or getModes, etc.

I'm looking for a usage example not written by me (dehann), if you have one please.

Member commented: (comment content not captured — likely a link or image)
@dehann (Member, Author) commented on Dec 17, 2025:

okay thanks -- so I think the way to go here is something like modes = getMixture(_, nmodes=1), which returns a Vector{ConcentratedGaussian{M}} (the legacy name is MvNormalKernel). From there manipoint := getMean(modes[1]) and sqrt(1/Σ^2) := getInformationmat(modes[1]); also 1/Σ^2 := getPrecisionmat(modes[1]).

There is some nuance to how getMixture works. I think it should just return the kernels (i.e. the mixture) at depth = ceil(log2(nmodes)) in the belief tree.

This also means that convenience wrappers can exist (though we would miss the opportunity to promote the new solver), e.g. getMean(getBelief(getVariable(dfg,:w_X1))).

My sense at the moment is that we should promote getBelief and getMixture as the front-facing API. We can then find a way to document getPoints(belief) or cov(getMixture(_, 1)). This would also mean happenstance equivalents such as getPoints(getMixture(fg, lb, nmodes=1))[1] == getMean(getBelief(fg, lb)), where the second is the least desirable signature in terms of overall UX.
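To make the depth rule concrete — noting that all names in this thread (getMixture, getBelief, ConcentratedGaussian, etc.) are proposals rather than existing API — the level-selection arithmetic alone can be sketched as:

```julia
# Proposed rule from the comment above: return the kernels at
# depth = ceil(log2(nmodes)) in the belief tree (root at depth 0)
mixture_depth(nmodes::Integer) = ceil(Int, log2(nmodes))

mixture_depth(1)  # 0 -> the single root kernel (a unimodal summary)
mixture_depth(2)  # 1 -> the root's two children
mixture_depth(3)  # 2 -> rounds up to the next full level
mixture_depth(4)  # 2
```

Since a full level at depth d holds 2^d kernels, a request for nmodes=3 would be served from the 4-kernel level.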



function Base.show(io::IO, mvk::MvNormalKernel)
μ = mean(mvk)
Σ2 = cov(mvk)
# Σ=sqrt(Σ2)
d = size(Σ2, 1)
print(io, "MvNormalKernel(d=", d)
print(io, ",μ=", round.(μ; digits = 3))
print(io, ",Σ^2=[", round(Σ2[1]; digits = 3))
if 1 < d
print(io, "...")
end
# det(T-I) is a proxy, via the volume measure of the transform from the unit covariance matrix to this instance,
# i.e. how large or rotated this covariance instance is
println(
io,
"]); det(T-I)=",
round(det(covTransformNormalized(Σ2) - diagm(ones(d))); digits = 3),
)
# ; det(Σ)=",round(det(Σ);digits=3), "
return nothing
end

Base.show(io::IO, ::MIME"text/plain", mvk::MvNormalKernel) = show(io, mvk)



## =============================================================================
## Various kernel accessors and functions

# also makes static
function projectSymPosDef(c::AbstractMatrix)
s = size(c)
# pretty fast to make or remake isbitstype form matrix
# pretty fast to make or remake isbitstype from matrix
_c = SMatrix{s...}(c)
#TODO likely not intended project here: see AMP#283
return issymmetric(_c) ? _c : project(Manifolds.SymmetricPositiveDefinite(s[1]), _c, _c)
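For intuition, a simplified standalone sketch of projectSymPosDef's fallback path, using the nearest-symmetric projection (c + c')/2 as a stand-in for the Manifolds.jl SPD projection (the real code above uses StaticArrays and Manifolds.project):

```julia
using LinearAlgebra

# Simplified stand-in: pass symmetric matrices through untouched,
# otherwise project to the nearest symmetric matrix via (c + c')/2
projsym(c::AbstractMatrix) = issymmetric(c) ? c : (c + c') / 2

A = [1.0 0.2; 0.2 2.0]   # already symmetric
B = [1.0 0.3; 0.1 2.0]   # slightly asymmetric, e.g. from numerical error
projsym(A)  # returned unchanged
projsym(B)  # ≈ [1.0 0.2; 0.2 2.0]
```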
@@ -19,10 +60,6 @@ function MvNormalKernel(μ::AbstractArray, σ::AbstractArray, weight::Real = 1.0
return MvNormalKernel(; μ, p, sqrt_iΣ, weight = float(weight))
end

Statistics.mean(m::MvNormalKernel) = m.μ # mean(m.p) # m.p.μ
Statistics.cov(m::MvNormalKernel) = cov(m.p) # note also about m.sqrt_iΣ
Statistics.std(m::MvNormalKernel) = sqrt(cov(m))

function updateKernelBW(k::MvNormalKernel, _bw, isq_bw = inv(sqrt(_bw)))
p = MvNormal(_bw)
sqrt_iΣ = typeof(k.sqrt_iΣ)(isq_bw)
@@ -58,29 +95,6 @@ function covTransformNormalized(Σ::AbstractMatrix)
return R * S
end

function Base.show(io::IO, mvk::MvNormalKernel)
μ = mean(mvk)
Σ2 = cov(mvk)
# Σ=sqrt(Σ2)
d = size(Σ2, 1)
print(io, "MvNormalKernel(d=", d)
print(io, ",μ=", round.(μ; digits = 3))
print(io, ",Σ^2=[", round(Σ2[1]; digits = 3))
if 1 < d
print(io, "...")
end
# det(T-I) is a proxy through volume meaure of Transform from unit covariance matrix to this instance
# i.e. how large or rotated is this covariance instance
println(
io,
"]); det(T-I)=",
round(det(covTransformNormalized(Σ2) - diagm(ones(d))); digits = 3),
)
# ; det(Σ)=",round(det(Σ);digits=3), "
return nothing
end

Base.show(io::IO, ::MIME"text/plain", mvk::MvNormalKernel) = show(io, mvk)

function distanceMalahanobisCoordinates(
M::AbstractManifold,
21 changes: 15 additions & 6 deletions src/services/ManellicTree.jl
@@ -7,6 +7,7 @@
# end
# end

# number of data points (aka particles) in tree, i.e. N
Base.length(::ManellicTree{M, D, N}) where {M, D, N} = N

getPoints(mt::ManellicTree) = view(mt.data, mt.permute)
@@ -16,7 +17,7 @@ getWeights(mt::ManellicTree) = view(mt.weights, mt.permute)
# _getleft(i::Integer, N) = 2*i + (2*i < N ? 0 : 1)
# _getright(i::Integer, N) = _getleft(i,N) + 1

# either leaf or tree kernel, if larger than N
# either tree or leaf kernel, if larger than N
function leftIndex(mt::ManellicTree, krnIdx::Int = 1)
return 2 * krnIdx + (2 * krnIdx < length(mt) ? 0 : 1)
end
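The index arithmetic can be sketched standalone (mirroring leftIndex above and the commented-out _getright; N is the number of data points, the root is index 1, and leaf kernels sit at or above index N, per exists_BTLabel below):

```julia
# Child indices in the tree's implicit binary layout over N points:
# interior (tree) kernels live below index N, leaf kernels at or above it
left(i::Int, N::Int) = 2i + (2i < N ? 0 : 1)
right(i::Int, N::Int) = left(i, N) + 1

N = 5
left(1, N), right(1, N)  # (2, 3): still interior tree kernels
left(2, N), right(2, N)  # (4, 5): 2*2 = 4 < 5, so no offset yet
left(3, N), right(3, N)  # (7, 8): 2*3 = 6 >= 5, offset into the leaf range
```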
@@ -51,6 +52,9 @@ getKernelLeafAsTreeKer(

Return kernel from tree by binary tree index, and convert leaf kernels to tree kernel types if necessary.

Notes:
- BinaryTree (BT) index goes from root=1 to largest leaf 2*N

See also: [`getKernelLeafAsTreeKer`](@ref)
"""
function getKernelTree(
@@ -92,8 +96,9 @@ function getKernelTree(
end
end


# check for existence in tree or leaves
function exists_BTLabel(mt::ManellicTree{M, D, N}, idx::Int) where {M, D, N}
# check for existence in tree or leaves
eset = if idx < N
mt._workaround_isdef_treekernel
else
@@ -115,8 +120,11 @@ function isLeaf_BTLabel(mt::ManellicTree{M, D, N}, idx::Int) where {M, D, N}
end
end

# check for uniform weights
uniWT(mt::ManellicTree) = 1 === length(union(diff(getWeights(mt))))
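The one-liner works because diff of a constant vector yields only zeros, so union collapses it to a single element; a standalone sketch of the same check:

```julia
# uniform iff successive differences collapse to one unique value (0.0);
# note a length-1 weight vector gives an empty diff and so returns false
uniform_weights(w::AbstractVector) = 1 === length(union(diff(w)))

uniform_weights([0.25, 0.25, 0.25, 0.25])  # true  (diff = [0.0, 0.0, 0.0])
uniform_weights([0.4, 0.3, 0.3])           # false (diff has two unique values)
```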


# check for uniform bandwidths in kernels
function uniBW(mt::ManellicTree{M, D, N}) where {M, D, N}
if 1 < length(mt.leaf_kernels)
bw = cov(mt.leaf_kernels[1])
@@ -219,6 +227,7 @@ function Base.convert(
return src
end

# case for different types requiring conversion
function Base.convert(
::Type{MvNormalKernel{P, T, M, iM}},
src::MvNormalKernel,
@@ -231,7 +240,7 @@
return MvNormalKernel{P, T, M, iM}(μ, p, sqrt_iΣ, src.weight)
end

# covariance
# covariance eigen decomposition and sort ascending
function eigenCoords(f_CVp::AbstractMatrix)
function _decomp(evc::AbstractMatrix, evl::AbstractVector, _toflip::Bool = det(evc) < 0)
pidx = _toflip ? sortperm(evl; rev = true) : 1:length(evl)
@@ -253,10 +262,10 @@ end

Give vector of manifold points and split along largest covariance (i.e. major direction)

DeVNotes:
DevNotes:
- FIXME: upgrade to Manopt version
- https://github.com/JuliaRobotics/ApproxManifoldProducts.jl/issues/277
- FIXME use manifold mean and cov calculation instead
- TODO, instead use Krylov methods (e.g. recursive power series) for next largest eigen vector down depth of tree for efficiency
"""
function splitPointsEigen(
(Member, Author) commented:
Note, I already made some previous effort towards the homotopy concept by using eigen to find the next largest variance down the tree. Just adding the note here that a much more efficient approach would be to extract only the largest eigenvector via Krylov methods, instead of recalculating the whole eigendecomposition each time.

M::AbstractLieGroup,
@@ -437,7 +446,7 @@ Notes:
DevNotes:
- Design Decision 24Q1, Manellic.MvNormalKernel bandwidth defs should ALWAYS ONLY BE covariances, because
- Vision state is multiple bandwidth kernels including off diagonals in both tree or leaf kernels
- Hybrid parametric to leafs convariance continuity
- Hybrid parametric to leafs covariance continuity
- https://github.com/JuliaStats/Distributions.jl/blob/a9b0e3c99c8dda367f69b2dbbdfa4530c810e3d7/src/multivariate/mvnormal.jl#L220-L224
"""
function buildTree_Manellic!(