ENH: Add GED transformer #13259

Open · wants to merge 74 commits into base: main
Conversation

@Genuster (Contributor) commented May 22, 2025

What does this implement/fix?

Adds a transformer for generalized eigenvalue decomposition (or approximate joint diagonalization) of covariance matrices.
It generalizes the Xdawn, CSP, SSD, and SPoC algorithms.
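For readers unfamiliar with it, the core GED operation can be sketched with SciPy. This is an illustrative toy, not MNE's implementation; all variable names here are made up:

```python
# Minimal sketch of generalized eigenvalue decomposition (GED) of two
# covariance matrices -- the operation that Xdawn/CSP/SSD/SPoC share.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)

def make_cov(n_channels, n_samples, rng):
    """Build a symmetric positive-definite covariance from random data."""
    data = rng.standard_normal((n_channels, n_samples))
    return data @ data.T / n_samples

n_channels = 8
S = make_cov(n_channels, 500, rng)  # "signal" covariance (e.g. one CSP class)
R = make_cov(n_channels, 500, rng)  # "reference" covariance (e.g. the other class)

# Solve S w = lambda R w; eigh returns eigenvalues in ascending order,
# so flip to put the most "signal-like" components first.
evals, evecs = eigh(S, R)
order = np.argsort(evals)[::-1]
evals, filters = evals[order], evecs[:, order].T  # filters: (n_comp, n_channels)
patterns = np.linalg.pinv(filters)  # forward-model patterns, as in CSP/SSD

# eigh normalizes eigenvectors so that the filters are R-orthonormal:
assert np.allclose(filters @ R @ filters.T, np.eye(n_channels), atol=1e-10)
```

Each child algorithm then differs mainly in how it builds `S` and `R` and how it ranks components.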

Additional information

Steps:

  • verify that it outputs filters and patterns identical to those of the child classes across all tests, by adding temporary assert_allclose calls to the code
  • add test coverage for _GEDTransformer and the core functions
  • add _validate_params to _XdawnTransformer
  • add feature to perform GED in the principal subspace for Xdawn and SPoC
  • add option for CSP and SSD to select restr_type and provide info for CSP
  • add entry in mne's implementation details
  • move SSD and Xdawn pattern computation from np.linalg.pinv to mne's pinv
  • change SSD's multiplication order for dimension reduction for consistency
  • fix SSD's filters_ shape inconsistency
  • move mne.preprocessing._XdawnTransformer to decoding and make it public
  • rename _XdawnTransformer method_params to cov_method_params for consistency
  • SSD performs spectral sorting of components each time .transform() is applied. This could be optimized by sorting filters_, evals_, and patterns_ already in fit, which would also suit the current GED design better
  • in SSD's .transform() with return_filtered=True, subsetting with self.picks_ is applied twice, which looks like a bug
  • remove assert_allclose calls in code
  • clean up newly redundant code from the child classes
  • make SSD use mne.cov.compute_whitener() instead of its own whitener implementation. The two are not algebraically identical, but conceptually they seem to do the same thing

Then it should be ready for merge!
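The sort-once-in-fit item from the checklist can be sketched abstractly: compute a per-component score once, reorder the fitted attributes at fit time, and leave transform() as a plain projection. Everything below is illustrative (the class, its method signature, and the toy row conventions are mine; only the `filters_`/`patterns_`/`evals_` attribute names echo MNE style):

```python
import numpy as np

class SortOnFitSketch:
    """Toy illustrating sort-once-in-fit (not MNE's SSD/GED code)."""

    def fit(self, filters, patterns, evals, scores):
        # Sort everything once, descending by the spectral score,
        # so later calls never need to re-sort.
        order = np.argsort(scores)[::-1]
        self.filters_ = filters[order]    # (n_components, n_channels)
        self.patterns_ = patterns[order]  # same row convention, for this toy
        self.evals_ = evals[order]
        return self

    def transform(self, X):
        # Components come out already sorted; no per-call sorting needed.
        return self.filters_ @ X
```

The payoff is that repeated `.transform()` calls become a single matrix product instead of redoing the spectral sorting each time.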


@larsoner (Member):

We already have a failure, but fortunately I think it's just a tolerance issue:

mne/decoding/tests/test_csp.py:444: in test_spoc
    spoc.fit(X, y)
mne/decoding/csp.py:985: in fit
    np.testing.assert_allclose(old_filters, self.filters_)
E   AssertionError: 
E   Not equal to tolerance rtol=1e-07, atol=0
E   
E   Mismatched elements: 1 / 100 (1%)
E   Max absolute difference among violations: 9.04019248e-09
E   Max relative difference among violations: 1.11806536e-07
E    ACTUAL: array([[  2.037415,   1.424886,   2.718162,  -3.07798 ,  -3.862132,
E             1.412549,  -3.821452,   1.276637,   1.899782,  -2.389858],
E          [ 11.534231, -22.178034, -12.321628, -52.410096,  62.876084,...
E    DESIRED: array([[  2.037415,   1.424886,   2.718162,  -3.07798 ,  -3.862132,
E             1.412549,  -3.821452,   1.276637,   1.899782,  -2.389858],
E          [ 11.534231, -22.178034, -12.321628, -52.410096,  62.876084,...

I would just bump the rtol a bit here to 1e-6, and if you know the magnitudes are in the single/double digits then an atol=1e-7 would also be reasonable (could do both).
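The suggested bump can be sketched like this (the numbers are illustrative, mimicking the ~1.1e-7 relative mismatch in the log above):

```python
import numpy as np

old_filters = np.array([2.037415, 11.534231, -22.178034])
new_filters = old_filters * (1 + 1.1e-7)  # emulate the observed FP drift

# Fails at the default rtol=1e-7, atol=0 ...
try:
    np.testing.assert_allclose(old_filters, new_filters)
except AssertionError:
    pass  # expected: the 1.1e-7 relative difference exceeds rtol=1e-7

# ... but passes with the relaxed tolerances suggested above; atol=1e-7
# additionally covers any entries whose magnitude is near zero.
np.testing.assert_allclose(old_filters, new_filters, rtol=1e-6, atol=1e-7)
```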

@Genuster (Contributor, Author) commented May 22, 2025

Thanks!
Interesting how it passed macos-13/mamba/3.12, but didn't pass macos-latest/mamba/3.12

It might be that the small difference in filters_ will propagate and grow in patterns_, so rtol/atol won't be of much help for patterns_. But let's see.
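This worry can be illustrated with a toy pseudoinverse: when the filter matrix is ill-conditioned, a perturbation that is tiny relative to the matrix can produce a much larger relative change in its (pseudo)inverse, roughly scaled by the condition number. The numbers here are mine, not from this PR:

```python
import numpy as np

# Condition number kappa = 1e3; dA is ~1e-7 relative to A.
A = np.diag([1.0, 1e-3])
dA = np.diag([0.0, 1e-7])  # hits the small-singular-value direction

Ainv = np.linalg.pinv(A)
Ainv_pert = np.linalg.pinv(A + dA)

rel_in = np.linalg.norm(dA) / np.linalg.norm(A)
rel_out = np.linalg.norm(Ainv_pert - Ainv) / np.linalg.norm(Ainv)
print(f"relative input error:  {rel_in:.1e}")   # ~1e-7
print(f"relative output error: {rel_out:.1e}")  # ~1e-4, amplified by ~kappa
```

So if `patterns_` comes from a pseudoinverse of `filters_`, a filters-level mismatch near rtol can plausibly exceed it at the patterns level.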

@larsoner (Member):

Different architectures, macos-13 is Intel x86_64 and macos-latest is ARM / M1. And Windows also failed, could be use of MKL there or something. I'm cautiously optimistic it's just floating point errors...

@Genuster (Contributor, Author) commented Jun 3, 2025

@larsoner, I think I've covered the core GEDTransformer cases with tests. Could you check whether it's enough, so I can move on to the next step?

@larsoner (Member) left a comment:

Great that the assert statements are passing! Just a few comments below. Also, can you see if you can get closer to 100% coverage here?

https://app.codecov.io/gh/mne-tools/mne-python/pull/13259

@larsoner (Member) left a comment:

Just a couple more comments.

FYI I modified your top comment to have checkboxes (you can see how it's done if you go to edit it yourself) and a rough plan. Can you see if the plan is what you have in mind and update accordingly if needed? Then I can see where you (think you) are after your next push, and when you ask "okay to move on" I'll know what you want to do next 😄

@Genuster (Contributor, Author) commented Jun 4, 2025

Thanks Eric!

> Great that the assert statements are passing! Just a few comments below. Also, can you see if you can get closer to 100% coverage here?
> https://app.codecov.io/gh/mne-tools/mne-python/pull/13259

That's a cool tool, I like it! Will do.

> FYI I modified your top comment to have checkboxes (you can see how it's done if you go to edit it yourself) and a rough plan. Can you see if the plan is what you have in mind and update accordingly if needed? Then I can see where you (think you) are after your next push, and when you ask "okay to move on" I'll know what you want to do next 😄

Alright :)

@Genuster (Contributor, Author) commented Jul 4, 2025

@larsoner, I think everything I planned (and some things I didn't plan) to do for this part is done.

About making the _XdawnTransformer class public: I moved it to decoding and removed the leading underscore. But if I understood you correctly last time, there are additional steps to making a class public in mne?

@drammock (Member) commented Jul 4, 2025

> About making _XdawnTransformer class public: I moved it to decoding and removed the _. But if I understood you correctly the last time, there are additional steps in making a class public in mne?

It will need to be added to mne.decoding.__init__.pyi (to be importable under mne.decoding.XdawnTransformer), and also to doc/api/decoding.rst (to be cross-referencable in our documentation)

@Genuster (Contributor, Author):

@larsoner, @drammock, I can't make sense of the failures in the inverse_sparse tests. Do you think they're somehow related to my changes? Other than that, I think the PR is ready!

@larsoner larsoner marked this pull request as ready for review July 11, 2025 20:05
@larsoner (Member):

Pushed #13315 (which was green) and merged the changes into this branch. If it doesn't come back green, that suggests there is something odd about this branch, but I'd be surprised. I should be able to look Monday!

@larsoner (Member):

... I also clicked the "Ready for Review" button and changed the title

@larsoner larsoner changed the title WIP: Add GED transformer ENH: Add GED transformer Jul 11, 2025
@larsoner larsoner added this to the 1.11 milestone Jul 11, 2025
@Genuster (Contributor, Author):

@larsoner, in the last commits I checked whether this MxNE-related failure is the only problem by skipping it; the tests came back green.

@larsoner (Member):

@Genuster, can you replicate this locally? I can't. If someone can, they could git bisect or revert bits of the changes to figure out what's causing the breakage. I'm not sure what it could be!
