
Expand Einsum test coverage with parameterized tests#5285

Open
us wants to merge 1 commit into google:main from us:worktree-linear-test

Conversation

@us
Contributor

@us commented Feb 26, 2026

What does this PR do?

Expands nnx.Einsum test coverage in tests/nnx/nn/linear_test.py with a comprehensive TestEinsum class containing 15 test methods (35 parameterized variants). The existing Einsum tests covered only Linen/NNX equivalence and a custom einsum_op; this PR adds functional unit tests for all major Einsum behaviors.
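For orientation, the core operation under test is an einsum contraction with a learnable kernel plus an optional broadcastable bias. The sketch below is illustrative only (it is not code from this PR and the helper name is hypothetical); it reproduces the contraction with plain np.einsum rather than nnx.Einsum:

```python
import numpy as np

# Illustrative sketch: the contraction nnx.Einsum performs, expressed with
# plain np.einsum. The helper name einsum_forward is hypothetical.
def einsum_forward(einsum_str, x, kernel, bias=None):
    """Contract x with kernel per einsum_str, then add a broadcastable bias."""
    y = np.einsum(einsum_str, x, kernel)
    if bias is not None:
        y = y + bias  # bias must broadcast against y's trailing dimensions
    return y

x = np.ones((16, 11, 2))     # (batch, time, features)
kernel = np.ones((8, 2, 4))  # (heads, features, head_dim)
y = einsum_forward('nta,hab->nthb', x, kernel)
print(y.shape)  # (16, 11, 8, 4)
```

Each output element here is a sum over the contracted axis a of length 2, so with all-ones inputs every entry equals 2.0.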

New test coverage

  • Output shape (test_output_shape): 5 diverse einsum patterns — matrix multiply, batched matmul, multi-head attention-like, multi-dim contraction, and bias-free
  • Dtype handling (test_dtypes): 12 combinations of dtype (incl. None), param_dtype, and preferred_element_type with correct output dtype assertions
  • No-bias path (test_no_bias): Parameterized over dtype × param_dtype, verifies analytic expected output
  • Bias broadcast (test_bias_addition_simple, test_bias_addition_multidim, test_bias_broadcast_with_ellipsis): Verifies _infer_broadcasted_bias_shape directly and functionally, including ellipsis equations
  • Input validation (test_einsum_str_missing_arrow_raises, test_einsum_str_wrong_operand_count_raises): assertRaisesRegex with error message matching
  • Space stripping (test_spaces_in_einsum_str): Numeric comparison between spaced and clean einsum strings
  • Precision forwarding (test_precision): Injection-based test verifying precision is passed to einsum_op
  • Gradient flow (test_gradient_flow): Analytic gradient verification for both bias and no-bias cases
  • Call-time override (test_einsum_str_call_override): Proves first_from priority with different constructor vs call-time strings
  • Custom initializers (test_custom_initializers): Verifies init values and forward pass analytic output
  • Ellipsis patterns (test_ellipsis_einsum_str): Shape verification for d...ab,bc...->ad...c

Checklist

  • This PR fixes a minor issue (e.g.: typo or small bug) or improves the docs (you can dismiss the other checks if that's the case).
  • This change is discussed in a GitHub issue/discussion (please add a link).
  • The documentation and docstrings adhere to the documentation guidelines.
  • This change includes necessary high-coverage tests. (No quality testing = no merge!)

@gemini-code-assist
Contributor

Summary of Changes

Hello, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request significantly enhances the test coverage for nnx.Einsum by introducing a new, comprehensive test suite. The added tests validate various aspects of the Einsum layer, including its behavior with different input/output shapes, data types, bias configurations, input validation, precision handling, gradient computation, and custom initializers. This expansion ensures the robustness and correctness of the nnx.Einsum implementation across a wide range of use cases.

Highlights

  • Expanded Output Shape Tests: Added tests for nnx.Einsum output shapes covering five diverse einsum patterns, including matrix multiply, batched matmul, multi-head attention-like, multi-dimensional contraction, and bias-free operations.
  • Comprehensive Dtype Handling Tests: Introduced tests for various combinations of dtype, param_dtype, and preferred_element_type to ensure correct output dtype assertions.
  • No-Bias Path Verification: Implemented parameterized tests for the no-bias path, verifying analytic expected output across different data types.
  • Bias Broadcast Logic Validation: Added tests to verify _infer_broadcasted_bias_shape directly and functionally, including complex ellipsis equations for bias addition and broadcasting.
  • Input Validation Tests: Included tests to ensure nnx.Einsum raises appropriate ValueError exceptions for missing arrows or incorrect operand counts in the einsum string.
  • Einsum String Space Stripping: Verified that spaces within the einsum string are correctly handled and do not affect the numeric output.
  • Precision Forwarding Verification: Implemented an injection-based test to confirm that precision settings are correctly passed to the underlying einsum_op.
  • Gradient Flow Analysis: Added analytic gradient verification tests for both bias and no-bias cases to ensure correct gradient computation.
  • Call-Time Einsum String Override: Confirmed that the einsum_str provided at call-time correctly overrides the one specified during constructor initialization.
  • Custom Initializers Support: Verified that custom kernel and bias initializers are correctly applied and affect the forward pass analytic output.
  • Ellipsis Patterns Shape Verification: Included tests to verify correct shape inference for einsum strings utilizing ellipsis patterns.
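The bias-broadcast and ellipsis behavior highlighted above can be illustrated without Flax. This sketch is a simplified stand-in: the real _infer_broadcasted_bias_shape lives in Flax and handles the general label mapping, while the hypothetical helper below assumes the bias axes form a suffix of the output axes:

```python
import numpy as np

# Illustrative only: pad a bias shape with leading 1s so it broadcasts
# against the einsum output. Assumes the bias labels are a suffix of the
# output labels (a simplification of Flax's actual inference).
def broadcasted_bias_shape(out_shape, bias_shape):
    return (1,) * (len(out_shape) - len(bias_shape)) + tuple(bias_shape)

# Ellipsis equation: leading batch dims pass through untouched.
out = np.einsum('...ab,bc->...ac', np.ones((7, 5, 3)), np.ones((3, 2)))
bias = np.ones((5, 2))  # matches the trailing 'a' and 'c' output axes
y = out + bias.reshape(broadcasted_bias_shape(out.shape, bias.shape))
print(y.shape)  # (7, 5, 2)
```

With all-ones inputs the contraction over b (length 3) yields 3.0 everywhere, and the broadcast bias adds 1.0, giving 4.0 per element.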


Changelog
  • tests/nnx/nn/linear_test.py
    • Added a new TestEinsum class to expand test coverage for nnx.Einsum.
    • Implemented numerous parameterized test methods within TestEinsum to cover output shape, dtype handling, bias behavior, input validation, precision, gradient flow, and custom initializers.


@gemini-code-assist bot left a comment


Code Review

This pull request significantly expands the test coverage for nnx.Einsum with a comprehensive new test suite. The tests are thorough, covering output shapes, dtypes, bias handling, input validation, precision, gradients, and more. The implementation is of high quality. I have a couple of suggestions to improve the test code's readability and maintainability.

