
Fix/20260 layers property setter #22359

Open

pctablet505 wants to merge 4 commits into keras-team:master from
pctablet505:fix/20260-layers-property-setter

Conversation

Collaborator

@pctablet505 pctablet505 commented Mar 5, 2026

Fixes #20260

Problem

Keras internally uses tracking attributes (e.g. _layers, _metrics, _trainable_variables) inside the Layer class. If a user subclass explicitly assigns to them (self._layers = [...]), the internal tracker's list is silently wiped. This leads to subtle bugs—such as model weights not saving properly—and crucially, gives no warning or indication of failure to the developer.
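In miniature, the failure mode looks like this (a toy tracker standing in for Keras's internal one; all names here are illustrative):

```python
class Tracker:
    """Toy stand-in for Keras's internal attribute tracker."""
    def __init__(self):
        self.tracked = []              # the list the tracker manages

class Layer:
    def __init__(self):
        self._tracker = Tracker()
        self._layers = self._tracker.tracked   # shared reference

layer = Layer()
layer._tracker.tracked.append("dense_1")       # tracker registers a sublayer

# A subclass doing `self._layers = [...]` rebinds the name to a new
# list; the tracker still holds the old one, so anything placed in the
# new list is never tracked and would be skipped at save time.
layer._layers = ["dense_2"]
print(layer._tracker.tracked)                  # ['dense_1'], not ['dense_2']
```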

Solution

This PR adds a safety mechanism in Layer.__setattr__ that explicitly warns users if they overwrite any Keras tracking attributes.

Key Improvements:

  • Reserved Attributes Tracking: Added _RESERVED_LAYER_ATTRIBUTES and an exempt module list _RESERVED_ATTR_EXEMPT_MODULES to specify valid internal callers (e.g. Sequential or Functional).
  • Dynamic Warning System: When a reserved attribute is overridden, the check inspects the caller's module via inspect.currentframe().f_back. This warns on offending user assignments while letting valid internal callers (such as Sequential.__init__) bypass the check without triggering false-positive warnings.
  • Tests Included: Validates that a BadLayer overriding _layers emits a structured UserWarning, and that Sequential initializes cleanly.
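Condensed into a standalone sketch, the mechanism looks like this (a simplified model, not the actual Keras source; the exempt module names are assumptions):

```python
import inspect
import warnings

# Names mirror the PR; contents here are illustrative.
_RESERVED_LAYER_ATTRIBUTES = frozenset(
    {"_layers", "_metrics", "_trainable_variables"}
)
_RESERVED_ATTR_EXEMPT_MODULES = frozenset(
    {"keras.src.models.sequential", "keras.src.models.functional"}
)

class Layer:
    def __init__(self):
        object.__setattr__(self, "_tracker_initialized", True)

    def __setattr__(self, name, value):
        if name in _RESERVED_LAYER_ATTRIBUTES and getattr(
            self, "_tracker_initialized", False
        ):
            # Inspect the *caller's* module so exempt internal code
            # (e.g. Sequential.__init__) bypasses the warning.
            caller = inspect.currentframe().f_back
            module = caller.f_globals.get("__name__", "")
            if module not in _RESERVED_ATTR_EXEMPT_MODULES:
                warnings.warn(
                    f"You are overriding the reserved Keras attribute "
                    f"'{name}'; its tracked contents will be lost.",
                    UserWarning,
                )
        object.__setattr__(self, name, value)
```

Assigning `layer._layers = []` from user code then emits the UserWarning, while the assignment itself still goes through.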

Testing:

  • Verified by adding test_reserved_attribute_warning to layer_test.py.
  • Checked against saving_lib_test.py to ensure that standard Sequential implementations no longer inadvertently emit warnings during save operations.

…s-team#20260)

Add a warning in Layer.__setattr__ when user code reassigns reserved
tracked attributes (_layers, _metrics, _trainable_variables, etc.)
that are managed by Keras' internal tracking system. Overriding these
silently breaks weight saving/loading because the save mechanism skips
them via get_attr_skipset.

Sequential and Functional models are exempted since they legitimately
manage _layers internally.
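The commit message's point about get_attr_skipset can be made concrete with a toy saver (the skipset contents and the walker below are simplified assumptions, not the real saving code):

```python
# In Keras, get_attr_skipset() yields attribute names the save
# mechanism does not walk, because the tracker exposes their contents
# through its own managed collections instead.
SKIPSET = {"_layers", "_metrics"}

def visible_attrs(obj):
    """Attributes a toy saver would visit, with the skipset excluded."""
    return {k: v for k, v in vars(obj).items() if k not in SKIPSET}

class Model:
    pass

m = Model()
m.kernel = [1.0, 2.0]            # normally tracked, so it gets saved
m._layers = ["user_sublayer"]    # skipped: its contents are never visited

assert "kernel" in visible_attrs(m)
assert "_layers" not in visible_attrs(m)
```

A user who stores sublayers only under a skipped name therefore loses them silently, which is exactly why the warning is useful.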
@gemini-code-assist
Contributor

Summary of Changes

Hello, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers get up to speed quickly!

This pull request enhances the robustness of Keras by preventing subtle data corruption issues that arise when users inadvertently override internal attributes within Layer subclasses. By introducing a warning mechanism and carefully defined exemptions, it guides users toward correct usage patterns without disrupting Keras's internal operations, thereby improving the reliability of model saving, loading, and metric tracking.

Highlights

  • Reserved Attribute Protection: Introduced a safeguard in the Keras Layer class to warn users when they override internal reserved attributes like _layers or _metrics, which could lead to silent failures in weight saving/loading and metric tracking.
  • Exemption Mechanism: Defined _RESERVED_LAYER_ATTRIBUTES to list protected names and _RESERVED_ATTR_EXEMPT_MODULES to allow internal Keras modules (e.g., Sequential, Functional) to legitimately modify these attributes without triggering warnings.
  • Warning Implementation: Modified the __setattr__ method in the Layer class to emit a UserWarning if a user-defined subclass attempts to assign to a reserved attribute after the tracker is initialized, ensuring proper attribute tracking.
  • Comprehensive Testing: Added a new test case, test_reserved_attribute_warning, to verify that the warning is correctly triggered for user-defined layers that override reserved attributes, and that internal Keras classes like Sequential do not produce false positives.


Changelog
  • keras/src/layers/layer.py
    • Added _RESERVED_LAYER_ATTRIBUTES frozenset to list attributes protected from user reassignment.
    • Added _RESERVED_ATTR_EXEMPT_MODULES frozenset to specify internal modules that are allowed to reassign reserved attributes.
    • Implemented warning logic within __setattr__ to detect and warn against user attempts to reassign reserved attributes.
  • keras/src/layers/layer_test.py
    • Added test_reserved_attribute_warning to confirm that user-defined layers overriding reserved attributes trigger a warning.
    • Included a check to ensure that internal Keras classes like Sequential do not trigger the reserved attribute warning.


@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request introduces a valuable safeguard in the Layer class by adding a warning when a user's code reassigns internal reserved attributes like _layers or _metrics. This change effectively prevents a class of subtle bugs related to weight saving and metric tracking, where such reassignments could silently break functionality. The implementation in __setattr__ is well-targeted, correctly identifying reassignments in user code while exempting legitimate internal uses within Keras modules like Sequential and Functional. The accompanying tests are thorough, verifying that the warning is triggered for custom layers and correctly suppressed for exempt internal classes. The changes are clear, correct, and significantly improve the developer experience by providing actionable feedback on a common pitfall.

When deciding whether to suppress the reserved-attribute warning, use the caller's module (via inspect.currentframe().f_back) rather than type(self).__module__. The exemption should depend on which module's code performs the assignment, as listed in _RESERVED_ATTR_EXEMPT_MODULES; keying it off the instance's class would suppress or emit the warning incorrectly. Also remove the stray 'ss_input' token from keras/api/applications/densenet/__init__.py.
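The distinction the review draws can be sketched as follows (module names are illustrative): an instance-based check exempts any object of an exempt class regardless of who performs the assignment, while a caller-based check exempts only assignments executed by exempt code.

```python
import inspect

EXEMPT_MODULES = {"keras.src.models.sequential"}   # illustrative

def is_exempt_by_instance(obj):
    # Exempts based on where the object's *class* is defined, so user
    # code mutating a Sequential instance would wrongly be exempt.
    return type(obj).__module__ in EXEMPT_MODULES

def is_exempt_by_caller():
    # Exempts based on which module's *code* performs the assignment
    # (one frame above this helper), matching the review's suggestion.
    caller = inspect.currentframe().f_back
    return caller.f_globals.get("__name__", "") in EXEMPT_MODULES
```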
@pctablet505 pctablet505 marked this pull request as ready for review March 6, 2026 05:57

codecov-commenter commented Mar 6, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 82.96%. Comparing base (95e74a9) to head (eab08ed).
⚠️ Report is 8 commits behind head on master.

Additional details and impacted files
@@            Coverage Diff             @@
##           master   #22359      +/-   ##
==========================================
+ Coverage   82.95%   82.96%   +0.01%     
==========================================
  Files         595      596       +1     
  Lines       66040    66276     +236     
  Branches    10305    10323      +18     
==========================================
+ Hits        54785    54989     +204     
- Misses       8639     8667      +28     
- Partials     2616     2620       +4     
Flag Coverage Δ
keras 82.79% <100.00%> (+0.01%) ⬆️
keras-jax 60.83% <100.00%> (+<0.01%) ⬆️
keras-numpy 55.04% <100.00%> (+0.01%) ⬆️
keras-openvino 49.06% <100.00%> (-0.05%) ⬇️
keras-tensorflow 62.06% <100.00%> (-0.01%) ⬇️
keras-torch 60.88% <100.00%> (+<0.01%) ⬆️

Flags with carried forward coverage won't be shown.

☔ View full report in Codecov by Sentry.

Collaborator

@hertschuh hertschuh left a comment


This seems very ad hoc and brittle. We should find a better way to fix this.

I think a simpler and cleaner criterion that doesn't involve a hardcoded _RESERVED_LAYER_ATTRIBUTES and a hardcoded _RESERVED_ATTR_EXEMPT_MODULES is to check whether what you're reassigning is one of the lists that the tracker itself uses (in its config). Maybe we can also cache the attribute name in the config to be able to check quickly. And we'll need to pass the attribute name to _tracker.track(), which we'll need anyway to fix #18601
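One way to read that suggestion, as a sketch only (the real Tracker API in keras.src.utils.tracking differs): have the tracker remember which attribute names it manages, so __setattr__ can check membership directly instead of consulting hardcoded frozensets.

```python
import warnings

class Tracker:
    """Toy tracker that caches the attribute names it manages."""
    def __init__(self):
        self.tracked_attrs = {}            # attr name -> managed store

    def track(self, name, store):
        # Passing the attribute name to track(), as the review suggests,
        # lets the tracker key its cache by name.
        self.tracked_attrs[name] = store
        return store

class Layer:
    def __init__(self):
        self._tracker = Tracker()
        self._layers = self._tracker.track("_layers", [])

    def __setattr__(self, name, value):
        tracker = getattr(self, "_tracker", None)
        if (
            tracker is not None
            and name in tracker.tracked_attrs
            and value is not tracker.tracked_attrs[name]
        ):
            # No hardcoded name lists: the tracker itself knows which
            # attributes it manages.
            warnings.warn(
                f"'{name}' is managed by the tracker; reassigning it "
                "detaches it from tracking.",
                UserWarning,
            )
        object.__setattr__(self, name, value)
```

Because the exemption falls out naturally (internal code assigns the very object the tracker registered, so `value is tracker.tracked_attrs[name]` holds), no caller-frame inspection or module allowlist is needed.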



Development

Successfully merging this pull request may close these issues.

Model weights not saved if a custom layer class contains a list of layers named self._layers

4 participants