Fix/20260 layers property setter#22359
pctablet505 wants to merge 4 commits into keras-team:master
Conversation
…s-team#20260) Add a warning in Layer.__setattr__ when user code reassigns reserved tracked attributes (_layers, _metrics, _trainable_variables, etc.) that are managed by Keras' internal tracking system. Overriding these silently breaks weight saving/loading because the save mechanism skips them via get_attr_skipset. Sequential and Functional models are exempted since they legitimately manage _layers internally.
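For context, the silent breakage described above can be reproduced without Keras. `Tracker`, `Layer`, and `BadLayer` below are toy stand-ins for illustration, not the real Keras classes:

```python
class Tracker:
    """Toy stand-in for Keras's attribute tracker: it keeps a live
    reference to the list it populates."""

    def __init__(self):
        self.tracked = []  # the list the framework relies on


class Layer:
    def __init__(self):
        self._tracker = Tracker()
        self._layers = self._tracker.tracked  # shared reference

    def add(self, sublayer):
        # The tracker appends to ITS list; normally _layers sees this too.
        self._tracker.tracked.append(sublayer)


class BadLayer(Layer):
    def __init__(self):
        super().__init__()
        self._layers = []  # reassignment breaks the shared reference


good, bad = Layer(), BadLayer()
good.add("dense")
bad.add("dense")
print(good._layers)  # ['dense'] -- still the tracker's list
print(bad._layers)   # []        -- silently out of sync with the tracker
```

In the real framework the analogous desynchronization is what makes weight saving skip the reassigned attribute, with no error raised.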
Summary of Changes: This pull request enhances the robustness of Keras by preventing subtle data corruption issues that arise when users inadvertently override internal attributes of a layer.
Activity
Code Review
This pull request introduces a valuable safeguard in the Layer class by adding a warning when a user's code reassigns internal reserved attributes like _layers or _metrics. This change effectively prevents a class of subtle bugs related to weight saving and metric tracking, where such reassignments could silently break functionality. The implementation in __setattr__ is well-targeted, correctly identifying reassignments in user code while exempting legitimate internal uses within Keras modules like Sequential and Functional. The accompanying tests are thorough, verifying that the warning is triggered for custom layers and correctly suppressed for exempt internal classes. The changes are clear, correct, and significantly improve the developer experience by providing actionable feedback on a common pitfall.
When deciding whether to suppress the reserved-attribute warning, use the caller's module (inspect.currentframe().f_back) instead of type(self).__module__. This ensures the warning is properly exempted based on the caller module listed in _RESERVED_ATTR_EXEMPT_MODULES, preventing incorrect suppression/emission. Also remove a stray 'ss_input' token from keras/api/applications/densenet/__init__.py.
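The caller-based check being suggested can be sketched as follows; `caller_module`, the frame depth, and the `last_caller` bookkeeping are assumptions for illustration, not Keras code:

```python
import inspect


def caller_module(depth=2):
    """Return __name__ of the module `depth` frames above this call.

    depth=2 skips this helper and its immediate caller (e.g. __setattr__),
    landing on the code that actually performed the attribute assignment.
    """
    frame = inspect.currentframe()
    for _ in range(depth):
        if frame is None:
            return ""
        frame = frame.f_back
    return frame.f_globals.get("__name__", "") if frame else ""


class Layer:
    last_caller = None  # recorded for demonstration only

    def __setattr__(self, name, value):
        # Record where the assignment came from, instead of relying on
        # type(self).__module__ (which only says where the CLASS was defined,
        # so a user subclass of Sequential would be misclassified).
        Layer.last_caller = caller_module()
        object.__setattr__(self, name, value)


layer = Layer()
layer._layers = []
print(Layer.last_caller)  # the assigning module, e.g. '__main__' in a script
```

The key distinction is that the frame walk attributes the assignment to the module performing it, which is what `_RESERVED_ATTR_EXEMPT_MODULES` is meant to be matched against.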
Codecov Report
✅ All modified and coverable lines are covered by tests.

@@ Coverage Diff @@
## master #22359 +/- ##
==========================================
+ Coverage 82.95% 82.96% +0.01%
==========================================
Files 595 596 +1
Lines 66040 66276 +236
Branches 10305 10323 +18
==========================================
+ Hits 54785 54989 +204
- Misses 8639 8667 +28
- Partials 2616 2620 +4
hertschuh left a comment
This seems very ad-hoc and brittle. We should find a better way to fix this.
I think a simpler and cleaner criterion that doesn't involve a hardcoded _RESERVED_LAYER_ATTRIBUTES or a hardcoded _RESERVED_ATTR_EXEMPT_MODULES is to check whether what you're reassigning is one of the lists that the tracker itself uses (in its config). Maybe we can also cache the attribute name in the config to be able to check quickly. And we'll need to pass the attribute name to _tracker.track(), which we'll need anyway to fix #18601
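A rough sketch of this proposal, with a toy `Tracker` whose config caches the attribute names it manages; all names here are hypothetical, not the actual Keras tracker API:

```python
import warnings


class Tracker:
    """Toy tracker whose config maps attribute name -> the list it manages."""

    def __init__(self):
        self.config = {}

    def track(self, owner, attr_name):
        # As suggested in the review: track() receives the attribute name
        # and caches it in the config next to the list it manages.
        managed = []
        self.config[attr_name] = managed
        object.__setattr__(owner, attr_name, managed)
        return managed


class Layer:
    def __init__(self):
        object.__setattr__(self, "_tracker", Tracker())
        self._tracker.track(self, "_layers")

    def __setattr__(self, name, value):
        # No hardcoded attribute or module lists: warn only when an attribute
        # the tracker manages is being replaced by a different object.
        tracker = getattr(self, "_tracker", None)
        if (
            tracker is not None
            and name in tracker.config
            and value is not tracker.config[name]
        ):
            warnings.warn(
                f"Reassigning tracker-managed attribute {name!r}", UserWarning
            )
        object.__setattr__(self, name, value)


layer = Layer()
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    layer._layers = []  # replaces the tracked list -> warning
assert "_layers" in str(caught[0].message)
```

Because the check is an identity comparison against the tracker's own config, exempting `Sequential` and `Functional` by module name becomes unnecessary: internal code that re-runs tracking installs the tracker's list itself and never trips the warning.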
Fixes #20260
Problem
Keras internally uses tracking attributes (e.g. `_layers`, `_metrics`, `_trainable_variables`) inside the `Layer` class. If a user subclass explicitly assigns to them (`self._layers = [...]`), the internal tracker's list is silently wiped. This leads to subtle bugs, such as model weights not saving properly, and, crucially, gives no warning or indication of failure to the developer.

Solution
This PR adds a safety mechanism in `Layer.__setattr__` that explicitly warns users if they overwrite any Keras tracking attributes.

Key Improvements:
- Added a reserved-attribute set `_RESERVED_LAYER_ATTRIBUTES` and an exempt module list `_RESERVED_ATTR_EXEMPT_MODULES` to specify valid internal callers (e.g. `Sequential` or `Functional`).
- Caller identification uses `inspect.currentframe().f_back`. This correctly warns on offending user assignments while allowing valid inherited/internal configurations (like `Sequential.__init__`) to bypass the check without triggering false-positive warnings.
- `BadLayer` overriding `_layers` emits a structured `UserWarning`, and `Sequential` initializes cleanly.

Testing:
- Added `test_reserved_attribute_warning` to `layer_test.py`.
- Updated `saving_lib_test.py` to ensure that standard implementations of `Sequential` no longer inadvertently emit warnings during saving.
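Based on the description above, the mechanism might look roughly like this; the attribute names, exempt module prefixes, and warning text are illustrative stand-ins, not the PR's exact code:

```python
import inspect
import warnings

# Illustrative stand-ins for the PR's constants.
_RESERVED_LAYER_ATTRIBUTES = frozenset(
    {"_layers", "_metrics", "_trainable_variables", "_non_trainable_variables"}
)
_RESERVED_ATTR_EXEMPT_MODULES = (
    "keras.src.models.sequential",
    "keras.src.models.functional",
)


class Layer:
    def __setattr__(self, name, value):
        # Warn only on REassignment: the first set during __init__ is legal.
        if name in _RESERVED_LAYER_ATTRIBUTES and hasattr(self, name):
            caller = inspect.currentframe().f_back
            module = caller.f_globals.get("__name__", "") if caller else ""
            # Exempt Keras-internal callers by module prefix.
            if not module.startswith(_RESERVED_ATTR_EXEMPT_MODULES):
                warnings.warn(
                    f"Do not assign to reserved attribute {name!r}; it is "
                    "managed by Keras's tracking system and reassigning it "
                    "can break weight saving/loading.",
                    UserWarning,
                    stacklevel=2,
                )
        object.__setattr__(self, name, value)


class BadLayer(Layer):
    def __init__(self):
        self._layers = []        # first assignment: attribute doesn't exist yet
        self._layers = ["oops"]  # reassignment: triggers the warning


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    BadLayer()
assert any("_layers" in str(w.message) for w in caught)
```

A test along the lines of `test_reserved_attribute_warning` can then assert the warning with `warnings.catch_warnings(record=True)` (or `pytest.warns`) for a misbehaving subclass, and assert its absence for exempt internal classes.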