/Users/jyotindersingh/miniconda3/envs/keras-io-env-3.12/lib/python3.12/site-packages/keras/src/models/model.py:472: UserWarning: Layer InputLayer does not have a `quantize` method implemented.
  warnings.warn(str(e))
@@ -562,8 +568,8 @@ Here are concrete patterns you can reuse when making your own layers PTQ-friendly
 - The axis you packed along (e.g., `_int4_pack_axis`).
 - The original (unpacked) length on that axis (e.g., `_original_input_dim` or
   `_original_length_along_pack_axis`).
-- In `call(...)`, compute with the quantized buffers and de-scale back to float
-  at the end, wherever possible. This allows you to leverage optimized
+- In quantized call hooks, compute with the quantized buffers and de-scale back
+  to float at the end, wherever possible. This allows you to leverage optimized
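The pattern the diff describes — record the quantization scale (and, for packed dtypes, the pack axis and original length), compute with the quantized buffer, and de-scale to float only at the end — can be sketched as follows. This is a minimal NumPy illustration, not the Keras API; the class name `QuantizedDense` and its attributes are hypothetical stand-ins for what a real PTQ-friendly layer would store.

```python
import numpy as np

class QuantizedDense:
    """Hypothetical sketch of the PTQ pattern above (not the Keras API)."""

    def __init__(self, kernel: np.ndarray):
        # Per-output-channel absmax scale so int8 values span [-127, 127].
        self.scale = np.abs(kernel).max(axis=0, keepdims=True) / 127.0
        self.scale[self.scale == 0] = 1.0  # guard against all-zero columns
        self.kernel_q = np.round(kernel / self.scale).astype(np.int8)
        # Metadata a packed (e.g. int4) variant would also record,
        # mirroring `_int4_pack_axis` / `_original_length_along_pack_axis`:
        self._pack_axis = 0
        self._original_length_along_pack_axis = kernel.shape[0]

    def call(self, x: np.ndarray) -> np.ndarray:
        # Compute with the quantized buffer, then de-scale back to float
        # once at the end, as the guideline recommends.
        acc = x.astype(np.float32) @ self.kernel_q.astype(np.float32)
        return acc * self.scale
```

Keeping the matmul in the quantized domain and applying the scale once on the accumulator is what lets backends dispatch to optimized low-precision kernels instead of dequantizing weights up front.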