Conversation

@prashantpandeygit

Fixes an issue where loading safetensors files with float8_e5m2 (and other float8) dtypes fails with AttributeError: module 'numpy' has no attribute 'float8_e5m2' when using PaddlePaddle < 3.2.0.

The issue occurs because when paddle.__version__ < 3.2.0, load_file() falls back to numpy.load_file(), which calls safe_open with framework='np'. The Rust code then tries to access numpy.float8_e5m2, which fails because NumPy has no native float8 types.
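A minimal repro sketch of the failing path (the file path and tensor name are hypothetical):

```python
from safetensors import safe_open

# Hypothetical file containing a tensor stored as float8_e5m2.
with safe_open("model.safetensors", framework="np") as f:
    # Before this fix, the next line raised:
    #   AttributeError: module 'numpy' has no attribute 'float8_e5m2'
    tensor = f.get_tensor("weight")
```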

Fix: Map float8 types (F8_E4M3, F8_E5M2, F8_E8M0, F4) to np.uint8 when is_numpy=True, mirroring how bfloat16 is already handled differently for numpy than for other frameworks. This matches the existing pattern in paddle.py, where float8 types are mapped to np.uint8 (same width, so byteswap is a no-op).
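As an illustration of the resulting dtype fallback (a sketch, not the exact lib.rs code; the bfloat16 container width is an assumption based on the precedent the description mentions):

```python
import numpy as np

# Sketch: when framework == "np", dtypes NumPy cannot represent are
# exposed as same-width unsigned integer containers instead.
NUMPY_DTYPE_FALLBACK = {
    "F8_E4M3": np.uint8,
    "F8_E5M2": np.uint8,
    "F8_E8M0": np.uint8,
    "F4": np.uint8,
    "BF16": np.uint16,  # assumed: the pre-existing bfloat16 precedent this fix follows
}
```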

Changes

  • Updated get_pydtype() in lib.rs to check is_numpy for float8 types
  • Added test case test_float8_e5m2_loading() to verify the fix (sketched after this list)
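A sketch of what such a test could look like; the real test body may differ, and the file is hand-crafted here because NumPy itself cannot save float8 dtypes:

```python
import json
import struct

import numpy as np
from safetensors import safe_open


def test_float8_e5m2_loading(tmp_path):
    # Hand-craft a minimal safetensors file holding one F8_E5M2 tensor:
    # an 8-byte little-endian header length, a JSON header, then raw data.
    header = {"w": {"dtype": "F8_E5M2", "shape": [4], "data_offsets": [0, 4]}}
    header_bytes = json.dumps(header).encode("utf-8")
    path = tmp_path / "f8.safetensors"
    with open(path, "wb") as f:
        f.write(struct.pack("<Q", len(header_bytes)))
        f.write(header_bytes)
        f.write(bytes([0, 1, 2, 3]))  # four raw float8_e5m2 bytes

    with safe_open(str(path), framework="np") as f:
        tensor = f.get_tensor("w")

    # With the fix, the float8 payload surfaces as uint8 (same width).
    assert tensor.dtype == np.uint8
    assert tensor.tolist() == [0, 1, 2, 3]
```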

Fixes #682



Development

Successfully merging this pull request may close these issues.

[paddlepaddle] module 'numpy' has no attribute 'float8_e5m2'
