Mixed Precision
#3138
As I understand it, there isn't ready-made mixed precision infrastructure in Flax like there is in PyTorch? I found the usage of `dynamic_scale` in the ImageNet example, but I couldn't find any documentation for it. Does it exist? And what is the best way to train a Flax model with mixed precision (float32 and bfloat16) on GPU?
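For context, here is a minimal sketch of the pattern the ImageNet example follows, not an official API reference: compute runs in bfloat16 via the layers' `dtype` argument while parameters stay in float32, and a `DynamicScale` instance carried in the train state handles loss scaling. The model, step function, and hyperparameters below are invented for illustration, and the `flax.training.dynamic_scale` import path assumes a reasonably recent Flax release (older versions exposed `DynamicScale` under `flax.optim`).

```python
# Sketch: bfloat16 compute with float32 parameters, plus DynamicScale loss
# scaling in the style of the Flax ImageNet example. Model, step function and
# hyperparameters are made up for illustration.
import jax
import jax.numpy as jnp
import optax
from flax import linen as nn
from flax.training import train_state
from flax.training.dynamic_scale import DynamicScale


class MLP(nn.Module):
    @nn.compact
    def __call__(self, x):
        # `dtype` sets the computation precision; `param_dtype` (float32 by
        # default) sets how the weights themselves are stored.
        x = nn.relu(nn.Dense(128, dtype=jnp.bfloat16)(x))
        return nn.Dense(10, dtype=jnp.bfloat16)(x)


class TrainState(train_state.TrainState):
    dynamic_scale: DynamicScale  # carried in the train state, as in the example


def loss_fn(params, apply_fn, batch):
    logits = apply_fn({'params': params}, batch['x'])
    # Reduce the loss in float32 to limit bfloat16 accumulation error.
    logits = logits.astype(jnp.float32)
    return optax.softmax_cross_entropy_with_integer_labels(
        logits, batch['y']).mean()


@jax.jit
def train_step(state, batch):
    # value_and_grad scales the loss, unscales the gradients (cast to float32)
    # and reports whether they were all finite.
    grad_fn = state.dynamic_scale.value_and_grad(loss_fn)
    dynamic_scale, is_fin, loss, grads = grad_fn(state.params, state.apply_fn, batch)
    new_state = state.apply_gradients(grads=grads)
    # Keep the update only if the gradients were finite; otherwise fall back to
    # the old params/opt_state and let DynamicScale back off its scale.
    keep = lambda new, old: jnp.where(is_fin, new, old)
    new_state = new_state.replace(
        params=jax.tree_util.tree_map(keep, new_state.params, state.params),
        opt_state=jax.tree_util.tree_map(keep, new_state.opt_state, state.opt_state),
        dynamic_scale=dynamic_scale)
    return new_state, loss


model = MLP()
x = jnp.ones((8, 32))
params = model.init(jax.random.PRNGKey(0), x)['params']
state = TrainState.create(
    apply_fn=model.apply, params=params,
    tx=optax.adam(1e-3), dynamic_scale=DynamicScale())
state, loss = train_step(state, {'x': x, 'y': jnp.zeros((8,), jnp.int32)})
```

Note that bfloat16 shares float32's exponent range, so the loss-scaling part is mostly a safeguard here; it matters more for float16. For bfloat16 compute, setting `dtype=jnp.bfloat16` on the modules while leaving `param_dtype` at its float32 default is often all that is needed.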
Replies: 1 comment · 1 reply

Hi @kimihailv,