PyTorch XLA/PJRT TPU support for bitsandbytes

This would allow for faster and more memory-efficient training of models on TPUs. Happy to provide TPUs for development and testing.
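For concreteness, here is a minimal sketch of what the requested integration could enable. This is hypothetical: bitsandbytes currently targets CUDA only, so `bnb.optim.Adam8bit` will not actually run on an XLA device today; the `torch_xla` calls shown are the standard PJRT workflow.

```python
# Hypothetical usage sketch: training with a bitsandbytes 8-bit optimizer on a
# TPU through PyTorch/XLA (PJRT runtime, e.g. PJRT_DEVICE=TPU).
# The torch_xla parts are real APIs; the bitsandbytes optimizer would need
# XLA-compatible kernels (or a pure-PyTorch fallback) for this to work on TPU.
import torch
import torch_xla.core.xla_model as xm
import bitsandbytes as bnb

device = xm.xla_device()                       # TPU core via PJRT
model = torch.nn.Linear(1024, 1024).to(device)

# 8-bit Adam keeps optimizer state in int8, cutting its memory footprint
# roughly 4x; this is the piece that currently requires CUDA.
optimizer = bnb.optim.Adam8bit(model.parameters(), lr=1e-4)

x = torch.randn(8, 1024, device=device)
loss = model(x).pow(2).mean()
loss.backward()
optimizer.step()
optimizer.zero_grad()
xm.mark_step()                                 # flush the lazily built XLA graph
```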