Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
muellerzr left a comment
Thanks a bunch! From the accelerate side this looks fine; do you want to do big model inference while you're at it? (if not all good).
I assume `accelerate test` etc. went well? 🤗
@muellerzr I forgot to mark it as a draft 😅.
No worries @IlyasMoutawwakil! Just let us know when you're all set to go 🫡
Force-pushed from acc6b01 to 81a37be
All tests that don't require fp16/fp8 are passing on Gaudi1.
One last test that fails with no explanation is
.github/workflows/gaudi1.yml (Outdated)

    pull_request:
      branches:
        - main
This will be removed before merge; only the schedule trigger will stay.
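For reference, a schedule-only trigger would look something like the following sketch (the cron expression and workflow name are illustrative placeholders, not taken from this PR):

```yaml
name: gaudi1-tests
on:
  schedule:
    # hypothetical nightly run at 02:00 UTC; pick whatever cadence fits CI capacity
    - cron: "0 2 * * *"
```

With `pull_request` removed, the workflow no longer runs on every PR and only fires on the schedule.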
    # is_fp8_available only checks for libraries
    # ideally it should check for device capability as well
    fp8_is_available = is_fp8_available()
It seems that `is_fp8_available()` only checks for library availability and not for device capability.
`check_fp8_capability` is used for the latter, but isn't that confusing? `is_fp16/bf16_available` do check the device's capability to use those dtypes.
I didn't change the behavior of `is_fp8_available()` to avoid breaking backward compatibility, but it would make sense to have a single source of truth.
I can see the logic, and I can see why that'd be confusing; I'll change it in a follow-up.
What does this PR do?
This PR introduces upstream support for the HPU torch device/backend.
It is part of a set of three PRs:
Before submitting
- Did you read the contributor guideline, Pull Request section?
- Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
- Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.