
switch from platform markers for torch to uv extras for more flexible install #440

Open
rbavery wants to merge 6 commits into allenai:main from rbavery:ryan/replace-platform-req-wth-extras

Conversation

@rbavery
Contributor

@rbavery rbavery commented Nov 10, 2025

This PR implements uv extras for separating CPU from CUDA dependencies instead of platform restrictions. It allows installing CPU-only dependencies on Linux, which is useful in CI and for depending on olmo_earth in projects that only install CPU dependencies.

I've tested how to use olmoearth_pretrain as a dependency in another project.
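The extras-plus-sources pattern this PR adopts can be sketched in pyproject.toml roughly like this. The package lists, version pins, and index URLs below are illustrative assumptions, not the PR's exact diff:

```toml
# Hedged sketch of the uv extras layout; details are assumptions,
# not copied from the actual PR diff.
[project.optional-dependencies]
torch-cpu = ["torch", "torchvision"]
torch-cu128 = ["torch", "torchvision"]

[tool.uv]
# Guard against resolving both extras at once.
conflicts = [
    [{ extra = "torch-cpu" }, { extra = "torch-cu128" }],
]

[tool.uv.sources]
torch = [
    { index = "pytorch-cpu", extra = "torch-cpu" },
    { index = "pytorch-cu128", extra = "torch-cu128" },
]
torchvision = [
    { index = "pytorch-cpu", extra = "torch-cpu" },
    { index = "pytorch-cu128", extra = "torch-cu128" },
]

[[tool.uv.index]]
name = "pytorch-cpu"
url = "https://download.pytorch.org/whl/cpu"
explicit = true

[[tool.uv.index]]
name = "pytorch-cu128"
url = "https://download.pytorch.org/whl/cu128"
explicit = true
```

With a layout like this, `uv sync --extra torch-cpu` resolves torch from the CPU wheel index on any platform, while `uv sync --extra torch-cu128` pulls the CUDA 12.8 wheels; the `conflicts` entry stops the two extras from being enabled together.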


Note

Switches dependency management to uv extras for selecting CPU vs CUDA torch/torchvision, updates README install instructions, and configures uv sources/conflicts with minor dependency tweaks.

  • Installation:
    • Update README to show uv sync with --extra torch-cu128 or --extra torch-cpu.
  • Dependency/packaging:
    • Introduce uv extras torch-cpu and torch-cu128 with corresponding torch, torchvision, pytorch-triton, and flash-attn wiring.
    • Add [tool.uv] config: extras conflict guard and custom indexes/sources for torch, torchvision, and pytorch-triton.
    • Pin markupsafe<=3.0.0 and adjust metadata accordingly.
    • Move flash-attn handling to tool.uv.extra-build-*.
    • No code changes; lockfile updated to reflect new extras and indexes.

Written by Cursor Bugbot for commit 092ae7b.

…o torch-cu128 extra, torch upper bound to 2.10
…stall issue by requiring wheels for linux x86_64
@rbavery rbavery marked this pull request as draft November 10, 2025 23:45
@rbavery rbavery marked this pull request as ready for review November 13, 2025 19:55
@Hgherzog
Collaborator

Hi Ryan, I think making the torch installation explicit rather than platform-dependent is very sensible. I will need to test it on our internal cluster as well before we can merge this. We should hold off on the torch version bump for now and just address the platform dependencies in this PR, as we would want to run more extensive tests on our internal hardware before doing that. Are you looking to use this package to load the model or to build off of the pre-training code?

…very/olmoearth_pretrain into ryan/replace-platform-req-wth-extras
@rbavery
Contributor Author

rbavery commented Dec 1, 2025

@Hgherzog Got it, I removed the torch version bump.

We're exporting the model as PT2 archives (the successor format to TorchScript). The version bump would help here, since newer torch releases have been improving the PT2 archive format and export utilities, but I can use my fork for that work instead.

