[HuggingFace][Neuronx] Training - DLC for Optimum-neuron 0.3.0 - Neuron SDK 2.24.1 PyTorch 2.7.1 - Transformers 4.51.3#5292
Conversation
Optimum Neuron: v0.3.0
Neuron SDK: v2.24.0
PyTorch: v2.7.0
    "tensorboard>=2.11.0" \
    "numpy>=1.24.3,<=1.25.2" \
    "numba==0.58.1" \
    "Pillow==10.3.0" \
Pillow is getting installed earlier too; please deduplicate.
# ------------------------------------------------------------
Why do we need to separate this? We can move all ARGs to the top and combine the pip install commands.
The reason is that the image for the SDK was not published at the time I submitted the PR; see the comments here: https://github.com/tengomucho/deep-learning-containers/blob/f25c8ecb5ade64b78985628e262b1d012be6283d/huggingface/pytorch/training/docker/2.7/py3/sdk2.24.1/Dockerfile.neuronx#L1.
Let me use that image now that it is available; it will make the Dockerfile leaner.
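As a sketch, building on top of the published SDK base image would replace the hand-rolled SDK setup with a single FROM line. The repository/tag below is an assumption for illustration only, not the final image URI:

```dockerfile
# Hypothetical base-image reference -- the real URI and tag must come from
# the published Neuron SDK 2.24.1 DLC image, not from this sketch.
FROM 763104351884.dkr.ecr.us-west-2.amazonaws.com/pytorch-training-neuronx:2.7.1-neuronx-py310-sdk2.24.1-ubuntu22.04
```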
# Pin numpy to version required by neuronx-cc
# Update Pillow, urllib, wandb versions to fix high and critical vulnerabilities
RUN pip install -U \
    "tensorboard>=2.11.0" \
tensorboard is also duplicated; please combine all of these installs.
# To fix that, we are downgrading networkx to 2.6.3
RUN pip install -U "networkx==2.6.3"
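Following the reviewers' suggestion, the duplicated pip installs could be collapsed into a single RUN layer. This is only a sketch; the exact pin set must match the final Dockerfile, and the versions below are simply the ones quoted in this conversation:

```dockerfile
# One pip invocation, one image layer: numpy pinned for neuronx-cc,
# Pillow/tensorboard bumped for CVE fixes, networkx downgraded.
RUN pip install -U \
    "tensorboard>=2.11.0" \
    "numpy>=1.24.3,<=1.25.2" \
    "numba==0.58.1" \
    "Pillow==10.3.0" \
    "networkx==2.6.3"
```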
RUN apt-get update \
Do all apt installs at the beginning; this simplifies the Dockerfile.
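A minimal sketch of what this consolidation could look like, with a single apt layer placed before any pip steps. The package names here are placeholders, not the PR's actual list:

```dockerfile
# Sketch only: one apt-get layer near the top of the Dockerfile,
# cleaning the package lists in the same layer to keep the image small.
RUN apt-get update \
 && apt-get install -y --no-install-recommends \
        git \
        curl \
 && rm -rf /var/lib/apt/lists/* \
 && apt-get clean
```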
Force-pushed from 038d1d1 to 64854ad
Can you explain what this is doing?
fix test_stray_files failure and install pytorch-lightning
Change numpy and torchvision versions
Issue #5273
transformers: 4.51.3
torch: 2.7.1
diffusers: 0.35.1
peft: 0.17.0
Note:
If merging this PR should also close the associated Issue, please also add that Issue # to the Linked Issues section on the right.
All PRs are checked weekly for staleness. This PR will be closed if not updated within 30 days.
Description
Tests Run
By default, docker image builds and tests are disabled. Two ways to run builds and tests:
How to use the helper utility for updating dlc_developer_config.toml
Assuming your remote is called origin (you can find out more with git remote -v)...
python src/prepare_dlc_dev_environment.py -b </path/to/buildspec.yml> -cp origin
python src/prepare_dlc_dev_environment.py -b </path/to/buildspec.yml> -t sanity_tests -cp origin
python src/prepare_dlc_dev_environment.py -rcp origin
NOTE: If you are creating a PR for a new framework version, please ensure success of the local, standard, rc, and efa sagemaker tests by updating the dlc_developer_config.toml file:
sagemaker_remote_tests = true
sagemaker_efa_tests = true
sagemaker_rc_tests = true
sagemaker_local_tests = true
How to use PR description
Use the code block below to uncomment commands and run the PR CodeBuild jobs. There are two commands available:
# /buildspec <buildspec_path>
# /buildspec pytorch/training/buildspec.yml
# /tests <test_list>
# /tests sanity security ec2
Valid test values: sanity, security, ec2, ecs, eks, sagemaker, sagemaker-local.
Formatting
I have run black -l 100 on my code (formatting tool: https://black.readthedocs.io/en/stable/getting_started.html)
PR Checklist
Expand
Pytest Marker Checklist
Expand
I have added @pytest.mark.model("<model-type>") to the new tests which I have added, to specify the Deep Learning model that is used in the test (use "N/A" if the test doesn't use a model)
I have added @pytest.mark.integration("<feature-being-tested>") to the new tests which I have added, to specify the feature that will be tested
I have added @pytest.mark.multinode(<integer-num-nodes>) to the new tests which I have added, to specify the number of nodes used on a multi-node test
I have added @pytest.mark.processor(<"cpu"/"gpu"/"eia"/"neuron">) to the new tests which I have added, if a test is specifically applicable to only one processor type
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license. I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.
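For illustration, the marker checklist above could look like this on a hypothetical test. The test name, model, and feature strings below are made up for the example and do not come from this PR:

```python
import pytest


# Hypothetical example applying the required markers to a single-processor,
# Neuron-specific test (use "N/A" for the model if the test uses none).
@pytest.mark.model("bert-base-cased")
@pytest.mark.integration("optimum_neuron_training")
@pytest.mark.processor("neuron")
def test_bert_training_runs():
    # The real test body would launch a training job inside the DLC;
    # this placeholder only demonstrates the marker usage.
    assert True
```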