
A convert BUG! #992

Open
@zzhdbw

Description


My torch version is 2.6.

When I run the script:
`xtuner convert pth_to_hf XX XX XX`

it raises an error, and I think it is caused by torch 2.6:

```
Traceback (most recent call last):
  File "/home/zzh/miniforge3/envs/xtuner-env/lib/python3.10/site-packages/xtuner/tools/model_converters/pth_to_hf.py", line 139, in <module>
    main()
  File "/home/zzh/miniforge3/envs/xtuner-env/lib/python3.10/site-packages/xtuner/tools/model_converters/pth_to_hf.py", line 112, in main
    state_dict = guess_load_checkpoint(args.pth_model)
  File "/home/zzh/miniforge3/envs/xtuner-env/lib/python3.10/site-packages/xtuner/model/utils.py", line 313, in guess_load_checkpoint
    state_dict = get_state_dict_from_zero_checkpoint(
  File "/home/zzh/miniforge3/envs/xtuner-env/lib/python3.10/site-packages/xtuner/utils/zero_to_any_dtype.py", line 617, in get_state_dict_from_zero_checkpoint
    return _get_state_dict_from_zero_checkpoint(ds_checkpoint_dir,
  File "/home/zzh/miniforge3/envs/xtuner-env/lib/python3.10/site-packages/xtuner/utils/zero_to_any_dtype.py", line 229, in _get_state_dict_from_zero_checkpoint
    zero_stage, world_size, flat_groups = parse_optim_states(
  File "/home/zzh/miniforge3/envs/xtuner-env/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
  File "/home/zzh/miniforge3/envs/xtuner-env/lib/python3.10/site-packages/xtuner/utils/zero_to_any_dtype.py", line 168, in parse_optim_states
    state_dict = torch.load(f, map_location=device)
  File "/home/zzh/miniforge3/envs/xtuner-env/lib/python3.10/site-packages/torch/serialization.py", line 1470, in load
    raise pickle.UnpicklingError(_get_wo_message(str(e))) from None
_pickle.UnpicklingError: Weights only load failed. This file can still be loaded, to do so you have two options, do those steps only if you trust the source of the checkpoint.
(1) In PyTorch 2.6, we changed the default value of the `weights_only` argument in `torch.load` from `False` to `True`. Re-running `torch.load` with `weights_only` set to `False` will likely succeed, but it can result in arbitrary code execution. Do it only if you got the file from a trusted source.
(2) Alternatively, to load with `weights_only=True` please check the recommended steps in the following error message.
WeightsUnpickler error: Unsupported global: GLOBAL deepspeed.runtime.fp16.loss_scaler.LossScaler was not an allowed global by default. Please use `torch.serialization.add_safe_globals([LossScaler])` or the `torch.serialization.safe_globals([LossScaler])` context manager to allowlist this global if you trust this class/function.

Check the documentation of torch.load to learn more about types accepted by default with weights_only https://pytorch.org/docs/stable/generated/torch.load.html.
```
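For reference, the error message itself points at two workarounds. Below is a minimal sketch of the allowlist route, assuming the checkpoint is from a trusted source and DeepSpeed is installed (the class path is taken from the traceback above); it would need to run before the `torch.load` call, for example near the top of `xtuner/utils/zero_to_any_dtype.py`:

```python
import torch.serialization
# Class path taken from the traceback above; requires deepspeed to be installed.
from deepspeed.runtime.fp16.loss_scaler import LossScaler

# PyTorch 2.6 switched the torch.load default to weights_only=True.
# Allowlisting LossScaler lets the DeepSpeed ZeRO optimizer states
# unpickle under the safe loader. Other classes may need to be added
# if further "Unsupported global" errors appear.
torch.serialization.add_safe_globals([LossScaler])
```

The other option mentioned in the message is to pass `weights_only=False` to the `torch.load(f, map_location=device)` call in `parse_optim_states`, which should only be done for checkpoints from a trusted source.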
