
ValueError: betas must be either both floats or both Tensors #638

@hungdaqq

Description

Traceback (most recent call last):
  File "/kaggle/working/GFPGAN/gfpgan/train.py", line 11, in <module>
    train_pipeline(root_path)
  File "/usr/local/lib/python3.12/dist-packages/basicsr/train.py", line 124, in train_pipeline
    model = build_model(opt)
            ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/basicsr/models/__init__.py", line 26, in build_model
    model = MODEL_REGISTRY.get(opt['model_type'])(opt)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/kaggle/working/GFPGAN/gfpgan/models/gfpgan_model.py", line 39, in __init__
    self.init_training_settings()
  File "/kaggle/working/GFPGAN/gfpgan/models/gfpgan_model.py", line 147, in init_training_settings
    self.setup_optimizers()
  File "/kaggle/working/GFPGAN/gfpgan/models/gfpgan_model.py", line 165, in setup_optimizers
    self.optimizer_g = self.get_optimizer(optim_type, optim_params_g, lr, betas=betas)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/basicsr/models/base_model.py", line 105, in get_optimizer
    optimizer = torch.optim.Adam(params, lr, **kwargs)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/dist-packages/torch/optim/adam.py", line 72, in __init__
    raise ValueError("betas must be either both floats or both Tensors")
ValueError: betas must be either both floats or both Tensors

I’m trying to train GFPGAN on Kaggle, but I keep hitting this error during optimizer initialization.
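This error comes from a type check that newer PyTorch 2.x releases added to `torch.optim.Adam.__init__`: both entries of `betas` must be the same type, either two Python floats or two Tensors. The likely trigger, assuming your checkout matches the `setup_optimizers` in the traceback, is the StyleGAN2-style expression GFPGAN uses to build the generator betas, roughly `(0**net_g_reg_ratio, 0.99**net_g_reg_ratio)`: when `net_g_reg_ratio` is the int `1`, `0**1` evaluates to the *int* `0` while `0.99**1` is a float, so the pair mixes types and the new check rejects it. A minimal sketch of the diagnosis and fix (variable names taken from the traceback; the exact expression in your copy of `gfpgan_model.py` may differ):

```python
# Sketch of the suspected bug in gfpgan/models/gfpgan_model.py,
# inside setup_optimizers() -- adapt to your checkout.

net_g_reg_ratio = 1  # assumed value for the generator path

# Original-style expression: int ** int stays an int,
# so the first beta is int 0 while the second is a float.
betas_old = (0**net_g_reg_ratio, 0.99**net_g_reg_ratio)
print([type(b).__name__ for b in betas_old])  # ['int', 'float'] -> rejected by new torch

# Fix: cast both entries to float so the pair is homogeneous
# and passes torch.optim.Adam's "both floats or both Tensors" check.
betas = (float(0**net_g_reg_ratio), float(0.99**net_g_reg_ratio))
print([type(b).__name__ for b in betas])      # ['float', 'float']
```

If this is the cause, the same cast is worth applying anywhere `setup_optimizers` builds betas from `**net_g_reg_ratio` / `**net_d_reg_ratio` expressions; alternatively, pinning an older PyTorch (pre-2.x strict check) avoids the error without touching the source.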
