Model load errors suppressed, leading to UnboundLocalError #166

@GregoryComer

Description

When the _load_eager_pretrained call in optimum/exporters/executorch/tasks/causal_lm.py fails, the resulting ValueError is caught. There is logic to handle SDPA errors (see below), but if the error doesn't match that case, it is silently dropped. The program continues and then hits an UnboundLocalError when it accesses eager_model, which was never assigned. This hides the actual error, which contains the useful diagnostic information.

    except ValueError as e:
        if "torch.nn.functional.scaled_dot_product_attention" in str(e):
            logging.info("⚠ SDPA attention not supported, falling back to eager implementation")
            attn_implementation = "eager"
            eager_model = _load_eager_pretrained(
                model_name_or_path,
                device,
                dtype,
                config,
                attn_implementation,
                cache_implementation,
                batch_size,
                max_length,
            )

Example error:

Traceback (most recent call last):
  File "/home/gjcomer/.conda/envs/et-rc3/bin/optimum-cli", line 7, in <module>
    sys.exit(main())
             ^^^^^^
  File "/home/gjcomer/.conda/envs/et-rc3/lib/python3.12/site-packages/optimum/commands/optimum_cli.py", line 208, in main
    service.run()
  File "/home/gjcomer/.conda/envs/et-rc3/lib/python3.12/site-packages/optimum/commands/export/executorch.py", line 181, in run
    main_export(
  File "/home/gjcomer/.conda/envs/et-rc3/lib/python3.12/site-packages/optimum/exporters/executorch/__main__.py", line 138, in main_export
    model = task_func(model_name_or_path, **kwargs)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/gjcomer/.conda/envs/et-rc3/lib/python3.12/site-packages/optimum/exporters/executorch/tasks/causal_lm.py", line 131, in load_causal_lm_model
    for param in eager_model.parameters():
                 ^^^^^^^^^^^
UnboundLocalError: cannot access local variable 'eager_model' where it is not associated with a value

In this case, updating the optimum-executorch code to propagate the underlying exception showed the real error:

ValueError: Unrecognized configuration class <class 'transformers.models.idefics3.configuration_idefics3.Idefics3Config'> for this kind of AutoModel: AutoModelForCausalLM.

Suggested Resolution

Re-throw (or at least log) the error in the except block above when it does not match the SDPA case, so the underlying failure is surfaced instead of being masked by the later UnboundLocalError.
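A minimal sketch of the suggested fix, with the model-loading call stubbed out as a generic `load_fn` parameter (the real code calls `_load_eager_pretrained` with its full argument list): only the SDPA-specific ValueError triggers the eager fallback; any other error is re-raised instead of being silently dropped.

```python
import logging


def load_model_with_fallback(load_fn, attn_implementation="sdpa"):
    """Illustrative wrapper (hypothetical helper, not part of optimum-executorch).

    Swallow only the SDPA-specific ValueError and retry with the eager
    attention implementation; re-raise anything else so the real failure
    (e.g. an unsupported configuration class) is not hidden behind a
    later UnboundLocalError.
    """
    try:
        return load_fn(attn_implementation)
    except ValueError as e:
        if "torch.nn.functional.scaled_dot_product_attention" in str(e):
            logging.info("⚠ SDPA attention not supported, falling back to eager implementation")
            return load_fn("eager")
        # Any other ValueError propagates with its original message intact.
        raise
```

With this shape, the Idefics3Config error shown above would surface immediately at the load call rather than as an UnboundLocalError further down.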
