Describe the bug
When training the EfficientAD model on Kaggle, a RecursionError: maximum recursion depth exceeded occurs in the rich library's console printing mechanism. The error happens during the pretrained-weights download phase, when the tqdm progress bar attempts to write output.
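A possible workaround (a sketch only, based on the traceback further below, which shows tqdm flushing through rich's FileProxy into IPython display and back into stderr): restore the original standard streams before training so tqdm writes directly to stderr instead of re-entering rich.
import sys

# Unverified sketch: in the notebook, rich wraps sys.stdout/sys.stderr with FileProxy
# objects; tqdm's flush then re-enters rich's console.print, which flushes stderr again
# and recurses. Restoring the original streams before engine.fit() should bypass that
# loop (engine, datamodule and model refer to the configuration shown below).
sys.stdout = sys.__stdout__
sys.stderr = sys.__stderr__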
Dataset: Folder
Model: Other (please specify in the field below)
Steps to reproduce the behavior
- Initialize an EfficientAD model with Anomalib
- Start the training process
- Error occurs during prepare_pretrained_model() when downloading weights (see the condensed sketch after this list)
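A condensed sketch of the failing step, assuming prepare_pretrained_model() can be called outside the training loop (from the traceback below it only touches the filesystem and download_and_extract()):
from anomalib.models import EfficientAd

# Sketch only: calling prepare_pretrained_model() directly should trigger the same
# weight download, and hence the same tqdm output path, without running a full fit().
model = EfficientAd()
model.prepare_pretrained_model()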
OS information
OS information: Kaggle
Expected behavior
The EfficientAD model should successfully download the pretrained weights from the specified URL using the download_and_extract function.
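As a temporary workaround, pre-populating ./pre_trained/ before calling engine.fit() should make prepare_pretrained_model() skip the download entirely. A sketch, with import locations inferred from the traceback below rather than verified against 2.2.0:
from pathlib import Path

from anomalib.data.utils import download_and_extract  # location inferred from the traceback
from anomalib.models.image.efficient_ad.lightning_model import WEIGHTS_DOWNLOAD_INFO  # assumed to be importable

# prepare_pretrained_model() skips the download when this directory already exists,
# so running the download once before fit() keeps tqdm away from the rich stream
# proxy that is active during training in the notebook.
pretrained_models_dir = Path("./pre_trained/")
if not (pretrained_models_dir / "efficientad_pretrained_weights").is_dir():
    download_and_extract(pretrained_models_dir, WEIGHTS_DOWNLOAD_INFO)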
Screenshots
No response
Pip/GitHub
pip
What version/branch did you use?
2.2.0
Configuration YAML
model = EfficientAd(
    model_size=EfficientAdModelSize.S,
    teacher_out_channels=384,  # Number of teacher output channels
    padding=False,
    pad_maps=True,
    lr=1e-4,
)
early_stopping = EarlyStopping(
    monitor="train_loss_step",
    mode="min",
    patience=8,
    check_finite=True,
)
engine = Engine(
    max_epochs=100,
    # callbacks=[early_stopping],
)
engine.fit(datamodule=datamodule, model=model)
Logs
/usr/local/lib/python3.10/dist-packages/timm/models/layers/__init__.py:48: FutureWarning: Importing from timm.models.layers is deprecated, please import via timm.layers
warnings.warn(f"Importing from {__name__} is deprecated, please import via timm.layers", FutureWarning)
┏━━━┳━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━┳━━━━━━━━┳━━━━━━━┳━━━━━━━┓
┃ ┃ Name ┃ Type ┃ Params ┃ Mode ┃ FLOPs ┃
┡━━━╇━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━╇━━━━━━━━╇━━━━━━━╇━━━━━━━┩
│ 0 │ pre_processor │ PreProcessor │ 0 │ train │ 0 │
│ 1 │ post_processor │ PostProcessor │ 0 │ train │ 0 │
│ 2 │ evaluator │ Evaluator │ 0 │ train │ 0 │
│ 3 │ model │ EfficientAdModel │ 8.1 M │ train │ 0 │
└───┴────────────────┴──────────────────┴────────┴───────┴───────┘
Trainable params: 8.1 M
Non-trainable params: 0
Total params: 8.1 M
Total estimated model params size (MB): 32
Modules in train mode: 48
Modules in eval mode: 7
Total FLOPs: 0
/usr/local/lib/python3.10/dist-packages/lightning/pytorch/loops/fit_loop.py:534: Found 7 module(s) in eval mode at the start of training. This may lead to unexpected behavior during training. If this is intentional, you can ignore this warning.
efficientad_pretrained_weights.zip: 0.00B [00:00, ?B/s]
efficientad_pretrained_weights.zip: 0.00B [00:00, ?B/s]
[... the progress-bar line above repeats roughly 260 more times in the original log ...]
---------------------------------------------------------------------------
RecursionError Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/rich/console.py in print(self, sep, end, style, justify, overflow, no_wrap, emoji, markup, highlight, width, height, crop, soft_wrap, new_line_start, *objects)
1704 for renderable in renderables:
-> 1705 extend(render(renderable, render_options))
1706 else:
/usr/local/lib/python3.10/dist-packages/rich/console.py in render(self, renderable, options)
1325 _options = _options.reset_height()
-> 1326 for render_output in iter_render:
1327 if isinstance(render_output, _Segment):
/usr/local/lib/python3.10/dist-packages/rich/text.py in __rich_console__(self, console, options)
705 all_lines = Text("\n").join(lines)
--> 706 yield from all_lines.render(console, end=self.end)
707
/usr/local/lib/python3.10/dist-packages/rich/text.py in render(self, console, end)
774 if next_offset > offset:
--> 775 yield _Segment(text[offset:next_offset], get_current_style())
776 if end:
/usr/local/lib/python3.10/dist-packages/rich/text.py in get_current_style()
764 return cached_style
--> 765 current_style = combine(styles)
766 style_cache[styles] = current_style
/usr/local/lib/python3.10/dist-packages/rich/style.py in combine(cls, styles)
610 iter_styles = iter(styles)
--> 611 return sum(iter_styles, next(iter_styles))
612
/usr/local/lib/python3.10/dist-packages/rich/style.py in __add__(self, style)
757 def __add__(self, style: Optional["Style"]) -> "Style":
--> 758 combined_style = self._add(style)
759 return combined_style.copy() if combined_style.link else combined_style
/usr/local/lib/python3.10/dist-packages/rich/style.py in __eq__(self, other)
422 def __eq__(self, other: Any) -> bool:
--> 423 if not isinstance(other, Style):
424 return NotImplemented
RecursionError: maximum recursion depth exceeded while calling a Python object
During handling of the above exception, another exception occurred:
RecursionError Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/IPython/core/formatters.py in catch_format_error(method, self, *args, **kwargs)
223 try:
--> 224 r = method(self, *args, **kwargs)
225 except NotImplementedError:
/usr/local/lib/python3.10/dist-packages/IPython/core/formatters.py in __call__(self, obj)
905 """Compute the format for an object."""
--> 906 if self.enabled:
907 # lookup registered printer
/usr/local/lib/python3.10/dist-packages/traitlets/traitlets.py in __get__(self, obj, cls)
699 else:
--> 700 return self.get(obj, cls)
701
RecursionError: maximum recursion depth exceeded
During handling of the above exception, another exception occurred:
RecursionError Traceback (most recent call last)
<ipython-input-3-3445dc4b50ba> in <cell line: 111>()
109
110 # 5. Train the model
--> 111 engine.fit(datamodule=datamodule, model=model)
112
113
/usr/local/lib/python3.10/dist-packages/anomalib/engine/engine.py in fit(self, model, train_dataloaders, val_dataloaders, datamodule, ckpt_path)
410 self.trainer.validate(model, val_dataloaders, datamodule=datamodule, ckpt_path=ckpt_path)
411 else:
--> 412 self.trainer.fit(model, train_dataloaders, val_dataloaders, datamodule, ckpt_path)
413
414 def validate(
/usr/local/lib/python3.10/dist-packages/lightning/pytorch/trainer/trainer.py in fit(self, model, train_dataloaders, val_dataloaders, datamodule, ckpt_path, weights_only)
582 self.training = True
583 self.should_stop = False
--> 584 call._call_and_handle_interrupt(
585 self,
586 self._fit_impl,
/usr/local/lib/python3.10/dist-packages/lightning/pytorch/trainer/call.py in _call_and_handle_interrupt(trainer, trainer_fn, *args, **kwargs)
47 if trainer.strategy.launcher is not None:
48 return trainer.strategy.launcher.launch(trainer_fn, *args, trainer=trainer, **kwargs)
---> 49 return trainer_fn(*args, **kwargs)
50
51 except _TunerExitException:
/usr/local/lib/python3.10/dist-packages/lightning/pytorch/trainer/trainer.py in _fit_impl(self, model, train_dataloaders, val_dataloaders, datamodule, ckpt_path, weights_only)
628 model_connected=self.lightning_module is not None,
629 )
--> 630 self._run(model, ckpt_path=ckpt_path, weights_only=weights_only)
631
632 assert self.state.stopped
/usr/local/lib/python3.10/dist-packages/lightning/pytorch/trainer/trainer.py in _run(self, model, ckpt_path, weights_only)
1077 # RUN THE TRAINER
1078 # ----------------------------
-> 1079 results = self._run_stage()
1080
1081 # ----------------------------
/usr/local/lib/python3.10/dist-packages/lightning/pytorch/trainer/trainer.py in _run_stage(self)
1121 self._run_sanity_check()
1122 with torch.autograd.set_detect_anomaly(self._detect_anomaly):
-> 1123 self.fit_loop.run()
1124 return None
1125 raise RuntimeError(f"Unexpected state {self.state}")
/usr/local/lib/python3.10/dist-packages/lightning/pytorch/loops/fit_loop.py in run(self)
211 return
212 self.reset()
--> 213 self.on_run_start()
214 while not self.done:
215 try:
/usr/local/lib/python3.10/dist-packages/lightning/pytorch/loops/fit_loop.py in on_run_start(self)
426
427 call._call_callback_hooks(trainer, "on_train_start")
--> 428 call._call_lightning_module_hook(trainer, "on_train_start")
429 call._call_strategy_hook(trainer, "on_train_start")
430
/usr/local/lib/python3.10/dist-packages/lightning/pytorch/trainer/call.py in _call_lightning_module_hook(trainer, hook_name, pl_module, *args, **kwargs)
175
176 with trainer.profiler.profile(f"[LightningModule]{pl_module.__class__.__name__}.{hook_name}"):
--> 177 output = fn(*args, **kwargs)
178
179 # restore current_fx when nested context
/usr/local/lib/python3.10/dist-packages/anomalib/models/image/efficient_ad/lightning_model.py in on_train_start(self)
375 sample = next(iter(self.trainer.train_dataloader))
376 image_size = sample.image.shape[-2:]
--> 377 self.prepare_pretrained_model()
378 self.prepare_imagenette_data(image_size)
379 if not self.model.is_set(self.model.mean_std):
/usr/local/lib/python3.10/dist-packages/anomalib/models/image/efficient_ad/lightning_model.py in prepare_pretrained_model(self)
162 pretrained_models_dir = Path("./pre_trained/")
163 if not (pretrained_models_dir / "efficientad_pretrained_weights").is_dir():
--> 164 download_and_extract(pretrained_models_dir, WEIGHTS_DOWNLOAD_INFO)
165 model_size_str = self.model_size.value if isinstance(self.model_size, EfficientAdModelSize) else self.model_size
166 teacher_path = (
/usr/local/lib/python3.10/dist-packages/anomalib/data/utils/download.py in download_and_extract(root, info)
309 # audit url. allowing only http:// or https://
310 if info.url.startswith("http://") or info.url.startswith("https://"):
--> 311 with DownloadProgressBar(unit="B", unit_scale=True, miniters=1, desc=info.name) as progress_bar:
312 # nosemgrep: python.lang.security.audit.dynamic-urllib-use-detected.dynamic-urllib-use-detected # noqa: ERA001, E501
313 urlretrieve( # noqa: S310 # nosec B310
/usr/local/lib/python3.10/dist-packages/anomalib/data/utils/download.py in __init__(self, iterable, desc, total, leave, file, ncols, mininterval, maxinterval, miniters, use_ascii, disable, unit, unit_scale, dynamic_ncols, smoothing, bar_format, initial, position, postfix, unit_divisor, write_bytes, lock_args, nrows, colour, delay, gui, **kwargs)
118 **kwargs,
119 ) -> None:
--> 120 super().__init__(
121 iterable=iterable,
122 desc=desc,
/usr/local/lib/python3.10/dist-packages/tqdm/std.py in __init__(self, iterable, desc, total, leave, file, ncols, mininterval, maxinterval, miniters, ascii, disable, unit, unit_scale, dynamic_ncols, smoothing, bar_format, initial, position, postfix, unit_divisor, write_bytes, lock_args, nrows, colour, delay, gui, **kwargs)
1096 self.sp = self.status_printer(self.fp)
1097 if delay <= 0:
-> 1098 self.refresh(lock_args=self.lock_args)
1099
1100 # Init the time counter
/usr/local/lib/python3.10/dist-packages/tqdm/std.py in refresh(self, nolock, lock_args)
1345 else:
1346 self._lock.acquire()
-> 1347 self.display()
1348 if not nolock:
1349 self._lock.release()
/usr/local/lib/python3.10/dist-packages/tqdm/std.py in display(self, msg, pos)
1493 if pos:
1494 self.moveto(pos)
-> 1495 self.sp(self.__str__() if msg is None else msg)
1496 if pos:
1497 self.moveto(-pos)
/usr/local/lib/python3.10/dist-packages/tqdm/std.py in print_status(s)
457 def print_status(s):
458 len_s = disp_len(s)
--> 459 fp_write('\r' + s + (' ' * max(last_len[0] - len_s, 0)))
460 last_len[0] = len_s
461
/usr/local/lib/python3.10/dist-packages/tqdm/std.py in fp_write(s)
451 def fp_write(s):
452 fp.write(str(s))
--> 453 fp_flush()
454
455 last_len = [0]
/usr/local/lib/python3.10/dist-packages/tqdm/utils.py in inner(*args, **kwargs)
194 def inner(*args, **kwargs):
195 try:
--> 196 return func(*args, **kwargs)
197 except OSError as e:
198 if e.errno != 5:
/usr/local/lib/python3.10/dist-packages/rich/file_proxy.py in flush(self)
51 output = "".join(self.__buffer)
52 if output:
---> 53 self.__console.print(output)
54 del self.__buffer[:]
55
/usr/local/lib/python3.10/dist-packages/rich/console.py in print(self, sep, end, style, justify, overflow, no_wrap, emoji, markup, highlight, width, height, crop, soft_wrap, new_line_start, *objects)
1676 crop = False
1677 render_hooks = self._render_hooks[:]
-> 1678 with self:
1679 renderables = self._collect_renderables(
1680 objects,
/usr/local/lib/python3.10/dist-packages/rich/console.py in __exit__(self, exc_type, exc_value, traceback)
862 def __exit__(self, exc_type: Any, exc_value: Any, traceback: Any) -> None:
863 """Exit buffer context."""
--> 864 self._exit_buffer()
865
866 def begin_capture(self) -> None:
/usr/local/lib/python3.10/dist-packages/rich/console.py in _exit_buffer(self)
820 """Leave buffer context, and render content if required."""
821 self._buffer_index -= 1
--> 822 self._check_buffer()
823
824 def set_live(self, live: "Live") -> None:
/usr/local/lib/python3.10/dist-packages/rich/console.py in _check_buffer(self)
2017
2018 try:
-> 2019 self._write_buffer()
2020 except BrokenPipeError:
2021 self.on_broken_pipe()
/usr/local/lib/python3.10/dist-packages/rich/console.py in _write_buffer(self)
2033 from .jupyter import display
2034
-> 2035 display(self._buffer, self._render_buffer(self._buffer[:]))
2036 del self._buffer[:]
2037 else:
/usr/local/lib/python3.10/dist-packages/rich/jupyter.py in display(segments, text)
89 from IPython.display import display as ipython_display
90
---> 91 ipython_display(jupyter_renderable)
92 except ModuleNotFoundError:
93 # Handle the case where the Console has force_jupyter=True,
/usr/local/lib/python3.10/dist-packages/IPython/core/display.py in display(include, exclude, metadata, transient, display_id, *objs, **kwargs)
325 # kwarg-specified metadata gets precedence
326 _merge(md_dict, metadata)
--> 327 publish_display_data(data=format_dict, metadata=md_dict, **kwargs)
328 if display_id:
329 return DisplayHandle(display_id)
/usr/local/lib/python3.10/dist-packages/IPython/core/display.py in publish_display_data(data, metadata, source, transient, **kwargs)
117 kwargs['transient'] = transient
118
--> 119 display_pub.publish(
120 data=data,
121 metadata=metadata,
/usr/local/lib/python3.10/dist-packages/ipykernel/zmqshell.py in publish(self, data, metadata, source, transient, update)
113 If True, send an update_display_data message instead of display_data.
114 """
--> 115 self._flush_streams()
116 if metadata is None:
117 metadata = {}
/usr/local/lib/python3.10/dist-packages/ipykernel/zmqshell.py in _flush_streams(self)
81 """flush IO Streams prior to display"""
82 sys.stdout.flush()
---> 83 sys.stderr.flush()
84
85 @default('_thread_local')
... last 11 frames repeated, from the frame below ...
/usr/local/lib/python3.10/dist-packages/rich/file_proxy.py in flush(self)
51 output = "".join(self.__buffer)
52 if output:
---> 53 self.__console.print(output)
54 del self.__buffer[:]
55
RecursionError: maximum recursion depth exceeded while calling a Python object
Code of Conduct
- I agree to follow this project's Code of Conduct