Batch Size Inference Warning during UniNet Training. #3107
When training UniNet on MVTecAD (category: bottle), I'm encountering a warning about batch size inference, even though `train_batch_size=8` and `eval_batch_size=8` are explicitly configured in the data module.

Reproduction Code:

Do I need to follow the warning's suggestion and fix this?
Replies: 1 comment 1 reply
Hello @1999swallowthemoon

This warning is quite common, and first and foremost it is neither a problem nor something you need to fix. It originates from the model implementation: the Lightning `Trainer` does not receive an explicit batch size value, so it 'guesses' one. In my experience it has always guessed correctly, though the warning may be addressed in a future pull request.
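For anyone curious where the guess comes from: when a model calls `self.log()` without an explicit `batch_size=` argument, Lightning walks the batch structure looking for the first tensor-like object and takes its first dimension as the batch size, warning if the structure is ambiguous. The sketch below is a simplified, hypothetical mimic of that kind of inference written in plain Python; it is not Lightning's actual implementation.

```python
from typing import Any, Optional


def infer_batch_size(batch: Any) -> Optional[int]:
    """Guess a batch size from a (possibly nested) batch structure.

    Hypothetical, simplified mimic of the inference Lightning performs
    when ``self.log()`` is called without ``batch_size=``.
    """
    if hasattr(batch, "shape"):  # tensor-like leaf: use its first dimension
        return batch.shape[0]
    if isinstance(batch, dict):  # search values left to right
        for value in batch.values():
            size = infer_batch_size(value)
            if size is not None:
                return size
        return None
    if isinstance(batch, (list, tuple)):  # search elements left to right
        for item in batch:
            size = infer_batch_size(item)
            if size is not None:
                return size
        return None
    return None  # ambiguous: this is roughly where Lightning would warn
```

In a case like this the guess matches the configured `train_batch_size`, so the warning is cosmetic. Silencing it would mean passing `batch_size=` to the `self.log()` calls inside the model implementation, which is a change to the library rather than to user code.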