Hello,
I noticed that max_steps is set to 1000 in many examples and in the default settings.
I am currently working with a dataset containing 20,000 time series, each with a length of 400 timestamps. I have a few questions regarding the training configuration:
max_steps vs n_epochs: Could you explain why NeuralForecast uses max_steps to bound training rather than a number of epochs (n_epochs)?
Hyperparameter adjustment: If I set batch_size=1024, then given the size of my dataset the model may not traverse the full data (or complete enough epochs) within 1000 steps. In that scenario, is it recommended to increase max_steps?
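For context on my second question, here is the rough arithmetic I used to relate steps to passes over the data. This treats one "epoch" as one batched pass over the 20,000 series; since NeuralForecast samples training windows from each series, this is only an approximation, not the library's actual bookkeeping:

```python
import math

n_series = 20_000
batch_size = 1_024
max_steps = 1_000

# Steps needed for one batched pass over all 20,000 series.
steps_per_epoch = math.ceil(n_series / batch_size)  # 20

# Approximate number of full passes completed within max_steps.
approx_epochs = max_steps // steps_per_epoch  # 50

print(steps_per_epoch, approx_epochs)
```

Since each series is 400 timestamps long and contributes many training windows, covering every window would take correspondingly more steps than this estimate suggests.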
Validation loss: Is there a built-in method or attribute for accessing the validation loss curve during or after training?
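As a fallback for my third question, I have been considering parsing the metrics file written by a Lightning-style CSV logger. A minimal sketch, assuming metrics land in a metrics.csv file; the column names ("step", "valid_loss") are assumptions and may differ in the actual log:

```python
import csv
import io

# Hypothetical excerpt of a Lightning-style metrics.csv; train and
# validation rows are interleaved, with blanks for the absent metric.
metrics_csv = """step,train_loss,valid_loss
100,0.92,
100,,0.88
200,0.75,
200,,0.71
"""

def valid_loss_curve(text):
    """Return (step, valid_loss) pairs, skipping rows without a value."""
    curve = []
    for row in csv.DictReader(io.StringIO(text)):
        if row.get("valid_loss"):
            curve.append((int(row["step"]), float(row["valid_loss"])))
    return curve

print(valid_loss_curve(metrics_csv))  # [(100, 0.88), (200, 0.71)]
```

If there is a supported attribute on the model or trainer that exposes this directly, I would prefer that over parsing log files.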
Thanks for your help!