When do you stop training a Neural Network?
1. Early cutoff based on training_loss 2. Early cutoff based on validation_loss 3. Set higher regularization? Increasing data is not an option; I have already used SMOTE-related techniques to interpolate more data for the imbalanced classes. Just trying to understand what other folks do.
This is an old project. I was looking through the TensorBoard logs for a run I did in 2021, during the pandemic.
Ideally you should watch the validation loss, since that is truly held-out data and the closest proxy for new, unseen real-world data.
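In practice this is usually a patience-based early-stopping rule: stop once the validation loss has not improved for some number of epochs, then restore the best checkpoint. A minimal sketch (the `should_stop`, `patience`, and `min_delta` names are illustrative, not from any specific library):

```python
def should_stop(val_losses, patience=3, min_delta=0.0):
    """Return True if validation loss hasn't improved by at least
    `min_delta` within the last `patience` epochs.

    `val_losses` is the per-epoch validation loss history so far.
    """
    if len(val_losses) <= patience:
        return False  # not enough history to judge yet
    best_before = min(val_losses[:-patience])   # best loss seen earlier
    recent_best = min(val_losses[-patience:])   # best in the patience window
    return recent_best > best_before - min_delta

# Inside a (hypothetical) training loop you would also checkpoint the
# weights at the best epoch, and reload them when stopping:
#
# if should_stop(history): restore best weights and break
```

Frameworks ship this as a ready-made callback (e.g. Keras `EarlyStopping` with `patience`, `min_delta`, and `restore_best_weights`), so you rarely need to hand-roll it.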
> When do you stop training a Neural Network
When it gains sentience