Question

I have a neural network that starts to overfit: the validation loss begins to increase while the training loss stays roughly flat across epochs.

Is there a generic algorithm, obvious or otherwise, well-known or not, to stop the training early when overfitting is detected?

I note that CatBoost implements such an algorithm, but I have found it nowhere else.

https://catboost.ai/docs/concepts/overfitting-detector.html

Is this simply a matter of rolling my own callback function and stopping when the training and validation losses start to diverge?

Preference for TF, Keras, python3, ...

Thanks as ever


Solution

Sounds like you're just looking for Keras's EarlyStopping callback, which stops training when the validation loss has not improved for N epochs (the patience parameter). It's the same idea as the Iter overfitting detector in CatBoost.
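A minimal sketch of wiring it up in Keras; the model architecture, the patience value, and the toy data below are illustrative choices, not part of the original answer:

```python
import numpy as np
import tensorflow as tf

# EarlyStopping watches a metric and halts fit() when it stops improving.
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss",          # watch the validation loss
    patience=5,                  # stop after 5 epochs with no improvement
    restore_best_weights=True,   # roll back to the best epoch's weights
)

# Toy model and data just to make the example runnable.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(10,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

x = np.random.rand(256, 10).astype("float32")
y = np.random.rand(256, 1).astype("float32")

history = model.fit(
    x, y,
    validation_split=0.2,
    epochs=100,
    callbacks=[early_stop],
    verbose=0,
)

# Training halts as soon as val_loss fails to improve for `patience` epochs,
# so the number of epochs actually run is usually well under 100.
print(len(history.history["loss"]))
```

With `restore_best_weights=True` the model is left at the weights from the best validation epoch, not the last one, which is usually what you want when the divergence you describe has already begun.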

Licensed under: CC-BY-SA with attribution