Question

I have a neural network that is starting to overfit: the validation loss begins to increase while the training loss stays roughly flat across epochs.

Is there a generic algorithm - obvious or otherwise, well-known or not - for stopping training early when overfitting is detected?

I note that catboost implements such an algorithm but I have found it nowhere else.

https://catboost.ai/docs/concepts/overfitting-detector.html

Is this all simply a matter of rolling my own callback function and stopping when the training and validation losses start to diverge?

Preference for TF, Keras, python3, ...

Thanks as ever


Solution

Sounds like you're just looking for Keras's EarlyStopping callback, which stops training when the validation loss has not improved for N consecutive epochs. It's the same idea as the Iter overfitting detector in CatBoost.
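In Keras you would pass `tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=N, restore_best_weights=True)` in the `callbacks` list of `model.fit`. The underlying rule is simple enough to sketch in plain Python (a minimal illustration of the patience logic, not the actual Keras implementation; the function name and defaults here are my own):

```python
def early_stop_epoch(val_losses, patience=3, min_delta=0.0):
    """Return the epoch index at which training would stop, or None.

    Stops once the validation loss has failed to improve by at least
    `min_delta` for `patience` consecutive epochs - the same rule
    EarlyStopping applies via its `patience` and `min_delta` arguments.
    """
    best = float("inf")
    wait = 0  # epochs since the last improvement
    for epoch, loss in enumerate(val_losses):
        if loss < best - min_delta:
            best = loss
            wait = 0
        else:
            wait += 1
            if wait >= patience:
                return epoch
    return None  # training ran to completion

# Validation loss falls, then rises: stop after 3 non-improving epochs.
print(early_stop_epoch([1.0, 0.8, 0.7, 0.72, 0.75, 0.8], patience=3))  # -> 5
```

With `restore_best_weights=True`, Keras additionally rolls the model back to the weights from the epoch with the best monitored value, so you keep the pre-divergence model rather than the last one trained.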

License: CC-BY-SA with attribution
Not affiliated with datascience.stackexchange