Question

I built an LSTM classification model and it seemed to be working as intended until I increased the number of epochs. That's when I noticed that the validation and training curves cross each other at a certain point:

[Figure: learning curve of training and validation loss]

I have never seen this kind of curve before, and I couldn't find anything like it in my searches.

Does it mean my model is over-fitting?

Should I worry about this behavior?

What can I do to avoid this?

Thanks.


Solution

Yes, this is overfitting. As you can see in your loss curve, the training loss is steadily decreasing as it should, but at the same time your validation loss is increasing. On data the model hasn't seen before, it's actually getting worse!

This means your model is starting to memorize the training data instead of learning patterns that generalize. As soon as the validation loss starts increasing, you should stop training; see the sketch below.
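In practice you don't have to watch the curve and stop manually: most frameworks provide an early-stopping callback that monitors validation loss for you. Here is a minimal sketch using Keras, assuming a tf.keras LSTM; the data shapes, layer sizes, and patience value are placeholders, not your actual model:

```python
# Minimal early-stopping sketch for a Keras LSTM classifier.
# The toy data and architecture below are illustrative placeholders.
import numpy as np
import tensorflow as tf

# Toy data: 1000 sequences of length 20 with 8 features, binary labels.
X = np.random.rand(1000, 20, 8).astype("float32")
y = np.random.randint(0, 2, size=(1000,))

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(20, 8)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

# Stop as soon as validation loss stops improving, and roll back to the
# weights from the best epoch instead of the last (overfit) one.
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss",
    patience=3,                 # tolerate a few noisy epochs before stopping
    restore_best_weights=True,
)

model.fit(X, y, validation_split=0.2, epochs=100, callbacks=[early_stop])
```

With `restore_best_weights=True`, the model you end up with is the one from the epoch where validation loss was lowest, which is exactly the point just before the curves in your plot start to diverge.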

Although overfitting is a general concept that has been answered before, I'm answering here because this question may deserve an answer specific to LSTMs.

Licensed under: CC-BY-SA with attribution
Not affiliated with datascience.stackexchange