Question

I'm working with an LSTM network to predict the surface roughness caused by biogrowth on ship hulls. I have a network that fits my input data reasonably well; the problem arises when I use it to predict future roughness values. I have three scenarios: one where the hull is not cleaned, a second where the hull is cleaned, and a third where the hull is repainted. Based on the inputs, the expected behaviour is that the trend drops slightly when the hull is cleaned and considerably when it is repainted.

However, when I use the same inputs to predict roughness for the different cases, the outputs differ in the region where they should be identical. For example, the model predicts an upward trend in the case where the hull is cleaned, but not in the case where it isn't. I assume the model is picking up on the fact that a cleaning occurs later in the sequence and inferring that the trend must rise beforehand, but that doesn't reflect reality. I've tried swapping the LSTM for a basic feed-forward NN, but the problem persists. Any help would be appreciated.



Solution

The problem was that I was fitting a separate MinMaxScaler to each case, so the inputs were scaled differently across the three scenarios, and identical raw values mapped to different scaled values. Once I fitted a single scaler and applied the same scaling to all cases, the issue was resolved.
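A minimal sketch of the bug and the fix, using illustrative roughness numbers and a hand-rolled min-max transform in place of sklearn's `MinMaxScaler` (the variable names and values below are hypothetical, not from the original setup):

```python
import numpy as np

# Hypothetical roughness series for three scenarios; the first three
# time steps are identical, then cleaning/repainting diverge.
no_clean = np.array([10.0, 14.0, 19.0, 25.0, 32.0])
cleaned = np.array([10.0, 14.0, 19.0, 12.0, 15.0])   # cleaning drops roughness
repaint = np.array([10.0, 14.0, 19.0, 2.0, 4.0])     # repainting drops it further

def fit_minmax(x):
    """Return the (min, max) that define the scaling."""
    return x.min(), x.max()

def transform(x, lo, hi):
    """Map x onto [0, 1] using the fitted min/max."""
    return (x - lo) / (hi - lo)

# WRONG: a separate scaler per case. Each series gets its own min/max,
# so the shared first three raw values no longer scale to the same numbers.
a_bad = transform(no_clean, *fit_minmax(no_clean))
b_bad = transform(cleaned, *fit_minmax(cleaned))

# RIGHT: fit one scaler on data covering all cases, then reuse it everywhere.
lo, hi = fit_minmax(np.concatenate([no_clean, cleaned, repaint]))
a_good = transform(no_clean, lo, hi)
b_good = transform(cleaned, lo, hi)
# Now a_good[:3] == b_good[:3]: identical inputs scale identically,
# so the model sees the same history in every scenario.
```

With sklearn the same idea is to call `scaler.fit` once (on data spanning all cases) and then only `scaler.transform` on each scenario, rather than `fit_transform` per case.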

Licensed under: CC-BY-SA with attribution
Not affiliated with datascience.stackexchange