Question

I want to predict a multivariate time series $a_1(t),...,a_k(t)$, where the target is $a_k(t)$. I use the following Keras LSTM:

from keras.models import Sequential
from keras.layers import LSTM, Dense

model = Sequential()
model.add(LSTM(90, return_sequences=True, input_shape=(train_X.shape[1], train_X.shape[2])))
model.add(LSTM(90))
model.add(Dense(1))

I use $a_1(t),...,a_{k-1}(t)$ as input and $a_k(t)$ as output to train it, so I don't use delayed inputs like $a_s(t-l)$. My question is: in this situation, does the LSTM work like a plain deep neural network? I.e., is it the same as the following Keras net?

model = Sequential()
model.add(Dense(90, input_dim=12))
model.add(Dense(90))
model.add(Dense(1))

Solution

LSTMs are RNNs with a memory cell
It can be difficult to train standard RNNs on problems that require learning long-term temporal dependencies.
LSTM units include a 'memory cell' that can maintain information for long periods of time. A set of gates controls when information enters the memory, when it is output, and when it is forgotten. This architecture lets LSTMs learn longer-term dependencies.
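
For reference, this is the standard LSTM cell update (the usual textbook formulation, of which the Keras LSTM layer above implements a variant), where $x_t$ is the input at step $t$, $h_t$ the hidden state, and $c_t$ the memory cell:

$$
\begin{aligned}
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) && \text{(forget gate)} \\
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) && \text{(input gate)} \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) && \text{(output gate)} \\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) && \text{(candidate memory)} \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t \\
h_t &= o_t \odot \tanh(c_t)
\end{aligned}
$$

Note that $h_{t-1}$ and $c_{t-1}$ appear in every gate: the recurrence is built into the cell itself, independent of whether you feed delayed inputs.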

So even without delayed inputs, an LSTM is still an RNN; it does not collapse into a plain feedforward network.

An RNN has a recurrent connection on the hidden state. This looping constraint ensures that sequential information in the input data is captured.
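
To make the difference concrete, here is a minimal sketch (assuming Keras, with an illustrative timesteps = 5 and the 12 features taken from the question's input_dim) that builds both networks side by side. The LSTM stack consumes a 3D tensor (samples, timesteps, features) and reuses its weights recurrently across the time axis, while the Dense stack consumes a flat 2D tensor with no notion of time:

from keras.models import Sequential
from keras.layers import LSTM, Dense

timesteps, features = 5, 12  # illustrative shapes, not taken from the question's data

# Recurrent model: the hidden state h_t loops from one timestep to the next.
rnn = Sequential()
rnn.add(LSTM(90, return_sequences=True, input_shape=(timesteps, features)))
rnn.add(LSTM(90))
rnn.add(Dense(1))

# Feedforward model: a single flat vector in, no time axis at all.
mlp = Sequential()
mlp.add(Dense(90, input_dim=features))
mlp.add(Dense(90))
mlp.add(Dense(1))

rnn.summary()  # expects 3D input: (batch, 5, 12), one prediction per 5-step sequence
mlp.summary()  # expects 2D input: (batch, 12), one prediction per independent vector

Even if every timestep only carries the current values $a_1(t),...,a_{k-1}(t)$, the LSTM's prediction still depends on earlier steps of the sequence through $h_t$ and $c_t$, so the two models are not equivalent.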


