Question

Concerning the structure of an LSTM network

If I wanted to create an LSTM network for solving time-series prediction problems, how should I structure the hidden layers of the neural network?

  • Would an LSTM memory block represent a hidden layer, with all the nodes in that layer represented by cells?
  • Or should each hidden layer consist of numerous LSTM memory blocks, so that a collection of such blocks forms a layer?



Graphical representation:


Either in this manner:

proposed solution 1

Or like this ?

proposed solution 2


Solution

After talking to some of the professors at my university, I finally got this sorted out.

You should view an LSTM block as a single neuron in your network.



Thus this network would be regarded as a neural network with a single hidden layer containing two neurons:
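To make the "one block = one neuron" view concrete, here is a minimal sketch in plain Python (not from the original answer; all weight values and the scalar-input simplification are illustrative assumptions). Each `LSTMCell` is one memory block, and a hidden layer with two neurons is just two such cells run side by side over the same input sequence:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class LSTMCell:
    """One LSTM memory block, viewed as a single 'neuron' in the hidden layer.

    Illustrative toy implementation: scalar input, scalar hidden state,
    fixed weights (no training)."""
    def __init__(self, w):
        self.w = w  # dict of per-gate input weights, recurrent weights, biases

    def step(self, x, h_prev, c_prev):
        g = self.w
        i = sigmoid(g["wi"] * x + g["ui"] * h_prev + g["bi"])    # input gate
        f = sigmoid(g["wf"] * x + g["uf"] * h_prev + g["bf"])    # forget gate
        o = sigmoid(g["wo"] * x + g["uo"] * h_prev + g["bo"])    # output gate
        z = math.tanh(g["wz"] * x + g["uz"] * h_prev + g["bz"])  # candidate
        c = f * c_prev + i * z          # new cell state
        h = o * math.tanh(c)            # new hidden activation
        return h, c

# A "hidden layer with two neurons" is simply two blocks side by side.
weights = {k: 0.5 for k in
           ["wi", "ui", "bi", "wf", "uf", "bf", "wo", "uo", "bo", "wz", "uz", "bz"]}
layer = [LSTMCell(weights), LSTMCell(weights)]

h = [0.0, 0.0]  # one hidden activation per block/neuron
c = [0.0, 0.0]  # one cell state per block/neuron
for x in [1.0, 2.0, 3.0]:  # toy time series
    h, c = zip(*(cell.step(x, hp, cp) for cell, hp, cp in zip(layer, h, c)))
print(h)  # the layer's output: one value per neuron
```

In a framework such as Keras or PyTorch, this corresponds to setting the layer's unit count (e.g. `units=2` or `hidden_size=2`): the framework vectorizes what the loop above does per block.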

OTHER TIPS

As far as I know, you'd need multiple LSTMs to form a layer (see this paper for example), but you should probably ask this on https://stats.stackexchange.com/

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow