Problem

Concerning the structure of an LSTM network

If I wanted to create an LSTM network for time-series prediction, how should I structure the hidden layers of the neural network?

  • Would one LSTM memory block represent a hidden layer, with all the nodes in that layer represented by its cells?
  • Or should each hidden layer consist of numerous LSTM memory blocks, with a collection of such blocks forming the layer?



Graphical representation:


Either in this manner:

proposed solution 1

Or like this?

proposed solution 2

Was it helpful?

Solution

After talking to some of the professors at my university, I finally got this sorted out.

You should view an LSTM block as a single neuron in your network.



Thus this network would be regarded as a neural network with a single hidden layer containing two neurons:
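To make the "one LSTM block = one neuron" view concrete, here is a minimal NumPy sketch (my own illustration, not code from the answer above) of one time step of an LSTM layer; the `hidden` dimension is exactly the number of such "neurons" in the layer, two in this example:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One time step of an LSTM layer.

    Each of the `hidden` units corresponds to one LSTM block,
    i.e. one "neuron" of the hidden layer. The weights for all
    four gates are stacked row-wise, so one matrix product
    computes every gate for every unit at once.
    """
    z = W @ x + U @ h_prev + b   # shape (4 * hidden,)
    i, f, o, g = np.split(z, 4)  # input, forget, output gates + candidate
    c = sigmoid(f) * c_prev + sigmoid(i) * np.tanh(g)  # new cell state
    h = sigmoid(o) * np.tanh(c)                        # new hidden state
    return h, c

# A hidden layer with two "neurons" (hidden = 2), input of size 3.
rng = np.random.default_rng(0)
inputs, hidden = 3, 2
W = rng.normal(size=(4 * hidden, inputs))
U = rng.normal(size=(4 * hidden, hidden))
b = np.zeros(4 * hidden)

h, c = lstm_step(rng.normal(size=inputs),
                 np.zeros(hidden), np.zeros(hidden), W, U, b)
print(h.shape, c.shape)  # one value per "neuron": (2,) (2,)
```

Changing `hidden` changes the width of the layer, just as changing the neuron count would in an ordinary feed-forward hidden layer.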

Other tips

As far as I know, you'd need multiple LSTMs to form a layer (see this paper, for example), but you should probably ask this on https://stats.stackexchange.com/
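The "multiple LSTMs per layer, layers stacked on top of each other" arrangement can be sketched in plain NumPy; the layer sizes and weight shapes below are illustrative assumptions, not values from either answer:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    # Standard LSTM update; gate weights are stacked row-wise in W, U, b.
    i, f, o, g = np.split(W @ x + U @ h + b, 4)
    c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
    h = sigmoid(o) * np.tanh(c)
    return h, c

def make_layer(rng, inputs, hidden):
    # Random weights for one layer of `hidden` LSTM units.
    return (rng.normal(size=(4 * hidden, inputs)),
            rng.normal(size=(4 * hidden, hidden)),
            np.zeros(4 * hidden))

rng = np.random.default_rng(1)
sizes = [3, 4, 2]  # input size, then hidden sizes of two stacked layers
layers = [make_layer(rng, sizes[k], sizes[k + 1])
          for k in range(len(sizes) - 1)]

# Run a short sequence through the stack: the hidden state of layer k
# is the input to layer k + 1 at the same time step.
states = [(np.zeros(n), np.zeros(n)) for n in sizes[1:]]
for t in range(5):
    x = rng.normal(size=sizes[0])
    for k, (W, U, b) in enumerate(layers):
        h, c = lstm_step(x, *states[k], W, U, b)
        states[k] = (h, c)
        x = h  # feed upward to the next layer
print([s[0].shape for s in states])  # [(4,), (2,)]
```

Each inner list entry is one layer's hidden state, with one value per LSTM unit in that layer, which matches the second proposed structure in the question.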

License: CC-BY-SA with attribution
Not affiliated with StackOverflow