I have been learning about deep neural networks and how increasing the number of hidden layers gives better results. The problem I ran into is that we usually get rid of loops in calculations by using matrices. Is there any way to remove the for loop that loops through the layers in forward and backward propagation?

Solution

I assume that what you actually want is to compute the layers in parallel (at the same time). In general that is not possible: in forward propagation you need the output of the previous layer(s) to compute a given layer, and for backpropagation the dependency is simply reversed.
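To make the dependency concrete, here is a minimal NumPy sketch of a forward pass (the function name, weight shapes, and tanh activation are illustrative assumptions, not from the question). Each iteration consumes the output of the previous one, so the iterations cannot run at the same time:

```python
import numpy as np

def forward(x, weights, biases):
    """Forward pass through a list of fully connected layers.

    The loop over layers is inherently sequential: each layer's
    input is the previous layer's output.
    """
    a = x
    activations = [a]
    for W, b in zip(weights, biases):  # cannot be parallelized across layers
        z = W @ a + b                  # depends on a from the previous layer
        a = np.tanh(z)                 # activation choice is illustrative
        activations.append(a)
    return activations
```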

But if you look at typical network architectures, they have at most a few hundred layers, and usually far fewer. Each layer, on the other hand, can have many thousands of weights.

Therefore you can save much more time by running the calculations within each layer in parallel, and that is not a problem because they are independent of each other.
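As an illustration (the layer size and batch size here are made up), a single matrix multiplication computes every neuron of a layer for every example in a batch at once, replacing two nested loops. This within-layer parallelism is where the real speedup lives:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((512, 784))   # one layer: 512 neurons, 784 inputs
b = rng.standard_normal((512, 1))
X = rng.standard_normal((784, 64))    # a batch of 64 input examples

# Loop version: one dot product per neuron per example (slow).
Z_loop = np.empty((512, 64))
for i in range(512):
    for j in range(64):
        Z_loop[i, j] = W[i] @ X[:, j] + b[i, 0]

# Vectorized version: one matrix multiplication covers every neuron
# and every example in the batch simultaneously.
Z_vec = W @ X + b

assert np.allclose(Z_loop, Z_vec)
```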

Licensed under: CC-BY-SA with attribution