Problem

Is there any problem if we use too many hidden layers in a neural network? Can anyone briefly describe what problems can occur if we have too many hidden layers?


Solution

The most important problem is the so-called "vanishing gradient" phenomenon. It is easy to verify (both theoretically and in practice) that it is impossible to efficiently train more than one hidden layer (assuming traditional backpropagation, without deep learning/neocognitron/convolutional-network techniques), because the computed gradients/derivatives get smoothed out more and more as they are propagated backward. The "responsibility" for the error is diluted with each additional layer.
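To make the effect visible, here is a minimal NumPy sketch (not from the original answer) of a toy stack of sigmoid layers with hypothetical sizes; backpropagating an error signal through it shows the gradient norm collapsing toward zero layer by layer:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical toy network: 20 stacked fully connected sigmoid layers,
# 50 units each, Gaussian weights scaled by 1/sqrt(width).
n_layers, width = 20, 50
weights = [rng.normal(0, 1, (width, width)) / np.sqrt(width) for _ in range(n_layers)]

# Forward pass, keeping each layer's activations for backpropagation.
a = rng.normal(size=width)
activations = [a]
for W in weights:
    a = sigmoid(W @ a)
    activations.append(a)

# Backward pass: push an arbitrary error signal through the layers and
# watch its norm shrink each time it crosses a sigmoid (sigmoid'(z) <= 0.25).
grad = np.ones(width)
for W, a in zip(reversed(weights), reversed(activations[1:])):
    grad = W.T @ (grad * a * (1 - a))  # sigmoid'(z) = a * (1 - a)
    print(f"gradient norm: {np.linalg.norm(grad):.3e}")
```

Running this prints norms that fall by roughly an order of magnitude every few layers, so the earliest layers receive essentially no learning signal, which is exactly the training difficulty the answer describes.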

Overfitting (as @Floris incorrectly stated) is not the main problem here, since the same issue would arise from the number of hidden units; in fact, it tends to occur more readily when you increase the number of units in a single hidden layer than when you increase the number of hidden layers, as the parameter count comparison below suggests.
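As a rough illustration of that point (a sketch with hypothetical layer sizes, not from the original answer), counting the parameters of a fully connected network shows that widening one hidden layer inflates the model's capacity much faster than adding a few modest hidden layers:

```python
def n_params(layer_sizes):
    """Weights plus biases for a fully connected net with the given layer widths."""
    return sum((fan_in + 1) * fan_out
               for fan_in, fan_out in zip(layer_sizes, layer_sizes[1:]))

# Same input (100) and output (10) sizes in both cases:
print(n_params([100, 1000, 10]))        # 1 hidden layer, 1000 units -> 111010 parameters
print(n_params([100, 50, 50, 50, 10]))  # 3 hidden layers, 150 units -> 10660 parameters
```

The single wide layer has roughly ten times as many parameters, and with them more room to overfit, than the deeper but narrower network.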

License: CC-BY-SA with attribution
Not affiliated with StackOverflow