Problem

I have searched the internet a lot but could not figure out why we use weights in each layer of a network trained with the backpropagation algorithm. I know that the weights are multiplied by the output of the previous layer to produce the input of the next layer, but I do not understand why we need these weights. Please help. Thanks, Ark


Solution

Without the weights there could be no learning. The weights are the values that are adjusted during the backpropagation learning process. A neural network is nothing more than a function, and the weights parametrize the behavior of that function.
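As a minimal sketch (my own illustration, not part of the original answer): each layer multiplies the previous layer's output by its weight matrix, so the same input produces different outputs under different weight settings. That is what "the weights parametrize the function" means in code.

```python
import numpy as np

# A tiny 2-layer network: the weight matrices W1 and W2 are the
# parameters of the function the network computes.
def forward(x, W1, W2):
    h = np.tanh(W1 @ x)   # hidden layer: weights multiply the input
    return W2 @ h         # output layer: weights multiply the hidden activations

rng = np.random.default_rng(0)
x = np.array([1.0, 2.0])

# Two different weight settings -> two different functions of the same input.
W1a, W2a = rng.normal(size=(3, 2)), rng.normal(size=(1, 3))
W1b, W2b = rng.normal(size=(3, 2)), rng.normal(size=(1, 3))
print(forward(x, W1a, W2a))  # one output
print(forward(x, W1b, W2b))  # a different output for the same x
```

Training with backpropagation means searching this weight space for values that make the function fit the data.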

To understand this better, first look at a single-layer perceptron such as the ADALINE.
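Here is a hedged sketch of an ADALINE (again my own illustration, using assumed toy data): a single linear unit y = w·x + b trained with the delta (LMS) rule. Learning here is nothing but repeatedly adjusting the weights so the output moves toward the target.

```python
import numpy as np

def train_adaline(X, t, lr=0.05, epochs=100):
    """Delta-rule (LMS) training of a single linear unit y = w.x + b."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for x_i, t_i in zip(X, t):
            y = w @ x_i + b        # linear output of the unit
            error = t_i - y        # how far the output is from the target
            w += lr * error * x_i  # adjust the weights toward the target
            b += lr * error        # adjust the bias the same way
    return w, b

# Hypothetical toy usage: learn to reproduce t = 2*x1 - 3*x2 + 1 from examples.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
t = 2 * X[:, 0] - 3 * X[:, 1] + 1
w, b = train_adaline(X, t)
print(w, b)  # should approach [2, -3] and 1
```

If the weight updates were removed, the unit would compute the same fixed function forever; the weight adjustments are the learning.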
