Question

I have searched the internet a lot but could not figure out why we use weights in each layer of the backpropagation algorithm. I know that the weights are multiplied by the output of the previous layer to get the input of the next layer, but I do not understand why we need these weights. Please help. Thanks, Ark


Solution

Without the weights there could be no learning. The weights are the values that are adjusted during the backpropagation learning process. A neural network is nothing more than a function, and the weights parametrize the behavior of that function.

To understand this better, first look at a single-layer perceptron such as the ADALINE, sketched below.
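Here is a minimal sketch of that idea in Python, assuming a made-up toy dataset (the data, learning rate, and epoch count are illustrative choices, not part of the original answer). The output is just a weighted sum of the inputs, and "learning" means repeatedly adjusting the weights so that this function fits the targets:

```python
import numpy as np

# ADALINE-style sketch: the network is a function parametrized by `weights`
# and `bias`; training adjusts those parameters to reduce the error.
rng = np.random.default_rng(0)

# Hypothetical toy data: targets generated from y = 2*x1 - 3*x2 + 1
X = rng.normal(size=(100, 2))
y = 2 * X[:, 0] - 3 * X[:, 1] + 1

weights = np.zeros(2)   # the adjustable parameters the answer refers to
bias = 0.0
lr = 0.05               # learning rate (assumed value)

for epoch in range(50):
    output = X @ weights + bias      # weighted sum of the inputs
    error = y - output               # how far off the current function is
    # Delta rule: nudge each weight in the direction that reduces the error
    weights += lr * X.T @ error / len(X)
    bias += lr * error.mean()

print(weights, bias)   # converges toward [2, -3] and 1
```

With fixed weights the output could never change, so no amount of training examples would improve the function; it is only because the weights are free parameters that the error signal has something to adjust.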

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow