Question

I have searched the internet a lot but could not figure out why we use weights in each layer of the backpropagation algorithm. I know that the weights are multiplied by the output of the previous layer to get the input of the next layer, but I do not understand why we need these weights. Please help. Thanks, Ark

Solution

Without the weights there could be no learning. The weights are the values that are adjusted during the backpropagation learning process. A neural network is nothing more than a function, and the weights parametrize the behavior of that function.
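To make this concrete, here is a minimal sketch (Python with NumPy; the layer size, activation, and numbers are illustrative assumptions, not anything from the question): the weights multiply the output of the previous layer, so changing the weights changes the function the network computes.

```python
import numpy as np

def layer(x, W, b):
    # W and b parametrize this layer: the same input x gives a
    # different output for different weight values.
    return np.tanh(W @ x + b)

x = np.array([1.0, 2.0])        # output of the previous layer
W1 = np.array([[0.5, -0.3]])    # one choice of weights
W2 = np.array([[2.0, 1.0]])     # another choice of weights
b = np.array([0.0])

print(layer(x, W1, b))  # different weights ...
print(layer(x, W2, b))  # ... give a different function of the same input
```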

To understand this better, first look at a single-layer perceptron such as the ADALINE.
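As an illustration, here is a minimal ADALINE-style training sketch (Python/NumPy, with made-up data and a hand-picked learning rate). The weights are the only quantities that change during training; that is exactly what the learning rule, and in deeper networks backpropagation, adjusts.

```python
import numpy as np

# Toy data: learn the (assumed, illustrative) target y = 2*x1 - 1*x2
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = 2.0 * X[:, 0] - 1.0 * X[:, 1]

w = np.zeros(2)    # the weights: the only part of the model that learns
lr = 0.05          # learning rate (hand-picked for this example)

for epoch in range(50):
    for xi, target in zip(X, y):
        output = w @ xi          # ADALINE uses a linear output
        error = target - output
        w += lr * error * xi     # delta rule: adjust the weights

print(w)  # approaches [2, -1]; with fixed weights nothing could be learned
```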

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow