I have searched the internet a lot but could not figure out why we use weights in each layer of the backpropagation algorithm. I know that the weights are multiplied by the output of the previous layer to get the input of the next layer, but I do not understand why we need these weights. Please help. Thanks, Ark


Solution

Without the weights there could be no learning. The weights are the values that are adjusted during the backpropagation learning process. A neural network is nothing more than a function, and the weights parametrize the behavior of that function.
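A minimal sketch of that idea (my own illustration, not code from the answer): the same input produces different outputs for different weight values, so the weights are exactly what determines which function the layer computes.

```python
import numpy as np

def layer(x, W, b):
    # Each output unit is a weighted sum of the inputs plus a bias,
    # passed through a nonlinearity.
    return np.tanh(W @ x + b)

x = np.array([0.5, -1.0])

W1, b1 = np.array([[0.1, 0.4], [-0.3, 0.2]]), np.zeros(2)
W2, b2 = np.array([[1.0, -0.5], [0.7, 0.9]]), np.zeros(2)

print(layer(x, W1, b1))  # one function of x ...
print(layer(x, W2, b2))  # ... a different function of the same x
```

Learning means searching for the weight values for which this function produces the outputs you want.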

To understand this better, first look at a single-layer perceptron such as the ADALINE; a small sketch follows below.
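Here is a rough ADALINE-style example (assumed details such as the learning rate and toy data are mine): a single layer whose weights are adjusted with the delta (LMS) rule, which is the same idea backpropagation generalizes to many layers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: the target is a fixed linear function of the inputs plus a little noise.
X = rng.normal(size=(100, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=100)

w = np.zeros(3)   # the weights: the only thing that changes during learning
lr = 0.05         # learning rate (assumed value)

for epoch in range(50):
    for x_i, y_i in zip(X, y):
        out = w @ x_i          # weighted sum = the unit's output
        error = y_i - out      # how wrong the current weights are on this example
        w += lr * error * x_i  # delta rule: nudge the weights to reduce the error

print(w)  # ends up close to true_w
```

Nothing else in the network changes during training; only the weights move, which is why they are what makes learning possible.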
