Question

I'm currently trying to learn about backpropagation, and it's going well, but there's one thing that keeps me scratching my head and doesn't really seem to be answered in any of the videos or articles I'm looking at. I understand now that, based on my loss, the weights of my network are updated. But what I don't understand is how this happens. Let's say I have this exercise network with the following weights:

w_1 = 1.2, w_2 = 0.4, w_3 = 1.0

Now I do some training, and let's say I have a loss of 0.8. When I use my loss to update my weights, what happens specifically to the weights? Is something being added, subtracted, or maybe multiplied?

Thanks a lot


Solution

In short, an update term is added to the previous value of each weight.

Here is the algorithm from Tom Mitchell's *Machine Learning*. In your case it would be w_1 = w_1 + Δw_1, w_2 = w_2 + Δw_2, w_3 = w_3 + Δw_3, where each Δw_i = -η ∂E/∂w_i, i.e. the learning rate η times the negative gradient of the loss E with respect to that weight.

[Image: gradient-descent weight-update rule from Tom Mitchell's *Machine Learning*]
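
To make the "what is added" part concrete, here is a minimal sketch of one gradient-descent update, assuming a toy model that is just a chain of the three scalar weights from the question, a single made-up training example, a squared-error loss, and an assumed learning rate of 0.1. None of these specifics come from the original post; they only illustrate that a per-weight delta is computed from the gradient and added to each weight.

```python
import numpy as np

w = np.array([1.2, 0.4, 1.0])   # w_1, w_2, w_3 from the question
eta = 0.1                        # learning rate (assumed)
x, y = 0.5, 1.0                  # one training example (assumed)

# Forward pass: the "network" is just out = w_3 * w_2 * w_1 * x
out = w[2] * w[1] * w[0] * x
loss = 0.5 * (out - y) ** 2      # squared-error loss

# Backward pass: dLoss/dw_i for each weight (chain rule written out by hand)
d_out = out - y                  # dLoss/d_out
grad = np.array([
    d_out * w[2] * w[1] * x,     # dLoss/dw_1
    d_out * w[2] * w[0] * x,     # dLoss/dw_2
    d_out * w[1] * w[0] * x,     # dLoss/dw_3
])

# The update: delta_i = -eta * gradient_i is ADDED to each weight
delta = -eta * grad
w = w + delta

print("loss:", loss)
print("deltas:", delta)
print("updated weights:", w)
```

So nothing is done with the loss value 0.8 directly; the loss is differentiated with respect to each weight, and that per-weight gradient (scaled by the learning rate and negated) is what gets added.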

Licensed under: CC-BY-SA with attribution