Problem

I'm currently trying to learn about backpropagation, and it's going well, but there's one thing that keeps me scratching my head and doesn't really seem to be answered in any of the videos or articles I'm looking at. I understand now that, based on my loss, the weights of my network are updated. But what I don't understand is how this happens. Let's say I have this exercise network with the following weights:

w_1 = 1.2, w_2 = 0.4, w_3 = 1.0

Now I do some training, and let's say I get a loss of 0.8. When I use my loss to update my weights, what happens specifically to them? Is something being added, subtracted, or maybe multiplied?

Thanks a lot


Solution

In short, an update term (a delta) is added to the previous value of each weight.

Here is the algorithm from Tom Mitchell's *Machine Learning*. In your case it would be w_1 = w_1 + Δw_1, w_2 = w_2 + Δw_2, w_3 = w_3 + Δw_3, where each Δw_i = -η ∂E/∂w_i, i.e. the negative of the learning rate η times the gradient of the loss E with respect to that weight. So the loss value itself is not added; its gradient with respect to each weight determines what gets added.

[Image: gradient-descent weight-update rule from Tom Mitchell's *Machine Learning*]
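To make this concrete, here is a minimal sketch of one such update for a single linear neuron with a squared-error loss. Only the three weight values come from your question; the input `x`, target `t`, and learning rate `eta` are made-up numbers for illustration.

```python
import numpy as np

# Weights from the question; input, target, and learning rate are assumed.
w = np.array([1.2, 0.4, 1.0])    # w_1, w_2, w_3
x = np.array([0.5, -1.0, 2.0])   # hypothetical input
t = 1.0                          # hypothetical target output
eta = 0.1                        # learning rate

# Forward pass and loss.
o = np.dot(w, x)                 # network output
loss = 0.5 * (t - o) ** 2        # squared-error loss

# Backpropagation: gradient of the loss w.r.t. each weight.
# For this linear neuron, dE/dw_i = -(t - o) * x_i.
grad = -(t - o) * x

# The update Δw_i = -eta * dE/dw_i is ADDED to each weight,
# which is the same as subtracting eta times the gradient.
delta = -eta * grad
w = w + delta

print(loss, w)
```

Note that each weight gets its own delta, since the gradient of the loss is generally different for each weight.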

License: CC-BY-SA with attribution. Not affiliated with datascience.stackexchange.