Question

I am studying perceptron learning and have a question which leaves me a bit confused. As I am self-teaching, I have looked through a variety of papers, tutorials, PowerPoint slides, etc., and at times they seem to use different algorithms to adjust the weights of the network.

For example, some include a learning rate, some update each weight using its own individual weight/input product, while others use only the sum of all weight/input products.

So, am I right in assuming that there are multiple algorithms which all lead to the same final weight matrix/vector?
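For concreteness, this is the per-sample rule I see most often, sketched in Python; the step activation, the variable names, and the default learning rate are my own guesses at a common formulation:

```python
# The single-neuron update I keep running into: a step activation, and
# every weight nudged by lr * (target - output) * its own input.
# Some sources fix lr = 1; the variable names here are my own.
def update(w, x, target, lr=0.1):
    y = 1 if sum(wi * xi for wi, xi in zip(w, x)) >= 0 else 0
    return [wi + lr * (target - y) * xi for wi, xi in zip(w, x)]
```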


Solution

Nope, not the same.

You are right that there are many algorithms, but they may lead to different final weights. It's like sorting algorithms: there are many of them, each accomplishes the same task, but some are stable and some are not, some use additional memory, and some sort in place. Unlike sorting, though, where every correct algorithm produces the same sorted output, different perceptron update rules can settle on different weight vectors that classify the training data equally well.
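To make that concrete, here is a small sketch (a toy setup of my own, not from any particular paper): an online perceptron that updates after every sample and a batch variant that sums the corrections over a whole epoch both learn logical AND, yet they finish with different weight vectors.

```python
# Online perceptron: update after every sample.
# Batch variant: accumulate corrections over an epoch, apply them once.
# Both learn logical AND, but end at different weights.

DATA = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # logical AND

def predict(w, x):
    x = list(x) + [1.0]  # constant input standing in for the bias
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= 0 else 0

def train_online(data, lr=1.0, epochs=20):
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        for x, target in data:
            err = target - predict(w, x)
            w = [wi + lr * err * xi for wi, xi in zip(w, list(x) + [1.0])]
    return w

def train_batch(data, lr=1.0, epochs=20):
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        delta = [0.0, 0.0, 0.0]  # corrections accumulated over the epoch
        for x, target in data:
            err = target - predict(w, x)
            delta = [di + lr * err * xi
                     for di, xi in zip(delta, list(x) + [1.0])]
        w = [wi + di for wi, di in zip(w, delta)]
    return w

print(train_online(DATA))  # [2.0, 1.0, -3.0]
print(train_batch(DATA))   # [1.0, 1.0, -2.0]
```

Both vectors separate AND correctly; which one you end up with depends on the variant, the learning rate, the initial weights, and even the order in which the samples are presented.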

Other tips

I wrote an article, An Intuitive Example of Artificial Neural Network (Perceptron) Detecting Vehicles and Pedestrians from the Camera of a Self-driven Car, in which I tried to explain the idea with the simplest possible examples.

You can check it out; I hope it helps you understand the weight updating in a perceptron. Here is the link:

https://www.spicelogic.com/Journal/Perceptron-Artificial-Neural-Networks-10

I also explained the learning rate with examples.

(Image: Example of Detecting Vehicle and Pedestrian)
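As a small taste of the learning-rate part, here is a toy run of my own (the data and numbers are made up, not taken from the article): starting from zero weights, changing the learning rate only rescales the final perceptron weights, while a nonzero starting point changes the outcome for real.

```python
# A toy illustration of the learning rate's role. From zero initial
# weights, scaling lr only rescales the learned weights (in exact
# arithmetic), because the sign of w.x is scale-invariant. A nonzero
# starting point gives a genuinely different trajectory.

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

def train(data, lr, w, epochs=50):
    w = list(w)
    for _ in range(epochs):
        for x, target in data:
            xb = list(x) + [1.0]  # constant input standing in for the bias
            y = 1 if sum(wi * xi for wi, xi in zip(w, xb)) >= 0 else 0
            w = [wi + lr * (target - y) * xi for wi, xi in zip(w, xb)]
    return w

print(train(AND, lr=1.0, w=[0, 0, 0]))   # [2.0, 1.0, -3.0]
print(train(AND, lr=0.1, w=[0, 0, 0]))   # roughly the same vector times 0.1
print(train(AND, lr=0.1, w=[1, -1, 0]))  # a different separating vector
```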

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow