Question

I'm trying to get started with neural networks and implement boolean functions like AND/OR. Instead of using 0 and 1 as binary inputs, they use -1 and +1. Is there a reason why we cannot use (0, 1)? As an example: http://www.youtube.com/watch?v=Ih5Mr93E-2c


Solution 2

In most cases there's no difference. Just use a logistic activation function instead of tanh. In some special forms, e.g., the Ising model, it could nontrivially change the parameter space, though.
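To see why the choice mostly doesn't matter: tanh is just an affine transform of the logistic function, so a network using one can represent exactly what a network using the other can, up to rescaled weights and biases. A minimal sketch verifying the identity tanh(x) = 2·logistic(2x) − 1 (function names here are illustrative):

```python
import math

def logistic(x):
    # logistic sigmoid: output in (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def tanh_via_logistic(x):
    # tanh is an affine transform of the logistic function:
    # tanh(x) = 2 * logistic(2x) - 1, so its output lies in (-1, 1)
    return 2.0 * logistic(2.0 * x) - 1.0

# the identity holds for any real input
for x in (-2.0, -0.5, 0.0, 1.0, 3.0):
    assert abs(math.tanh(x) - tanh_via_logistic(x)) < 1e-12
```

This is why switching between {0,1} and {-1,1} targets amounts to choosing the matching output activation rather than changing what the network can learn.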

Other suggestions

If you really mean inputs, there is no restriction on using {-1,1}. You can just as easily use {0,1} or any other pair of real numbers (e.g., {6,42}) to define your True/False input values.

What may be confusing you in the charts is that {-1,1} are used as the outputs of the neurons. The reason for that, as @Memming stated, is because of the activation function used for the neuron. If tanh is used for the activation function, the neuron's output will be in the range (-1,1), whereas if you use a logistic function, its output will be in the range (0,1). Either will work for a multi-layer perceptron - you just have to define your target value (expected output) accordingly.
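To illustrate that either encoding works, here is a minimal sketch of a single perceptron computing AND with both input conventions. The weights and biases are illustrative hand-picked values, not taken from the video:

```python
def step(x):
    # threshold activation: fires when the weighted sum is non-negative
    return 1 if x >= 0 else 0

# AND over {0,1} inputs: weights (1, 1), bias -1.5
# only the input (1, 1) pushes the sum above the threshold
def and_01(a, b):
    return step(1 * a + 1 * b - 1.5)

# AND over {-1,+1} inputs, with a {-1,+1} output to match tanh-style targets
def and_pm1(a, b):
    s = 1 * a + 1 * b - 1.5
    return 1 if s >= 0 else -1

# truth tables for both encodings
assert [and_01(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]] == [0, 0, 0, 1]
assert [and_pm1(a, b) for a, b in
        [(-1, -1), (-1, 1), (1, -1), (1, 1)]] == [-1, -1, -1, 1]
```

The same linear structure handles both conventions; only the threshold/bias and the target values change, which is the point made above about defining the expected output to match the activation's range.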

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow