Question

If you're assigning random values to the weights in a neural network before back-propagation, is there a maximum or minimum value for each weight (for example, 0 < w < 1000), or can weights take on any value? Could a network potentially have weights of 0.1, 0.0009, and 100000?

No correct solution
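
As a point of reference, weights in a standard network are ordinary real (floating-point) numbers, so nothing in back-propagation imposes a hard bound on them. The sketch below illustrates this, with made-up layer sizes, and uses Glorot/Xavier uniform as one common example of an initializer that deliberately starts weights in a small range:

```python
import numpy as np

# Hypothetical layer sizes, chosen only for illustration.
fan_in, fan_out = 784, 128

# Weights are plain floating-point numbers: nothing in the math forbids
# values like 0.0009 or 100000, so this is a valid set of weights.
unusual_weights = np.array([0.1, 0.0009, 100000.0])

# In practice, initializers draw from a small symmetric range so that
# activations and gradients stay well-scaled. Glorot/Xavier uniform uses
# a limit that shrinks as the layers get wider:
limit = np.sqrt(6.0 / (fan_in + fan_out))
W = np.random.uniform(-limit, limit, size=(fan_in, fan_out))

print(limit)              # roughly 0.08 for these layer sizes
print(W.min(), W.max())   # every initial weight lies inside [-limit, limit]
```

Very large initial weights such as 100000 tend to saturate activations and produce exploding gradients, which is why common schemes start near zero even though nothing caps what a weight can become during training.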
