Question

If you're assigning random values to the weights in a neural network before back-propagation, is there a certain maximum or minimum value for each weight (for example, 0 < w < 1000), or can weights take on any value? Could a network potentially have weights of 0.1, 0.0009, and 100000?
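For illustration, here is a minimal sketch of the kind of random weight initialization the question describes, using NumPy. The layer sizes and the Glorot/Xavier-style uniform bound are assumptions chosen for the example, not a recommendation tied to the question:

```python
import numpy as np

# Toy layer sizes, assumed purely for illustration.
n_in, n_out = 784, 128

# Weights are ordinary floating-point numbers, so the data type itself
# does not forbid values like 0.0009 or 100000. Common initializers,
# however, draw small values near zero; shown here is a Glorot/Xavier-style
# uniform bound scaled by the layer sizes (an assumption for this sketch).
limit = np.sqrt(6.0 / (n_in + n_out))
W = np.random.uniform(-limit, limit, size=(n_in, n_out))

print(W.min(), W.max())  # all sampled values lie inside [-limit, limit]
```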

No correct solution

License: CC-BY-SA with attribution
Not affiliated with datascience.stackexchange