Without weights there could be no learning. Weights are the values that are adjusted during the backpropagation learning process. A neural network is nothing more than a function, and the weights parametrize the behavior of that function.
To better understand this, first look at a single-layer perceptron such as the ADALINE.
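A minimal sketch of this idea, assuming NumPy and hypothetical function names: an ADALINE is a single linear neuron whose weights (and bias) are the only quantities that change during training, here via the delta rule rather than full backpropagation.

```python
import numpy as np

def train_adaline(X, y, lr=0.1, epochs=100):
    """Train a single linear neuron with the delta (LMS) rule."""
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.01, size=X.shape[1])  # weights: the learned parameters
    b = 0.0                                      # bias term
    for _ in range(epochs):
        for xi, target in zip(X, y):
            output = xi @ w + b          # linear activation
            error = target - output
            w += lr * error * xi         # delta rule: adjust weights by the error
            b += lr * error
    return w, b

def predict(X, w, b):
    # Threshold the linear output to obtain a class label
    return np.where(X @ w + b >= 0.0, 1, -1)

# Learn the logical AND function (linearly separable)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1], dtype=float)
w, b = train_adaline(X, y)
print(predict(X, w, b))  # → [-1 -1 -1  1]
```

The function computed by the neuron (a weighted sum followed by a threshold) never changes; only the values of `w` and `b` do, which is exactly what "the weights parametrize the function" means.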