Question

I am trying to train a neural network with the lasagne module in Python. I do not want a fully connected network as defined by lasagne.layers.DenseLayer. Instead, I would like to fix some of the weight parameters to zero. Does anyone know how to do this?

The closest solution I have found is something like:

# removing the "trainable" tag from layer1's weight matrix...
layer1.params[layer1.W].remove("trainable")
# ...excludes it from the parameters returned here for the update rule
params = lasagne.layers.get_all_params(network, trainable=True)

However, this fixes the entire set of weight parameters to their initial values. How can I fix only a subset of these weights?
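For context, a minimal self-contained version of what I am doing looks roughly like this (the layer sizes are just placeholders):

import lasagne

l_in = lasagne.layers.InputLayer(shape=(None, 100))
layer1 = lasagne.layers.DenseLayer(l_in, num_units=200)
network = lasagne.layers.DenseLayer(layer1, num_units=10)

# removing the tag takes layer1.W out of the trainable set entirely...
layer1.params[layer1.W].remove("trainable")
# ...so the whole matrix, not just a subset of its entries, is excluded
# from the parameters passed to the update rule
params = lasagne.layers.get_all_params(network, trainable=True)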


OTHER TIPS

I'm not sure what you are trying to achieve by fixing weights to zero. Have you looked at dropout layers?

l_in = lasagne.layers.InputLayer(shape=(None, 784))  # DenseLayer needs an incoming layer; the shape here is a placeholder
l_hid1 = lasagne.layers.DenseLayer(l_in, num_units=200)

l_hid1_drop = lasagne.layers.DropoutLayer(l_hid1, p=0.5)

During training this randomly sets 50% of the activations coming out of l_hid1 to zero on each forward pass (and rescales the rest), rather than dropping 50% of your data.
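If you go this route, keep in mind that dropout should only be active during training. A minimal sketch, assuming a small network like the one above (the sizes are placeholders):

import lasagne

l_in = lasagne.layers.InputLayer(shape=(None, 784))
l_hid1 = lasagne.layers.DenseLayer(l_in, num_units=200)
l_hid1_drop = lasagne.layers.DropoutLayer(l_hid1, p=0.5)
l_out = lasagne.layers.DenseLayer(l_hid1_drop, num_units=10,
                                  nonlinearity=lasagne.nonlinearities.softmax)

# dropout is applied when building the training expression...
train_out = lasagne.layers.get_output(l_out)
# ...and disabled with deterministic=True, e.g. for validation/test
test_out = lasagne.layers.get_output(l_out, deterministic=True)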
