Question

I am trying to train a neural network to recognize handwritten letters from A to J. I have a training set of 200,000 examples, each a list of 784 pixel values. My network has an input layer of size 784, a hidden layer of size 50, and an output layer of size 10.

I am using the fmin_cg minimization function from Python's SciPy library. The problem I am facing is that each iteration takes a long time:

  • The first iteration took about 7-10 minutes.
  • The second iteration took 20 minutes.
  • The third is still running.

This might be due to my outdated computer, which has only 2 GB of memory and a slow processor. However, I have previously trained a neural net on a training set of 5,000 examples, with an input layer of size 400, a hidden layer of size 25, and an output layer of size 10. That network recognized handwritten digits and was an exercise from Andrew Ng's machine learning course on Coursera.

So yes, I know the current network should take longer to train than the previous one, since the training set, input layer, and hidden layer are all much larger. Still, I think it's taking far too long. Why is it so slow?

Is this normal for a neural network of this size? Or should I use another, faster optimization algorithm? Is there a way to measure the time complexity of training a neural network?


Solution

Neural networks are best trained with stochastic gradient descent on minibatches, not with conjugate gradient or other full-batch optimization methods. fmin_cg evaluates the cost and gradient over the entire training set on every step, which is why each iteration is so slow at 200,000 examples; minibatch SGD updates the weights after seeing only a small batch at a time.
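To make the contrast concrete, here is a minimal sketch of minibatch SGD for a 784-50-10 network like the one in the question (sigmoid hidden layer, softmax output). The random arrays stand in for the real training data; in practice you would loop over shuffled minibatches of the 200,000 examples each epoch.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 784, 50, 10  # sizes from the question

# Small random initial weights, zero biases
W1 = rng.normal(0, 0.01, (n_in, n_hid)); b1 = np.zeros(n_hid)
W2 = rng.normal(0, 0.01, (n_hid, n_out)); b2 = np.zeros(n_out)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def sgd_step(X, y, lr=0.1):
    """One gradient step on a minibatch X (batch, 784) with integer labels y."""
    global W1, b1, W2, b2
    h = sigmoid(X @ W1 + b1)           # hidden activations
    p = softmax(h @ W2 + b2)           # class probabilities
    # Gradient of mean cross-entropy at the softmax output
    d_out = p.copy()
    d_out[np.arange(len(y)), y] -= 1
    d_out /= len(y)
    dW2 = h.T @ d_out
    db2 = d_out.sum(axis=0)
    d_hid = (d_out @ W2.T) * h * (1 - h)  # backprop through the sigmoid
    dW1 = X.T @ d_hid
    db1 = d_hid.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
    return -np.log(p[np.arange(len(y)), y]).mean()  # minibatch loss

# Fake minibatch standing in for real data
X = rng.random((64, n_in))
y = rng.integers(0, n_out, 64)
losses = [sgd_step(X, y) for _ in range(50)]
print(losses[0], losses[-1])  # loss should decrease across the steps
```

Each update here costs one forward and backward pass over 64 examples, instead of 200,000 as with fmin_cg, so the weights improve thousands of times per epoch rather than once.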

I recommend using a framework designed for this, like Keras or TensorFlow.

Separately: I recommend using a convolutional network for this particular task, not a fully connected network.

OTHER TIPS

The first paragraph in this tutorial explains it pretty well.

Also, the time complexity will not grow linearly as you increase the size of your network. The per-iteration cost of a fully connected net scales with the number of weights, which is the product of adjacent layer sizes, so scaling every layer up together grows the cost roughly quadratically; and full-batch methods like fmin_cg additionally multiply that cost by the number of training examples.
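A quick back-of-the-envelope comparison of the two networks in the question illustrates the scaling. This treats the cost of one full-batch gradient evaluation as (number of parameters) x (number of training examples), which is a rough but serviceable proxy:

```python
def n_params(layers):
    """Weights + biases of a fully connected net with the given layer sizes."""
    return sum(a * b + b for a, b in zip(layers, layers[1:]))

old = n_params([400, 25, 10])   # Coursera exercise net: 10285 parameters
new = n_params([784, 50, 10])   # net in the question:   39760 parameters

# Cost ratio per full-batch gradient: params x training-set size
ratio = (new * 200_000) / (old * 5_000)
print(old, new, round(ratio))   # the new setup is roughly 150x more work
```

So even before accounting for the slower hardware, each fmin_cg iteration here does on the order of 150 times the work of one iteration in the Coursera exercise.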

Licensed under: CC-BY-SA with attribution
Not affiliated with datascience.stackexchange