Question

Whenever I train a neural network, I only run it for a few epochs (1 to 3). This is because I am training on a slow CPU, and running many epochs would take a long time.

However, whenever my neural network performs poorly, rather than running it for more epochs, I try to optimize the hyperparameters instead. This approach has generally been successful because my neural networks are pretty simple.

But is training a neural network in this manner bad practice? Are there disadvantages to jumping straight to hyperparameter optimization rather than running the network for more epochs?
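For context, one cheap way to judge whether more epochs would even help is to look at the per-epoch loss curve: if the loss is still dropping after the last epoch, training longer is likely worthwhile; if it has plateaued, hyperparameter tuning is the better lever. Below is a minimal sketch of that heuristic, using a tiny logistic-regression model on synthetic data as a stand-in for any small network (the `still_improving` helper, its `window` and `tol` thresholds, and the synthetic data are all illustrative assumptions, not a standard recipe).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary-classification data (assumption: stands in for any small model).
X = rng.normal(size=(200, 5))
true_w = rng.normal(size=5)
y = (X @ true_w > 0).astype(float)

def train(epochs, lr=0.1):
    """Logistic regression by full-batch gradient descent; returns per-epoch losses."""
    w = np.zeros(5)
    losses = []
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))          # predicted probabilities
        losses.append(-np.mean(y * np.log(p + 1e-12)
                               + (1 - y) * np.log(1 - p + 1e-12)))
        w -= lr * X.T @ (p - y) / len(y)            # gradient step
    return losses

def still_improving(losses, window=3, tol=0.01):
    """True if loss fell by more than `tol` (relative) over the last `window` epochs."""
    if len(losses) <= window:
        return True
    return (losses[-window - 1] - losses[-1]) / losses[-window - 1] > tol

losses = train(epochs=3)
print("more epochs likely to help" if still_improving(losses)
      else "consider tuning hyperparameters instead")
```

With only 1–3 epochs the curve almost always says "still improving", which is exactly why stopping there and tuning hyperparameters can mask a model that simply has not converged yet.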


Licensed under: CC-BY-SA with attribution
Not affiliated with datascience.stackexchange