Question

I am new to deep learning. I am training an SSD model on a set of small objects, using the Adam optimizer and a large input size (800x800), but I only see an improvement of about 0.010 every 20 or so epochs (roughly 350 steps each).

What can I do or look for to speed up convergence on this model?
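One common lever for slow convergence is the learning-rate schedule: a short linear warmup followed by cosine decay often stabilizes Adam early on and allows a higher peak learning rate. The sketch below is illustrative only, not taken from the question's setup; the function name `lr_at_step` and all schedule parameters (base rate, warmup length, total steps) are assumed values you would tune for your own run.

```python
import math

def lr_at_step(step, base_lr=1e-3, warmup_steps=500,
               total_steps=35000, min_lr=1e-5):
    """Hypothetical schedule: linear warmup, then cosine decay to min_lr.

    All defaults are illustrative assumptions, not values from any
    specific SSD training recipe.
    """
    if step < warmup_steps:
        # Ramp linearly from ~0 up to base_lr over the warmup phase.
        return base_lr * (step + 1) / warmup_steps
    # Cosine-decay from base_lr down to min_lr over the remaining steps.
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return min_lr + 0.5 * (base_lr - min_lr) * (1 + math.cos(math.pi * progress))
```

In a training loop you would call this each step and assign the result to the optimizer's learning rate (e.g., via `param_group["lr"]` in PyTorch); frameworks also ship built-in schedulers that implement the same idea.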


Licensed under: CC-BY-SA with attribution