MATLAB's fminunc doesn't actually use gradient descent; it uses Newton-like methods (a BFGS-based quasi-Newton algorithm or a trust-region method, depending on the problem size), which generally converge significantly faster than gradient descent, no matter how you choose the step size.
You might look into this class of methods if you want faster convergence.
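For a concrete comparison outside MATLAB, here is a minimal sketch using SciPy's `minimize` with `method="BFGS"`, a quasi-Newton method analogous to fminunc's medium-scale algorithm. The Rosenbrock function and starting point are illustrative choices, not from the original question:

```python
import numpy as np
from scipy.optimize import minimize

# Rosenbrock function: a classic test problem with its minimum at (1, 1).
# Plain gradient descent converges very slowly in its curved valley.
def rosen(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

# BFGS builds an approximation to the inverse Hessian from gradient
# differences, so each step uses curvature information, not just the slope.
res = minimize(rosen, x0=np.array([-1.2, 1.0]), method="BFGS")
print(res.x)
```

BFGS typically reaches the minimum here in a few dozen iterations, whereas fixed-step gradient descent can need thousands; the curvature model is what removes the sensitivity to step-size tuning.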