Question

When I run something like

import numpy
from sklearn import linear_model

A = ...  # something
b = ...  # something
clf = linear_model.Lasso(alpha=0.015, fit_intercept=False, tol=1e-14,
                         max_iter=10000000000000, positive=True)
clf.fit(A, b)

I get the error:

usr/local/lib/python2.7/dist-packages/scikit_learn-0.14.1-py2.7-linux-x86_64.egg/
sklearn/linear_model/coordinate_descent.py:418: UserWarning: Objective did not
converge. You might want to increase the number of iterations
' to increase the number of iterations')

The interesting thing is that A is never rank deficient. (I think)


Solution

Try increasing tol.

From the documentation:

tol : float, optional

The tolerance for the optimization: if the updates are smaller than tol, the optimization code checks the dual gap for optimality and continues until it is smaller than tol.

The default for tol is 0.0001 in my version of scikit-learn. I assume your tolerance is so small that the optimization never reaches a value below it.
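A minimal sketch of the fix on synthetic placeholder data (the A and b here are random stand-ins for the original, unshown matrices): with the default tol the solver typically converges quietly, whereas the extreme tol=1e-14 from the question often cannot be met.

```python
import warnings

import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.RandomState(0)
A = rng.rand(20, 50)  # placeholder data: more features than samples
b = rng.rand(20)

# With the default tol (1e-4) the solver converges well before
# exhausting max_iter, so no convergence warning is emitted.
clf = Lasso(alpha=0.015, fit_intercept=False, tol=1e-4,
            max_iter=100000, positive=True)
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    clf.fit(A, b)
convergence_msgs = [w for w in caught
                    if "converge" in str(w.message).lower()]
```

Dropping tol back to 1e-14 in the same snippet is what brings the UserWarning back: the dual gap simply never shrinks that far.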

OTHER TIPS

The only thing that sometimes helped me get rid of the warning was increasing the number of iterations significantly (at the cost of a significant increase in training time).

Increasing the tolerance always led to the same warnings, just with larger values in them; it never made the warnings go away. I am not sure why.

As an important analytical side note, I interpret getting this warning when first fitting a Lasso regression as a bad sign, regardless of what happens next.
For me it almost always occurred when the model was over-fitting: it performed well on the full training set itself, but poorly during cross-validation and testing.
Regardless of whether I had suppressed the warning (there is a way) or had gotten rid of it "naturally" by increasing the number of iterations, I almost always had to go back and simplify the set of features for Lasso to be effective (and in some cases to abandon Lasso altogether in favor of a different model).
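One hedged way to act on that advice is to let cross-validation choose the penalty instead of hand-tuning alpha. A sketch on synthetic data (the data-generating setup here is invented for illustration), using scikit-learn's LassoCV:

```python
import numpy as np
from sklearn.linear_model import LassoCV

# Synthetic data: only the first 5 of 30 features carry signal.
rng = np.random.RandomState(0)
X = rng.randn(100, 30)
true_coef = np.zeros(30)
true_coef[:5] = [2.0, -1.5, 1.0, 0.5, -0.5]
y = X.dot(true_coef) + 0.1 * rng.randn(100)

# LassoCV searches an alpha grid with cross-validation, so the chosen
# penalty reflects held-out performance rather than training-set fit.
model = LassoCV(cv=5, max_iter=10000, random_state=0).fit(X, y)
n_selected = int(np.sum(model.coef_ != 0))
```

If the cross-validated model still keeps nearly every feature, that is a hint the feature set itself may need pruning, which matches the experience described above.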

I've had the same problem. Depending on the data, using the option normalize=True can also help the model converge.

Change some of the defaults in Lasso regression:

from sklearn.linear_model import Lasso
Lasso(normalize=True, tol=1e-2)
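Note that normalize=True was deprecated in scikit-learn 1.0 and later removed. On recent versions the commonly recommended replacement is scaling inside a pipeline; it is not numerically identical to the old normalize behaviour, but serves the same purpose of putting features on a comparable scale before the L1 penalty is applied. A sketch:

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# StandardScaler runs inside the pipeline, so during cross-validation
# the scaling is fit on each training fold only.
model = make_pipeline(StandardScaler(), Lasso(alpha=0.1, tol=1e-2))

# Tiny synthetic example to show the pipeline end to end.
rng = np.random.RandomState(0)
X = rng.randn(50, 4)
y = X[:, 0] + 0.1 * rng.randn(50)
model.fit(X, y)
```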

And another solution is turning off warnings :)

import warnings
warnings.filterwarnings('ignore')
Licensed under: CC-BY-SA with attribution