Question

I have a toy dataset with one independent variable x and one dependent variable y = x. Linear regression finds the right intercept, 0, and coefficient, 1. But the elastic net always gives a non-zero intercept, and usually a coefficient of 0. I know it's regularising, so it wants smaller coefficients, but is this an expected result? Python/scikit-learn code below.

#!/usr/bin/env python
import numpy as np
from sklearn.linear_model import ElasticNet, LinearRegression

# 10 samples of a single feature; the target is identical to the feature.
X = np.zeros((10, 1))
X[:, 0] = np.random.random(10)
y = X[:, 0]

lr = LinearRegression().fit(X, y)
print("LR:        ", lr.intercept_, lr.coef_)
for l1_ratio in [0.01, 0.05, 0.25, 0.5, 0.75, 0.95, 0.99]:
    enet = ElasticNet(l1_ratio=l1_ratio).fit(X, y)
    print("ENet", l1_ratio, ":", enet.intercept_, enet.coef_)

EDIT: I originally asked about regressing y = x^2, but this is a simpler and more surprising result.


Solution

Yes, this is the expected result from a regularized model. The default is alpha=1.0, which is a very strong penalty for a dataset this small. If you set alpha (the regularizer weight) to a lower value such as 0.01, you'll see that the coefficients are allowed to grow, which in turn causes the intercept to shrink:

In [12]: for l1_ratio in [0.01, 0.05, 0.25, 0.5, 0.75, 0.95, 0.99]:
   ....:     enet = ElasticNet(l1_ratio=l1_ratio, alpha=0.01).fit(X, y)
   ....:     print("ENet", l1_ratio, ":", enet.intercept_, enet.coef_)
   ....:
ENet 0.01 : 0.061675959472 [ 0.86445434]
ENet 0.05 : 0.0620121787424 [ 0.86371543]
ENet 0.25 : 0.0637498016326 [ 0.85989664]
ENet 0.5 : 0.066063739564 [ 0.85481129]
ENet 0.75 : 0.0685519831348 [ 0.84934286]
ENet 0.95 : 0.0706817244743 [ 0.84466231]
ENet 0.99 : 0.0711236518251 [ 0.84369108]
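
To see why the intercept behaves this way, it helps to look at the objective ElasticNet minimizes. Per the scikit-learn documentation (with the unpenalized intercept b written out explicitly, and rho = l1_ratio):

\min_{w,\,b}\; \frac{1}{2n}\,\lVert y - Xw - b \rVert_2^2 \;+\; \alpha\rho\,\lVert w \rVert_1 \;+\; \frac{\alpha(1-\rho)}{2}\,\lVert w \rVert_2^2

Only w appears in the penalty terms; b is chosen purely to fit the data.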

The intercept is never regularized. scikit-learn fits it separately from the penalized coefficients (effectively by centering the data), so when a strong penalty shrinks the coefficient toward zero, the unpenalized intercept absorbs the slack and settles near the mean of y.
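
As a quick sanity check, here is a minimal sketch (reusing the question's data setup; alpha=10.0 is an arbitrarily large value picked for illustration) showing the extreme case: a strong enough penalty drives the coefficient exactly to zero, and the unpenalized intercept lands at the mean of y:

import numpy as np
from sklearn.linear_model import ElasticNet

# Same toy setup as the question: one feature, and y identical to it.
X = np.zeros((10, 1))
X[:, 0] = np.random.random(10)
y = X[:, 0]

# A strong penalty soft-thresholds the single coefficient to exactly 0;
# the intercept is then y_mean - X_mean * 0 = y.mean().
enet = ElasticNet(alpha=10.0, l1_ratio=0.5).fit(X, y)
print("coef:     ", enet.coef_)        # ~[0.]
print("intercept:", enet.intercept_)   # equals y.mean() once coef_ is 0
print("y mean:   ", y.mean())

With coef_ at zero the model can only predict a constant, and the constant that minimizes the squared error is y.mean(), which is exactly where the intercept goes.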
