Yes, this is the expected result from a regularized model. If you set alpha (the weight of the regularization term) to a lower value such as 0.01, you'll see that the coefficients are allowed to grow, which in turn causes the intercept to shrink:
In [12]: for l1_ratio in [0.01, 0.05, 0.25, 0.5, 0.75, 0.95, 0.99]:
   ....:     enet = ElasticNet(l1_ratio=l1_ratio, alpha=0.01).fit(X, y)
   ....:     print "ENet", l1_ratio, ":", enet.intercept_, enet.coef_
   ....:
ENet 0.01 : 0.061675959472 [ 0.86445434]
ENet 0.05 : 0.0620121787424 [ 0.86371543]
ENet 0.25 : 0.0637498016326 [ 0.85989664]
ENet 0.5 : 0.066063739564 [ 0.85481129]
ENet 0.75 : 0.0685519831348 [ 0.84934286]
ENet 0.95 : 0.0706817244743 [ 0.84466231]
ENet 0.99 : 0.0711236518251 [ 0.84369108]
The intercept itself is never regularized: it moves only because the penalty changes the fitted coefficients, and the intercept adjusts to compensate.
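The effect is easy to reproduce. Here is a minimal sketch with synthetic data (the X and y from the question are not shown, so a single noisy feature with a nonzero mean is assumed); as alpha shrinks, the coefficient grows toward the ordinary least-squares fit and the unregularized intercept shrinks to compensate:

```python
import numpy as np
from sklearn.linear_model import ElasticNet

# Hypothetical data: one noisy feature with a nonzero mean, so the
# intercept has something to absorb when the coefficient is shrunk.
rng = np.random.RandomState(0)
X = rng.randn(100, 1) + 2.0
y = 0.9 * X[:, 0] + 0.05 + 0.1 * rng.randn(100)

coefs, intercepts = {}, {}
for alpha in [1.0, 0.1, 0.01]:
    enet = ElasticNet(alpha=alpha, l1_ratio=0.5).fit(X, y)
    coefs[alpha] = enet.coef_[0]
    intercepts[alpha] = enet.intercept_
    print(alpha, enet.intercept_, enet.coef_)

# Relaxing the penalty lets the coefficient grow toward the OLS fit,
# while the (unpenalized) intercept shrinks to keep predictions centered.
assert coefs[0.01] > coefs[0.1] > coefs[1.0]
assert intercepts[0.01] < intercepts[0.1] < intercepts[1.0]
```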