Question

This question pertains to the L1 and L2 regularization parameters in LightGBM. Per the official documentation:

reg_alpha (float, optional (default=0.)) – L1 regularization term on weights.

reg_lambda (float, optional (default=0.)) – L2 regularization term on weights.
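For intuition, the two terms simply add to the training objective, much like an elastic-net penalty. Below is a minimal sketch of how the combined penalty behaves; `weights`, `reg_alpha`, and `reg_lambda` are illustrative names here, not LightGBM internals:

```python
# Minimal sketch of how L1 and L2 penalties combine (elastic-net style).
# This is a conceptual illustration, not the actual LightGBM implementation.

def combined_penalty(weights, reg_alpha, reg_lambda):
    """L1 + L2 regularization term added to the training objective."""
    l1 = reg_alpha * sum(abs(w) for w in weights)     # drives small weights to zero
    l2 = reg_lambda * sum(w * w for w in weights)     # shrinks large weights smoothly
    return l1 + l2

# With both terms active, the penalty gets sparsity pressure from L1
# and smooth shrinkage from L2 at the same time.
print(combined_penalty([0.5, -2.0, 0.0], reg_alpha=0.1, reg_lambda=1.0))
```

Because the two terms penalize weights differently (L1 is proportional to magnitude, L2 to squared magnitude), using both at once is not contradictory, only a different trade-off.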

I have seen data scientists use both of these parameters at the same time, although ideally you would use either L1 or L2, not both together.

While reading about tuning LightGBM parameters, I came across one such case: the official Kaggle Days Paris GBDT Specification and Optimization Workshop, whose instructors are ML experts. These experts used positive values of both the L1 and L2 parameters in their LightGBM model. Link below (Ctrl+F 'search_spaces' to jump directly to the parameter grid in this long kernel):

http://www.kaggle.com/lucamassaron/kaggle-days-paris-gbdt-workshop

I have seen the same in XGBoost implementations.

My question is: why use both at the same time in LightGBM/XGBoost?

Thanks.


Licensed under: CC-BY-SA with attribution
Not affiliated with datascience.stackexchange