Question

Does the XGBClassifier method utilize the two regularization terms reg_alpha and reg_lambda, or are they redundant there and only used in the regression method (i.e., XGBRegressor)?

I suspect that some XGBoost hyperparameters have no effect for particular estimators (e.g., scale_pos_weight in XGBRegressor).


Solution

XGBoost uses both kinds of regularization in classification and regression alike: each leaf holds a continuous score, and these scores are summed to form the final prediction (the log-odds in the classification case), so penalizing the leaf weights makes sense in either setting.
See also L1 & L2 Regularization in Light GBM
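To make the point concrete, here is a minimal pure-Python sketch of the closed-form optimal leaf weight from the XGBoost paper, w* = -soft_threshold(G, α) / (H + λ), where G and H are the sums of gradients and Hessians in the leaf. The function name is my own; this is an illustration of the formula, not XGBoost's actual implementation. Since the formula only involves gradients and Hessians of the loss, it applies identically whether the loss is squared error (regression) or log loss (classification):

```python
def optimal_leaf_weight(G, H, reg_alpha=0.0, reg_lambda=1.0):
    """Closed-form leaf weight minimizing G*w + 0.5*(H + lambda)*w**2 + alpha*|w|.

    G: sum of first-order gradients of the loss over examples in the leaf.
    H: sum of second-order gradients (Hessians) over the same examples.
    """
    # L1 (reg_alpha) acts by soft-thresholding the gradient sum:
    # leaves whose |G| <= alpha get weight exactly zero.
    if G > reg_alpha:
        numerator = G - reg_alpha
    elif G < -reg_alpha:
        numerator = G + reg_alpha
    else:
        return 0.0
    # L2 (reg_lambda) shrinks the weight by inflating the denominator.
    return -numerator / (H + reg_lambda)


# With no regularization, the weight is just -G/H:
print(optimal_leaf_weight(-4.0, 2.0, reg_alpha=0.0, reg_lambda=0.0))  # 2.0
# L2 shrinks it toward zero:
print(optimal_leaf_weight(-4.0, 2.0, reg_alpha=0.0, reg_lambda=2.0))  # 1.0
# L1 zeroes it out entirely when |G| <= alpha:
print(optimal_leaf_weight(-4.0, 2.0, reg_alpha=5.0, reg_lambda=0.0))  # 0.0
```

Nothing in the formula depends on the task type, which is why both reg_alpha and reg_lambda are honored by XGBClassifier and XGBRegressor.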

But yes, some hyperparameters (such as scale_pos_weight) do appear to be vestigial for certain estimators:
What does xgb's scale_pos_weight parameter do for regression?
https://discuss.xgboost.ai/t/scale-pos-weight-for-regression/218/10

Licensed under: CC-BY-SA with attribution