Question

I'm using XGBoost for a binary classification problem. There is no negative label, only 1 and 0.

I tuned the hyperparameters using Bayesian optimization, then trained the final model with the optimized hyperparameters.

Mdl_XGB = xgb.train(OptimizedParams, dtrain)

scores_train = Mdl_XGB.predict(dtrain)

scores_test = Mdl_XGB.predict(dtest)

My problem is that the predicted scores for both the train and test sets include negative values and values greater than one; they range from -0.23 to 1.13.

Shouldn't these scores represent the probability of belonging to class 1 (the positive class)?


Solution

You have to set the option objective = "binary:logistic" to get probabilities between 0 and 1. If you don't set an objective, xgb.train falls back to its default regression objective, so predict() returns raw, unbounded scores rather than probabilities.
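A minimal sketch of the fix, assuming a tuned parameter dict like the question's OptimizedParams (the parameter values below are placeholders). It also shows the sigmoid (logistic) link that binary:logistic applies to the raw margin, which is what squeezes unbounded scores into (0, 1):

```python
import math

# Hypothetical tuned parameters standing in for the question's OptimizedParams.
OptimizedParams = {"max_depth": 4, "eta": 0.1}

# The fix: declare the objective so Booster.predict() returns probabilities.
OptimizedParams["objective"] = "binary:logistic"

# binary:logistic maps the raw margin through the sigmoid, so every
# prediction lands strictly between 0 and 1:
def sigmoid(margin):
    return 1.0 / (1.0 + math.exp(-margin))

# Raw margins like the out-of-range scores in the question map into (0, 1).
for raw in (-0.23, 0.0, 1.13):
    p = sigmoid(raw)
    assert 0.0 < p < 1.0
```

Relatedly, if you set objective = "binary:logitraw", predict() returns the raw margins themselves, and applying the sigmoid above recovers the probabilities. Note that the scores in the question came from the default objective, so applying a sigmoid to them after the fact would not yield calibrated probabilities; the model should be retrained with the logistic objective.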

License: CC-BY-SA with attribution
Not affiliated with datascience.stackexchange