Question

I am participating in a Kaggle competition and planning to use the XGBoost package (in R). I have read the XGBoost documentation and understood the basics.

Can someone explain how feature engineering is done when using XGBoost?

An illustrative example would be of great help.


Solution

It turns out that the question I asked was based on a misunderstanding.

Feature engineering is done first; xgboost is then used to build a model from the engineered features. If we are not satisfied with the model's performance, we can go back and refine the feature engineering.
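To make that workflow concrete, here is a minimal sketch in R. The built-in mtcars data set, the derived power_to_weight column, and all parameter values are purely illustrative assumptions, not anything prescribed by the answer; the point is only that the feature engineering happens on the data frame before xgboost ever sees it, and the fitted model can then inform the next engineering round.

    # Sketch of the workflow: engineer features first, then fit xgboost,
    # then inspect the model to decide whether to revisit feature engineering.
    library(xgboost)

    data(mtcars)          # toy data set shipped with R, used as a stand-in
    df <- mtcars

    # 1. Feature engineering happens before (and outside of) xgboost,
    #    e.g. adding a derived ratio feature (illustrative choice).
    df$power_to_weight <- df$hp / df$wt

    # 2. Build the model from the engineered feature matrix.
    X <- as.matrix(df[, setdiff(names(df), "mpg")])
    y <- df$mpg
    dtrain <- xgb.DMatrix(data = X, label = y)

    model <- xgb.train(
      params  = list(objective = "reg:squarederror", eta = 0.1, max_depth = 3),
      data    = dtrain,
      nrounds = 50
    )

    # 3. If performance is unsatisfactory, feature importance can hint at
    #    which features to rework or add in the next engineering round.
    print(xgb.importance(model = model))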

Thanks to Dan Levin for the explanation.

Licensed under: CC-BY-SA with attribution
Not affiliated with datascience.stackexchange