Question

I have been searching for a while and can't find a clear answer. When people talk about iterations in algorithms like XGBoost, LightGBM, or CatBoost, do they mean the number of decision trees, i.e. base learners, that will be built? I.e., does XGBoost with m=100 mean the algorithm will build a total of 100 base learners, each one fit to the residuals of the previous ensemble's prediction?
Or is it more like one epoch in deep learning?
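To make the question concrete, here is a minimal from-scratch sketch of gradient boosting with decision stumps (not the actual XGBoost/LightGBM/CatBoost implementations; the function names and learning rate are illustrative). It shows the interpretation the question describes: each iteration fits one new base learner to the residuals of the current ensemble, so m iterations yield m trees:

```python
import numpy as np

def fit_stump(x, r):
    # Fit a depth-1 regression tree (stump) to residuals r:
    # choose the split threshold minimizing squared error.
    best = None
    for t in np.unique(x):
        left, right = r[x <= t], r[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        err = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or err < best[0]:
            best = (err, t, left.mean(), right.mean())
    _, t, lv, rv = best
    return lambda xq: np.where(xq <= t, lv, rv)

def gradient_boost(x, y, n_iterations=100, lr=0.1):
    # One "iteration" = one new base learner, fit to the
    # residuals of the current ensemble prediction.
    pred = np.full_like(y, y.mean(), dtype=float)
    trees = []
    for _ in range(n_iterations):
        residuals = y - pred           # negative gradient for squared loss
        tree = fit_stump(x, residuals)
        trees.append(tree)
        pred = pred + lr * tree(x)     # shrink and add the new tree
    return trees, pred

# Toy data: after m iterations the ensemble holds exactly m trees.
x = np.linspace(0, 10, 50)
y = np.sin(x) + 0.1 * np.random.default_rng(0).normal(size=50)
trees, pred = gradient_boost(x, y, n_iterations=100)
print(len(trees))  # 100 trees, one per iteration
```

This contrasts with an epoch in deep learning, where one pass over the data updates a fixed set of weights; here each iteration adds a new model to the ensemble instead.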

No correct solution

Licensed under: CC-BY-SA with attribution
Not affiliated with datascience.stackexchange