Question

If I have 10,000 training samples, which of the following should I do:

Draw bootstrap samples, train 10 classifiers on them, and then aggregate their predictions,

Or

Randomly divide the dataset into 10 disjoint parts, train one classifier on each part, and then aggregate their predictions. Which will be better?

Will the 2nd method reduce variance, and will it be better than the 1st method?
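
For concreteness, here is a minimal sketch of the two setups I mean, using scikit-learn on a synthetic dataset with decision trees as the base learner (the dataset, base learner, and aggregation by majority vote are just illustrative choices, not fixed requirements):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the 10,000-sample dataset.
X, y = make_classification(n_samples=10_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Method 1: bagging -- each of the 10 trees is trained on a bootstrap sample
# (drawn with replacement, same size as the training set).
bagging = BaggingClassifier(n_estimators=10, bootstrap=True, random_state=0)
bagging.fit(X_train, y_train)
print("bagging accuracy:        ", bagging.score(X_test, y_test))

# Method 2: split the training set into 10 disjoint parts, train one tree
# per part, and aggregate by majority vote.
rng = np.random.default_rng(0)
parts = np.array_split(rng.permutation(len(X_train)), 10)
models = [DecisionTreeClassifier(random_state=0).fit(X_train[idx], y_train[idx])
          for idx in parts]
votes = np.stack([m.predict(X_test) for m in models])  # shape: (10, n_test)
majority = (votes.mean(axis=0) >= 0.5).astype(int)     # majority vote for 0/1 labels
print("disjoint-split accuracy: ", (majority == y_test).mean())
```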

No correct solution
