Question

I searched many times in Google for examples of word embeddings (specifically GloVe) with Random Forest and I couldn't find a single example. For GloVe, it was all either LSTM or CNN. Maybe there's something I don't understand, but why are only these two algorithms used with GloVe? What's wrong with the other algorithms?

Solution

GloVe turns a document into a sequence of high-dimensional dense vectors, one per word. LSTMs and CNNs can consume that sequence directly, while less complex models like Random Forest and SVM need a single fixed-length feature vector, so the word vectors have to be aggregated first (for example, by averaging), and information such as word order is lost in the process. That is why these models cannot capture all of the information in the embeddings.
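Nothing stops you from using GloVe with a Random Forest, though; you just have to collapse each document into one fixed-length vector first. Here is a minimal sketch of the averaging approach, assuming a locally downloaded `glove.6B.100d.txt` file and using toy `texts`/`labels` purely for illustration:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

EMBED_DIM = 100  # must match the dimensionality of the GloVe file used

def load_glove(path):
    """Load GloVe vectors into a {word: np.array} dict."""
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            vectors[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    return vectors

def doc_vector(text, vectors):
    """Average the GloVe vectors of a document's tokens.

    This collapses the variable-length word sequence into a single
    fixed-length vector (losing word order), which is what tree-based
    models like Random Forest require.
    """
    words = [vectors[w] for w in text.lower().split() if w in vectors]
    if not words:
        return np.zeros(EMBED_DIM, dtype=np.float32)
    return np.mean(words, axis=0)

glove = load_glove("glove.6B.100d.txt")  # assumed local path to GloVe file

# Toy data, placeholders for a real corpus.
texts = ["the movie was great", "the movie was terrible"]
labels = [1, 0]

X = np.vstack([doc_vector(t, glove) for t in texts])
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, labels)
print(clf.predict(X))
```

The averaging step is exactly where the sequence information disappears, which is why LSTMs and CNNs, which operate on the full sequence of vectors, tend to be preferred with GloVe.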
