Question

I have searched Google many times for examples of using word embeddings (specifically GloVe) with a Random Forest, and I couldn't find a single one. For GloVe, everything was either LSTM or CNN. Maybe there's something I don't understand, but why are only these two algorithms used with GloVe? What's wrong with the other algorithms?


Solution

There is nothing wrong with other algorithms in principle; the mismatch is in the input shape. GloVe maps each word to a dense vector, so a document becomes a variable-length sequence of high-dimensional vectors. LSTMs and CNNs consume that sequence directly and can exploit word order. A Random Forest or SVM, by contrast, needs one fixed-length feature vector per document, so you must first pool the word vectors (e.g. average them), which throws away word order and much of the information the embeddings carry. That is why less-complex models like RF and SVM usually underperform on these representations, and why you rarely see published examples.

Licensed under: CC-BY-SA with attribution
Not affiliated with datascience.stackexchange