I have searched Google many times for examples of using word embeddings (specifically GloVe) with Random Forest, and I couldn't find a single one. For GloVe, it was always either LSTM or CNN. Maybe there's something I don't understand, but why are only these two algorithms used with GloVe? What's wrong with the other algorithms?

Any help?

Solution

Usually these embeddings are high-dimensional, dense matrices, and not all of the information they contain can be captured by less complex models such as Random Forests or SVMs.
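That said, nothing stops you from feeding GloVe vectors to a Random Forest. A common approach is to average the word vectors of each document into a single fixed-length feature vector and train on that. Below is a minimal sketch, assuming the standard plain-text GloVe file has been downloaded; the path `glove.6B.100d.txt` and the toy dataset are placeholders, not anything from the original question.

```python
# Minimal sketch: averaged GloVe vectors as features for a Random Forest.
# Assumes the pretrained vectors were downloaded separately (path is a placeholder).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def load_glove(path):
    """Parse GloVe's plain-text format: one token followed by its vector per line."""
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            vectors[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    return vectors

def doc_vector(text, vectors, dim=100):
    """Average the GloVe vectors of a document's tokens, ignoring out-of-vocabulary words."""
    words = [vectors[w] for w in text.lower().split() if w in vectors]
    return np.mean(words, axis=0) if words else np.zeros(dim, dtype=np.float32)

# Toy labeled data, only to show the shape of the pipeline.
texts = ["great movie loved it", "terrible plot and acting",
         "wonderful film", "awful boring mess"]
labels = [1, 0, 1, 0]

glove = load_glove("glove.6B.100d.txt")  # placeholder path to the downloaded vectors
X = np.vstack([doc_vector(t, glove) for t in texts])

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X, labels)
print(clf.predict(X))
```

The trade-off is that averaging throws away word order, which is exactly what LSTMs and CNNs are designed to exploit; that is largely why they dominate the examples you find.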

Licensed under: CC-BY-SA with attribution