Question

I have searched Google many times for examples of word embeddings (specifically GloVe) used with Random Forest, and I couldn't find a single one. For GloVe, everything is either LSTM or CNN. Maybe there's something I don't understand, but why are only these two algorithms used with GloVe? What's wrong with the other algorithms?


Solution

With GloVe, each document becomes a matrix of word vectors (one high-dimensional vector per token), and the number of tokens varies from document to document. LSTMs and CNNs can consume such variable-length sequences directly and exploit word order, which is why they dominate the examples you find. Less complex models like Random Forest and SVM expect a single fixed-length feature vector per sample, so to use them you first have to collapse the matrix (e.g. by averaging the word vectors), which discards word order and much of the information the embeddings carry.
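That said, nothing stops you from combining GloVe with a Random Forest if you accept the information loss. Below is a minimal sketch: each document is reduced to the average of its word vectors, which is then fed to scikit-learn's `RandomForestClassifier`. The embedding values here are random stand-ins (real GloVe vectors would be loaded from a `glove.*.txt` file); the vocabulary, documents, and labels are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
dim = 50  # GloVe vectors commonly come in 50/100/200/300 dimensions

# Stand-in embedding table; in practice, parse a downloaded GloVe file.
vocab = {w: rng.normal(size=dim)
         for w in ["good", "great", "bad", "awful", "movie", "plot"]}

def doc_vector(tokens):
    """Average the embeddings of known tokens (zero vector if none known)."""
    vecs = [vocab[t] for t in tokens if t in vocab]
    return np.mean(vecs, axis=0) if vecs else np.zeros(dim)

# Toy corpus: each document is a list of tokens, with a binary label.
docs = [["good", "movie"], ["great", "plot"],
        ["bad", "movie"], ["awful", "plot"]]
labels = [1, 1, 0, 0]

# Collapse each variable-length document into one fixed-length vector.
X = np.stack([doc_vector(d) for d in docs])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, labels)
preds = clf.predict(X)
```

The averaging step is exactly where the approach loses to LSTM/CNN: "not good, great" and "not great, good" average to the same vector, so order-dependent meaning is gone before the forest ever sees the data.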

Licensed under: CC-BY-SA with attribution