Question

I have searched Google many times for examples of word embeddings (specifically GloVe) used with Random Forest, and I couldn't find a single one. For GloVe, everything was either LSTM or CNN. Maybe there's something I don't understand, but why are only these two algorithms used with GloVe? What's wrong with the other algorithms?


Solution

GloVe turns a document into a high-dimensional matrix (one dense vector per word), and less complex models like RF and SVM cannot model all of the information it contains. LSTMs and CNNs consume the sequence of vectors directly, whereas RF or SVM need a single fixed-length feature vector per document, so you have to collapse the word vectors (for example, by averaging them), which throws away word order. A sketch of that approach follows below.
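That said, nothing technically prevents it. Below is a minimal sketch of using GloVe with scikit-learn's RandomForestClassifier by averaging each document's word vectors into one fixed-length feature vector; the GloVe file path and the toy dataset are placeholder assumptions, not part of the original question.

```python
# Sketch: GloVe + Random Forest via averaged word vectors.
# Assumes a local copy of pre-trained 100-dimensional GloVe vectors
# (e.g. glove.6B.100d.txt) -- adjust the path to your setup.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def load_glove(path):
    """Parse a GloVe text file into a {word: vector} dict."""
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            vectors[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    return vectors

def doc_vector(text, vectors, dim):
    """Average the vectors of known words; zero vector if none are known."""
    words = [vectors[w] for w in text.lower().split() if w in vectors]
    return np.mean(words, axis=0) if words else np.zeros(dim, dtype=np.float32)

glove = load_glove("glove.6B.100d.txt")  # assumed path
dim = 100

# Toy labelled data purely for illustration.
texts = ["great movie loved it", "terrible plot awful acting",
         "wonderful and touching", "boring and dull"]
labels = [1, 0, 1, 0]

X = np.vstack([doc_vector(t, glove, dim) for t in texts])
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, labels)
print(clf.predict(X))
```

The averaging step is the key trade-off: it produces features a tree ensemble can use, but it discards the sequential information that LSTMs and CNNs are designed to exploit, which is why those architectures dominate the GloVe examples you find.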

Licensed under: CC-BY-SA with attribution