Question

I have searched Google many times for examples of word embeddings (specifically GloVe) used with Random Forest and couldn't find a single one. For GloVe, everything was either LSTM or CNN. Maybe there's something I don't understand, but why are only these two algorithms used with GloVe? What's wrong with the other algorithms?


Solution

GloVe usually turns each document into a high-dimensional matrix (one dense vector per token), and not all of that information can be modeled by less complex models like Random Forest or SVM. Sequence-aware architectures like LSTMs and CNNs can exploit the word order and local structure in that matrix, which is why they dominate the examples you find.
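To make the limitation concrete, here is a minimal sketch (not from the original answer) of how you would have to feed GloVe into a Random Forest: each variable-length token matrix must first be collapsed into one fixed-length vector, here by averaging, which throws away the word order an LSTM or CNN would use. The GloVe file path and the toy labelled documents are assumptions for illustration only.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def load_glove(path):
    """Parse a plain-text GloVe file: one line per word, the word followed by floats."""
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            vectors[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    return vectors

def doc_to_vector(tokens, vectors, dim):
    """Average the GloVe vectors of known tokens into one fixed-length feature vector."""
    rows = [vectors[t] for t in tokens if t in vectors]
    return np.mean(rows, axis=0) if rows else np.zeros(dim, dtype=np.float32)

# Hypothetical path and toy data, assumed for this sketch.
glove = load_glove("glove.6B.100d.txt")
docs = [["great", "movie"], ["terrible", "plot"], ["loved", "it"], ["boring", "film"]]
labels = [1, 0, 1, 0]

# One fixed-length row per document: the per-token structure is gone at this point.
X = np.vstack([doc_to_vector(d, glove, 100) for d in docs])
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, labels)
print(clf.predict(X))
```

Nothing stops you from training a Random Forest this way; it simply works on a pooled summary of the embeddings rather than on the full sequence, which is the information the LSTM and CNN examples are built to use.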
