Problem

I have some data in which model inputs and outputs (which are the same size) belong to multiple classes concurrently. A single input or output is a vector of zeros with somewhere between one and four values set to 1:

[0 0 0 1 0 0 1 0 1 0 0]

These kinds of vectors are sometimes called "multi-hot embeddings".
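For concreteness, here is a small sketch (using numpy; the class indices are simply the positions of the 1s in the example vector above) of how such a multi-hot vector can be built from a set of active class indices:

```python
import numpy as np

def multi_hot(active_classes, num_classes):
    """Encode a set of class indices as a multi-hot vector."""
    vec = np.zeros(num_classes, dtype=np.float32)
    vec[list(active_classes)] = 1.0
    return vec

# Positions of the 1s in the example vector above (11 classes total)
print(multi_hot({3, 6, 8}, 11))
# [0. 0. 0. 1. 0. 0. 1. 0. 1. 0. 0.]
```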

I am looking for an appropriate loss function for outputs of this kind. Is there a published equation I should check out? Or should I implement a custom loss function? Any suggestions others can offer on this question would be greatly appreciated!


Solution

You are talking about multi-label classification, which is a common type of problem. The most common choice of loss function is binary cross-entropy. There's a tutorial here that might help: https://towardsdatascience.com/multi-label-image-classification-with-neural-network-keras-ddc1ab1afede
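As a minimal sketch (the hidden-layer size, optimizer, and dummy data below are illustrative assumptions, not taken from the question), a Keras model for multi-label classification pairs a sigmoid output layer with binary cross-entropy, so each class is treated as its own independent yes/no decision:

```python
import numpy as np
from tensorflow import keras

num_classes = 11  # length of the multi-hot vectors in the question

# One sigmoid per class: each output is an independent binary decision
model = keras.Sequential([
    keras.layers.Input(shape=(num_classes,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(num_classes, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# Dummy data just to show the expected shapes: multi-hot targets per row
x = np.random.rand(32, num_classes).astype("float32")
y = (np.random.rand(32, num_classes) < 0.2).astype("float32")
model.fit(x, y, epochs=1, verbose=0)
```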

Other tips

I have also been pondering this question and have trialled loss functions on this sort of problem.

For this type of classification task, the loss function that seems most appropriate is binary cross-entropy loss: https://towardsdatascience.com/understanding-binary-cross-entropy-log-loss-a-visual-explanation-a3ac6025181a
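To make the link to the formula concrete: binary cross-entropy treats each class as its own Bernoulli problem and averages the per-class log losses. A minimal numpy sketch (the predicted probabilities below are made up for illustration):

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-7):
    """Mean of -[y*log(p) + (1-y)*log(1-p)] over all classes."""
    p = np.clip(y_pred, eps, 1 - eps)  # avoid log(0)
    return float(np.mean(-(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))))

y_true = np.array([0, 0, 0, 1, 0, 0, 1, 0, 1, 0, 0], dtype=np.float32)
y_pred = np.array([0.1, 0.05, 0.2, 0.9, 0.1, 0.1, 0.7, 0.2, 0.8, 0.1, 0.05])
print(binary_cross_entropy(y_true, y_pred))
```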
