Question

I have some data in which model inputs and outputs (which are the same size) belong to multiple classes concurrently. A single input or output is a vector of zeros with somewhere between one and four entries equal to 1:

[0 0 0 1 0 0 1 0 1 0 0]

These kinds of vectors are sometimes called "multi-hot embeddings".

I am looking for an appropriate loss function for outputs of this kind. Is there a published equation I should check out? Or should I implement a custom loss function? Any suggestions others can offer on this question would be greatly appreciated!


Solution

You are talking about multi-label classification, which is a common type of problem. The most common choice of loss function is binary cross-entropy. There's a tutorial here that might help: https://towardsdatascience.com/multi-label-image-classification-with-neural-network-keras-ddc1ab1afede
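For concreteness, here is a minimal sketch of what that looks like in Keras; the input dimension, layer sizes, and the 11-label output (taken from the example vector in the question) are placeholders, not part of the original answer:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

NUM_LABELS = 11   # length of the multi-hot target vector (from the example above)
INPUT_DIM = 11    # placeholder; replace with your actual feature dimension

# Sigmoid (not softmax) on the output layer lets each label be predicted independently.
model = keras.Sequential([
    layers.Input(shape=(INPUT_DIM,)),
    layers.Dense(64, activation="relu"),
    layers.Dense(NUM_LABELS, activation="sigmoid"),
])

# Binary cross-entropy treats each output unit as its own yes/no classification.
model.compile(optimizer="adam", loss="binary_crossentropy")

# Tiny dummy batch just to show the expected shapes of inputs and multi-hot targets.
x = np.random.rand(8, INPUT_DIM).astype("float32")
y = np.random.randint(0, 2, size=(8, NUM_LABELS)).astype("float32")
model.fit(x, y, epochs=1, verbose=0)
```

The key points are the sigmoid output activation and the `binary_crossentropy` loss, which together allow any number of labels to be active at once.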

Other suggestions

I have also been pondering this question and have trialled loss functions on this sort of problem.

For this type of classification task, the loss function that seems most appropriate is binary cross-entropy loss: https://towardsdatascience.com/understanding-binary-cross-entropy-log-loss-a-visual-explanation-a3ac6025181a
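To make the behaviour of this loss concrete, here is a small sketch that computes binary cross-entropy by hand for the multi-hot target from the question; the predicted probabilities are hypothetical values chosen only for illustration:

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-7):
    """Mean binary cross-entropy over all label positions of a multi-hot target."""
    y_pred = np.clip(y_pred, eps, 1.0 - eps)  # avoid log(0)
    return -np.mean(y_true * np.log(y_pred) + (1.0 - y_true) * np.log(1.0 - y_pred))

# Multi-hot target from the question and a hypothetical vector of predicted probabilities.
y_true = np.array([0, 0, 0, 1, 0, 0, 1, 0, 1, 0, 0], dtype=float)
y_pred = np.array([0.1, 0.05, 0.2, 0.9, 0.1, 0.3, 0.8, 0.1, 0.7, 0.05, 0.1])

print(binary_cross_entropy(y_true, y_pred))  # loss shrinks as predictions approach the 0/1 targets
```

Because the loss is averaged over independent per-label terms, each output unit is penalised separately, which is exactly what a multi-label target requires.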

Licensed under: CC-BY-SA with attribution