Question

I have a classifier network that chooses one of three classes and uses cross-entropy loss as the loss function. If the proportions of the training data are 100:10:5 across the classes, should I automatically set the class weights to 1/100, 1/10, 1/5?

If not, what other issues are there to consider?


Solution

I'll refer to an answer on a similar topic. If you have enough data for classes 2 and 3, there is no reason to change your training scheme, provided you evaluate with standard metrics. The baseline should always be training without changing the weights; if you then see that the model performs very poorly on classes 2 and 3, you can adjust the training scheme. In my experience, however, reweighting rarely works better.
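If you do want to try class weighting, one common heuristic (the same formula as scikit-learn's `compute_class_weight("balanced")`) is to weight each class by `n_samples / (n_classes * class_count)` and pass the result to the loss. Below is a minimal PyTorch sketch; the class counts are hypothetical values matching the 100:10:5 proportions in the question, and the logits/targets are illustrative only.

```python
import torch
import torch.nn as nn

# Hypothetical class counts matching the question's 100:10:5 proportions.
class_counts = torch.tensor([100.0, 10.0, 5.0])

# Inverse-frequency weights: n_samples / (n_classes * count) per class,
# the same heuristic as sklearn's compute_class_weight("balanced").
weights = class_counts.sum() / (len(class_counts) * class_counts)

# Baseline (unweighted) loss and a weighted variant for comparison.
unweighted_loss = nn.CrossEntropyLoss()
weighted_loss = nn.CrossEntropyLoss(weight=weights)

# Toy batch: raw logits for two samples, with their true class indices.
logits = torch.tensor([[2.0, 0.5, 0.1],
                       [0.2, 1.5, 0.3]])
targets = torch.tensor([0, 2])

print("weights:", weights.tolist())
print("unweighted:", unweighted_loss(logits, targets).item())
print("weighted:  ", weighted_loss(logits, targets).item())
```

Note that the weighted loss changes the gradient scale per class, not the data itself; the baseline run without `weight=` is still the comparison point the answer above recommends.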

Licensed under: CC-BY-SA with attribution
Not affiliated with datascience.stackexchange