Loss function for a class-imbalanced binary classifier in TensorFlow

You can add class weights to the loss function by multiplying the logits. Regular cross entropy loss is:

    loss(x, class) = -log(exp(x[class]) / (\sum_j exp(x[j])))
                   = -x[class] + log(\sum_j exp(x[j]))

In the weighted case, each logit is scaled by its class weight:

    loss(x, class) = -weights[class] * x[class] + log(\sum_j exp(weights[j] * x[j]))

So by multiplying the logits, you re-scale the predictions of each class by its class weight.
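As a minimal sketch of the math above, the plain and logit-weighted cross entropy can be written in NumPy. The function names and the `weights` vector (indexed by class) are illustrative, not part of any TensorFlow API; in real TensorFlow code you would apply the same scaling to the logits tensor before the softmax cross entropy op.

```python
import numpy as np

def cross_entropy(logits, label):
    # plain softmax cross entropy: -x[class] + log(sum_j exp(x[j]))
    return -logits[label] + np.log(np.sum(np.exp(logits)))

def weighted_cross_entropy(logits, label, weights):
    # scale each logit by its class weight before the softmax cross entropy
    scaled = weights * logits
    return -scaled[label] + np.log(np.sum(np.exp(scaled)))

logits = np.array([2.0, 0.5])

# with unit weights, the weighted loss reduces to the plain loss
assert np.isclose(
    weighted_cross_entropy(logits, 0, np.array([1.0, 1.0])),
    cross_entropy(logits, 0),
)

# up-weighting the true class lowers the loss for a correct prediction,
# i.e. the gradient pressure shifts toward the rare class
assert (weighted_cross_entropy(logits, 0, np.array([3.0, 1.0]))
        < cross_entropy(logits, 0))
```

Note that scaling logits is not the only option: multiplying the per-example loss itself by `weights[label]` is a more common formulation and keeps the softmax probabilities unchanged.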