Hey Vineet,
Good question! I think the ground truth usually won't be a probability distribution; it will be a one-hot encoded vector. Take the 'panda bird dog' case: the prediction could be three floats representing the probabilities for each animal, but the ground truth for the image being predicted should look like [0 0 1] if it is a dog, right? If that's the case, the cross-entropy loss won't equal 0 most of the time, but it does go to 0 as the prediction approaches the one-hot target. Does this address your confusion?
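A quick numeric sketch of this (assuming NumPy; the helper function and the numbers are just for illustration, not from the post):

import numpy as np

def cross_entropy(p, q, eps=1e-12):
    # H(p, q) = -sum_i p_i * log(q_i); clip to avoid log(0)
    q = np.clip(q, eps, 1.0)
    return -np.sum(p * np.log(q))

# One-hot ground truth for a dog image, ordered [panda, bird, dog]
target = np.array([0.0, 0.0, 1.0])

# Imperfect prediction: loss is positive
print(cross_entropy(target, np.array([0.2, 0.1, 0.7])))  # ~0.357

# Perfect prediction against a one-hot target: loss is ~0.0
print(cross_entropy(target, np.array([0.0, 0.0, 1.0])))  # ~0.0

# Soft target equal to the prediction: the loss bottoms out at the
# entropy of the target distribution, not at 0
soft = np.array([0.2, 0.1, 0.7])
print(cross_entropy(soft, soft))  # ~0.802, i.e. H(soft)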
Hi!
I have a query.
If cross-entropy is a loss function, shouldn't it equal 0 when the predicted probabilities match the actual probabilities?
For example, panda bird dog -> 0.2 0.1 0.7. If this is the prediction and it is also the ground truth, the cross-entropy wouldn't be equal to 0; it would be equal to the entropy. I'm confused, or I don't know what point I'm missing. Thanks for the help in advance.
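For anyone hitting the same confusion, the standard identity behind this observation (not specific to this post) is:

H(p, q) = -\sum_i p_i \log q_i = H(p) + D_{\mathrm{KL}}(p \,\|\, q)

Since the KL divergence is nonnegative and equals 0 exactly when q = p, the best achievable cross-entropy is H(p), the entropy of the ground truth. That minimum is 0 only when p is one-hot, i.e. H(p) = 0, which is the usual classification setup.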