Excuse me again!
When I use KLDivLoss (./block/models/criterions/kl_divergence.py) as the criterion instead of cross entropy, the loss is 0 and the accuracy is worse, as the picture below shows:
![image](https://user-images.githubusercontent.com/35628413/63181798-8a985a80-c083-11e9-8975-8b520c841d72.png)
But in the MFH paper, the KLD loss performs better than cross entropy. How can this be? Thanks!
@haoopan Sorry, we don't provide support for the KLD loss; we didn't manage to make it work. If I remember correctly, the model can still learn even with this near-zero loss, but not as well as with standard cross entropy. There might be a bug somewhere.
Did you maybe fix it?
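For what it's worth, here is a minimal sketch (not the repo's actual code, with hypothetical VQA-style sizes) of how PyTorch's KLDivLoss is usually wired up. It illustrates two common reasons the loss can come out near 0: the input must be log-probabilities (e.g. from `log_softmax`), and the default element-wise `'mean'` reduction divides by the number of answer classes as well, shrinking the value toward 0 when the answer vocabulary is large.

```python
# Minimal sketch, assuming hypothetical sizes -- not the repo's actual code.
import torch
import torch.nn.functional as F

batch_size, num_answers = 4, 3000  # hypothetical VQA-style dimensions
logits = torch.randn(batch_size, num_answers)

# Soft targets: each row must be a valid probability distribution (sums to 1).
target = torch.rand(batch_size, num_answers)
target = target / target.sum(dim=1, keepdim=True)

# KLDivLoss expects LOG-probabilities as input, not raw logits.
log_probs = F.log_softmax(logits, dim=1)

# 'batchmean' divides by the batch size only, matching the KL definition.
# The default 'mean' reduction also divides by num_answers, which makes the
# loss look close to 0 for large answer vocabularies.
kld = F.kl_div(log_probs, target, reduction='batchmean')

# For comparison: standard cross entropy against hard labels.
labels = target.argmax(dim=1)
ce = F.cross_entropy(logits, labels)

print(f"KLD: {kld.item():.4f}  CE: {ce.item():.4f}")
```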