The question about KLD loss #11

Open
haoopan opened this issue Aug 16, 2019 · 1 comment

Comments

haoopan commented Aug 16, 2019

Excuse me again!
When I use KLDivLoss as the criterion in ./block/models/criterions/kl_divergence.py instead of cross-entropy, the loss is 0 and the accuracy is worse, as the screenshot below shows:
[screenshot: training log with the KLD loss stuck at 0]
But in the MFH paper, the KLD loss performs better than cross-entropy, so how can this be? Thanks!
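
For what it's worth, a common cause of a near-zero KLDivLoss in PyTorch: `nn.KLDivLoss` expects log-probabilities as input, and its default `reduction='mean'` averages over all elements (batch × classes) rather than over the batch, which shrinks the reported loss by a factor of the number of classes. A minimal sketch of both pitfalls, with hypothetical shapes that are not the repository's actual config:

```python
import torch
import torch.nn.functional as F

# Hypothetical shapes: batch of 4 samples over 3000 answer classes.
logits = torch.randn(4, 3000)
labels = torch.randint(0, 3000, (4,))
target = F.one_hot(labels, num_classes=3000).float()

# kl_div expects log-probabilities as input; feeding raw logits or
# plain softmax outputs gives meaningless values.
log_probs = F.log_softmax(logits, dim=1)

# reduction='mean' averages over all batch * classes elements, so the
# reported loss is ~3000x smaller than the per-sample KL divergence
# and can show up as 0.000 in training logs.
loss_mean = F.kl_div(log_probs, target, reduction='mean')

# reduction='batchmean' divides by the batch size only, matching the
# mathematical definition of the KL divergence.
loss_batchmean = F.kl_div(log_probs, target, reduction='batchmean')

print(loss_mean.item(), loss_batchmean.item())
```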

Cadene (Owner) commented Sep 12, 2019

@haoopan Sorry, we don't provide support for the KLD loss; we didn't manage to make it work.
If I remember correctly, even with this very low loss the model can still learn, just not as well as with standard cross-entropy.
There might be a bug somewhere. Did you manage to fix it?

Best
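
For context on why the two criteria should in principle behave the same: with one-hot targets, the KL divergence equals the cross-entropy, because the target's entropy term is zero. A quick numerical check, again a sketch with hypothetical shapes rather than the repository's code:

```python
import torch
import torch.nn.functional as F

logits = torch.randn(4, 3000)
labels = torch.randint(0, 3000, (4,))
one_hot = F.one_hot(labels, num_classes=3000).float()

ce = F.cross_entropy(logits, labels)
kl = F.kl_div(F.log_softmax(logits, dim=1), one_hot, reduction='batchmean')

# KL(one_hot || softmax(logits)) = cross-entropy - H(one_hot), and the
# entropy of a one-hot distribution is zero, so both losses agree.
print(torch.allclose(ce, kl))  # True, up to floating-point error
```

If the two disagree in training, the gap likely comes from the reduction mode or a missing `log_softmax`, not from the objective itself.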
