KL loss doesn't decrease #3


Open
ilovecv opened this issue Sep 20, 2018 · 2 comments

@ilovecv
ilovecv commented Sep 20, 2018

Hi,
Thanks for sharing the code, nice work!
When I train the learned-prior model on the SM-MNIST dataset, the reconstruction loss decreases, but the KL loss keeps increasing. Is that normal?

Thanks.

@zyong812

zyong812 commented Nov 8, 2019

I have the same problem. I think it is normal as long as the total loss is decreasing.
The KL loss probably increases because it starts out very small at initialization.

@DanielTakeshi

Intuitively, if the KL loss goes to 0, that seems problematic, since it would mean the posterior isn't any different from the prior?

For reference, running SM-MNIST with the default commands here for SVG-LP gives KL values of around 5.
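To make the collapse intuition concrete, here is a minimal sketch (not from the SVG codebase; function name and shapes are my own) of the closed-form KL between two diagonal Gaussians, which is what the learned-prior KL term measures per frame. If the posterior matches the prior exactly, the term is 0 and the latent carries no information; any useful posterior keeps it positive.

```python
import numpy as np

def kl_diag_gaussians(mu_q, logvar_q, mu_p, logvar_p):
    """KL( N(mu_q, var_q) || N(mu_p, var_p) ) between diagonal Gaussians,
    summed over latent dimensions (closed form)."""
    var_q = np.exp(logvar_q)
    var_p = np.exp(logvar_p)
    kl = 0.5 * (logvar_p - logvar_q + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0)
    return kl.sum()

mu = np.zeros(10)
logvar = np.zeros(10)

# Posterior identical to the prior: KL is exactly 0 (posterior collapse).
print(kl_diag_gaussians(mu, logvar, mu, logvar))        # 0.0

# Posterior shifted away from the prior: KL stays positive,
# i.e. the latent actually encodes something about the frame.
print(kl_diag_gaussians(mu + 1.0, logvar, mu, logvar))  # 5.0
```

So a nonzero, even slowly rising, KL term alongside a falling reconstruction loss is consistent with the latent doing useful work, as long as the total weighted loss keeps decreasing.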
