Vanilla Variational Auto Encoder #71

Open
markusMM opened this issue Feb 22, 2019 · 0 comments

Hello WiseOdd,

I have been looking at your vanilla VAE, which seems pretty neat, although it does not seem to work the way I remember a Gaussian-noise sparse model working.

I recently read arXiv 1606.05908, where VAEs are explained quite well, more or less.
There, the PDF of X is described as an expectation value over $P(Z)$. Now, when you update the parameters of the Gaussian, the variance is normally a diagonal matrix of $\sigma^2$.
So of course one could think of a separate variance for each dimension of X, but I am not quite sure that is the common idea behind the variance of the Gaussian distribution in generative models.

$P(X) = \frac{1}{D_z} \sum_{z \in Z} P(X|z, \Theta)\, P(z)$. Using this notation, the expectation for $\sigma^2$ based on the data would be something like $\sum_{n \in N} \lVert x_n - \mu(z_n;\Theta) \rVert^2$, right?
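
For concreteness, here is a minimal worked form of that estimate, assuming an isotropic noise model $P(X|z;\Theta) = \mathcal{N}\big(X;\, \mu(z;\Theta),\, \sigma^2 I\big)$; the normalization by $N$ and the data dimensionality $D_x$ is my addition:

$$\hat{\sigma}^2 = \frac{1}{N\, D_x} \sum_{n=1}^{N} \big\lVert x_n - \mu(z_n;\Theta) \big\rVert^2$$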

Maybe you can take a look at that part of the code, because your variance seems to appear as a matrix instead of a single value (or a set of per-dimension values) for the Gaussian noise model.

So, did you explicitly consider a full covariance matrix, or was it just trial and error in this case?
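
In case it helps, here is a rough sketch (in PyTorch, with made-up names, not taken from your repository) of what I mean by a single shared $\sigma^2$ for the Gaussian noise model, estimated as the mean squared reconstruction error:

```python
import math
import torch

def gaussian_nll_isotropic(x, mu_dec, log_sigma2):
    """Negative log-likelihood of x under N(mu_dec, sigma^2 * I),
    with one scalar sigma^2 shared across all output dimensions."""
    d = x.shape[1]
    sq_err = ((x - mu_dec) ** 2).sum(dim=1)   # ||x - mu(z; Theta)||^2 per sample
    return 0.5 * (sq_err / log_sigma2.exp() + d * log_sigma2 + d * math.log(2 * math.pi))

def sigma2_mle(x, mu_dec):
    """Closed-form ML estimate of the shared variance:
    the mean squared reconstruction error over all samples and dimensions."""
    return ((x - mu_dec) ** 2).mean()

# Toy usage with random tensors standing in for data and decoder output.
x = torch.randn(128, 784)
mu_dec = torch.randn(128, 784)                # stand-in for mu(z; Theta)
log_sigma2 = sigma2_mle(x, mu_dec).log()
print(gaussian_nll_isotropic(x, mu_dec, log_sigma2).mean())
```

The per-dimension variant I mentioned above would instead keep one such value per output dimension, e.g. `((x - mu_dec) ** 2).mean(dim=0)`, rather than a single scalar.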

If you try out expectation value(s) for the variance, let me know what your experience is.

Regards,
Markus
