Gradients should stay on path: better estimators of the reverse- and forward KL divergence for normalizing flows.
Lorenz Vaitl, Kim A. Nicoli, Shinichi Nakajima, Pan Kessel. Published in: Mach. Learn.: Sci. Technol. (2022)
Keyphrases
- kl divergence
- Kullback-Leibler
- Kullback-Leibler divergence
- information theoretic
- Mahalanobis distance
- Gaussian distribution
- posterior distribution
- exponential family
- Gaussian mixture
- mutual information
- probability density
- information theory
- probabilistic latent semantic analysis
- closed form
- maximum likelihood
- denoising
- clustering method
- translation model
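
The "gradients stay on path" idea in the title refers to path-gradient estimators of the reverse KL divergence: the gradient is propagated only through the reparameterized sample x = T_theta(z), while the explicit parameter dependence of log q_theta(x) is detached, removing the high-variance score term. The sketch below is not the authors' implementation; it illustrates this estimator on a toy setup, where the single affine flow, the Gaussian target, and all hyperparameters are placeholder assumptions.

```python
# Minimal sketch of a path-gradient reverse-KL estimator for a toy normalizing flow.
# Assumptions (not from the paper): a single affine flow, a Gaussian target, Adam.
import math
import torch

class AffineFlow(torch.nn.Module):
    """x = mu + exp(log_sigma) * z, with z drawn from a standard normal base."""
    def __init__(self, dim):
        super().__init__()
        self.mu = torch.nn.Parameter(torch.zeros(dim))
        self.log_sigma = torch.nn.Parameter(torch.zeros(dim))

    def sample(self, n):
        z = torch.randn(n, self.mu.shape[0])
        return self.mu + torch.exp(self.log_sigma) * z      # reparameterized path

    def log_prob(self, x, detach_params=False):
        mu, log_sigma = self.mu, self.log_sigma
        if detach_params:
            # Stop the explicit theta-dependence; gradients only flow through x.
            mu, log_sigma = mu.detach(), log_sigma.detach()
        z = (x - mu) * torch.exp(-log_sigma)
        log_base = -0.5 * (z ** 2).sum(-1) - 0.5 * x.shape[-1] * math.log(2 * math.pi)
        return log_base - log_sigma.sum()                    # change of variables

def log_target(x):
    # Placeholder unnormalized target density: a Gaussian centered at 2.
    return -0.5 * ((x - 2.0) ** 2).sum(-1)

flow = AffineFlow(dim=2)
opt = torch.optim.Adam(flow.parameters(), lr=1e-2)

for step in range(500):
    x = flow.sample(256)                                     # gradient flows through the path
    log_q = flow.log_prob(x, detach_params=True)             # score term dropped
    loss = (log_q - log_target(x)).mean()                    # reverse-KL surrogate
    opt.zero_grad()
    loss.backward()
    opt.step()
```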