Gradients should stay on Path: Better Estimators of the Reverse- and Forward KL Divergence for Normalizing Flows.
Lorenz Vaitl, Kim A. Nicoli, Shinichi Nakajima, Pan Kessel
Published in: CoRR (2022)
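For context, the two divergences named in the title have their standard definitions. Writing \(q_\theta\) for the normalizing-flow model density and \(p\) for the target (notation chosen here for illustration; the paper's own notation may differ):

```latex
% Reverse KL: expectation under the model, the usual variational training objective.
\mathrm{KL}(q_\theta \,\|\, p) = \mathbb{E}_{z \sim q_\theta}\left[ \log q_\theta(z) - \log p(z) \right]

% Forward KL: expectation under the target, the maximum-likelihood-style objective.
\mathrm{KL}(p \,\|\, q_\theta) = \mathbb{E}_{z \sim p}\left[ \log p(z) - \log q_\theta(z) \right]
```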
Keyphrases
- KL divergence
- Kullback-Leibler
- Kullback-Leibler divergence
- Mahalanobis distance
- information-theoretic
- Gaussian mixture
- Gaussian distribution
- posterior distribution
- exponential family
- mutual information
- similarity measure
- principal component analysis
- statistical model
- dissimilarity measure
- probabilistic latent semantic analysis
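The "gradients should stay on path" in the title refers to path-gradient estimators of such KL objectives. As a minimal sketch of the general sticking-the-landing-style path-gradient trick for the reverse KL, assuming a PyTorch-like reparametrized flow (all names here, e.g. `AffineFlow` and `reverse_kl_loss`, are hypothetical illustrations, not the authors' code, and the paper's estimators may differ in detail):

```python
import math
import torch
import torch.nn as nn

class AffineFlow(nn.Module):
    """Toy diagonal affine 'flow': z = mu + exp(log_sigma) * eps, eps ~ N(0, I)."""
    def __init__(self, dim):
        super().__init__()
        self.mu = nn.Parameter(torch.zeros(dim))
        self.log_sigma = nn.Parameter(torch.zeros(dim))

    def sample(self, n):
        eps = torch.randn(n, self.mu.numel())
        return self.mu + self.log_sigma.exp() * eps  # reparametrized sample

    def log_prob(self, z, detach_params=False):
        mu, log_sigma = self.mu, self.log_sigma
        if detach_params:
            # Detach the density's *direct* parameter dependence, so gradients
            # flow only through the sampling path z(theta).
            mu, log_sigma = mu.detach(), log_sigma.detach()
        return (-0.5 * ((z - mu) / log_sigma.exp()) ** 2
                - log_sigma - 0.5 * math.log(2 * math.pi)).sum(-1)

def log_p(z):
    # Toy unnormalized target: N(1, I) up to a constant (illustration only).
    return (-0.5 * (z - 1.0) ** 2).sum(-1)

def reverse_kl_loss(flow, n, path_gradient=True):
    z = flow.sample(n)
    # With path_gradient=True, the score term (whose expectation is zero) is
    # dropped, so the gradient estimator stays unbiased with lower variance.
    log_q = flow.log_prob(z, detach_params=path_gradient)
    return (log_q - log_p(z)).mean()

flow = AffineFlow(dim=2)
loss = reverse_kl_loss(flow, n=1024)
loss.backward()  # gradients reach the parameters only along the sampling path
```

The design point this sketch illustrates: detaching the direct parameter dependence of `log_prob` while keeping it in the sample `z` removes the score term from the gradient, which is valid because that term has zero expectation.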