Regularized KL-Divergence for Well-Defined Function-Space Variational Inference in Bayesian Neural Networks
Tristan Cinquin, Robert Bamler
Published in: CoRR (2024)
Keyphrases
- posterior distribution
- variational inference
- kl divergence
- exponential family
- probability distribution
- latent variables
- parameter estimation
- bayesian inference
- bayesian framework
- hyperparameters
- posterior probability
- markov chain monte carlo
- maximum a posteriori
- maximum likelihood
- statistical models
- closed form
- probabilistic model
- density estimation
- graphical models
- gaussian distribution
- least squares
- mixture model
- bayesian networks
- variational methods
- log likelihood
- hidden variables
- parameter space
- probability density function
- cross validation
- support vector
- missing values
- generative model
- order statistics
- em algorithm
- expectation maximization
- prior knowledge