Sampling from Gaussian Process Posteriors using Stochastic Gradient Descent
Jihao Andreas Lin, Javier Antorán, Shreyas Padhy, David Janz, José Miguel Hernández-Lobato, Alexander Terenin
Published in: NeurIPS (2023)
Keyphrases
- gaussian process
- stochastic gradient descent
- importance sampling
- approximate inference
- posterior distribution
- hyperparameters
- latent variables
- random sampling
- least squares
- loss function
- matrix factorization
- markov chain monte carlo
- step size
- bayesian framework
- model selection
- regression model
- sample size
- bayesian inference
- random forests
- semi supervised
- markov chain
- multiple kernel learning
- support vector machine
- parameter space
- maximum a posteriori
- prior knowledge
- state space
- collaborative filtering
- expectation maximization
- em algorithm
- parameter estimation
- posterior probability
- bayesian networks