A Gaussian Process-Bayesian Bernoulli Mixture Model for Multi-Label Active Learning
Weishi Shi, Dayou Yu, Qi Yu
Published in: NeurIPS (2021)
Keyphrases
- multi-label
- mixture model
- Gaussian processes
- active learning
- model selection
- hyperparameters
- semi-supervised learning
- maximum likelihood
- variational inference
- EM algorithm
- posterior distribution
- generative model
- text categorization
- Bayesian framework
- unsupervised learning
- regression model
- Dirichlet process
- probabilistic model
- graph cuts
- random sampling
- image classification
- binary classification
- posterior probability
- approximate inference
- expectation maximization
- latent variables
- text classification
- density estimation
- Bayesian inference
- cross-validation
- machine learning
- learning tasks
- training examples
- class labels
- unlabeled data
- supervised learning
- exponential family
- language model
- learning algorithm
- labeled data
- prior distribution
- maximum a posteriori
- multi-task learning
- training set
- Bayesian networks
- closed form
- pairwise
- Markov chain Monte Carlo
- learning process
- naive Bayes
- feature selection
- incremental learning