Online and Scalable Model Selection with Multi-Armed Bandits
Jiayi Xie, Michael Tashman, John Hoffman, Lee Winikor, Rouzbeh Gerami. Published in: CoRR (2021)
Keyphrases
- model selection
- multi-armed bandits
- cross-validation
- hyperparameters
- sample size
- parameter estimation
- online learning
- regression model
- machine learning
- selection criterion
- feature selection
- Gaussian process
- generalization error
- mixture model
- model selection criteria
- variable selection
- marginal likelihood
- bandit problems
- automatic model selection
- Bayesian information criterion
- information criterion
- statistical inference
- data mining
- parameter determination
- training data
- Bayesian framework
- optimal solution
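The paper's own algorithm is not reproduced in this entry. As an illustration of how the keyphrases "model selection", "multi-armed bandits", and "online learning" fit together, here is a minimal epsilon-greedy sketch that treats each candidate model as a bandit arm and selects among them from streaming feedback; the function names and the Bernoulli reward setup are hypothetical, not taken from the paper:

```python
import random

def select_model_epsilon_greedy(n_models, reward_fn, n_rounds=2000, eps=0.1, seed=0):
    """Online model selection as a bandit problem (illustrative sketch).

    Each arm is one candidate model; a pull means evaluating that model on
    the next incoming data point, and reward_fn(arm, rng) returns the
    observed reward (e.g. 1.0 for a correct prediction, 0.0 otherwise).
    """
    rng = random.Random(seed)
    counts = [0] * n_models      # pulls per arm
    means = [0.0] * n_models     # running mean reward per arm
    for _ in range(n_rounds):
        if rng.random() < eps:
            arm = rng.randrange(n_models)                        # explore
        else:
            arm = max(range(n_models), key=lambda a: means[a])   # exploit
        r = reward_fn(arm, rng)
        counts[arm] += 1
        means[arm] += (r - means[arm]) / counts[arm]             # incremental mean
    best = max(range(n_models), key=lambda a: means[a])
    return best, means
```

Because the arm statistics are updated incrementally, the selector needs O(n_models) memory regardless of how many data points stream past, which is the sense in which bandit-based selection is "online and scalable".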