At Stability's Edge: How to Adjust Hyperparameters to Preserve Minima Selection in Asynchronous Training of Neural Networks?
Niv Giladi, Mor Shpigel Nacson, Elad Hoffer, Daniel Soudry
Published in: ICLR (2020)
Keyphrases
- hyperparameters
- neural network
- model selection
- training process
- cross validation
- closed form
- support vector
- random sampling
- bayesian framework
- prior information
- gaussian processes
- bayesian inference
- grid search
- regularization parameter
- em algorithm
- noise level
- training set
- genetic algorithm
- maximum a posteriori
- incomplete data
- sample size
- edge detection
- supervised learning
- artificial neural networks
- data sets
- incremental learning
- support vector machine
- image segmentation
- missing values
- parameter settings
- e-learning
- maximum likelihood
- learning algorithm
- upper bound
- search space