At Stability's Edge: How to Adjust Hyperparameters to Preserve Minima Selection in Asynchronous Training of Neural Networks?
Niv Giladi, Mor Shpigel Nacson, Elad Hoffer, Daniel Soudry
Published in: CoRR (2019)
Keyphrases
- hyperparameters
- neural network
- training process
- model selection
- cross validation
- bayesian inference
- bayesian framework
- prior information
- closed form
- support vector
- maximum likelihood
- grid search
- gaussian process
- random sampling
- maximum a posteriori
- edge detection
- sample size
- training set
- back propagation
- noise level
- artificial neural networks
- em algorithm
- incremental learning
- gaussian processes
- genetic algorithm
- regularization parameter
- data sets
- incomplete data
- random forest
- feature vectors
- worst case
- bp neural network