A mean-field theory of lazy training in two-layer neural nets: entropic regularization and controlled McKean-Vlasov dynamics.
Belinda Tzen, Maxim Raginsky
Published in: CoRR (2020)
Keyphrases
- neural nets
- multi-layer
- single layer
- supervised training
- feed-forward
- feed-forward neural networks
- backpropagation
- training algorithm
- neural network
- recurrent networks
- restricted Boltzmann machine
- artificial neural networks
- early stopping
- hidden layer
- deep architectures
- training samples
- dynamic model
- training examples
- feedforward neural networks
- counter-propagation
- artificial intelligence
- learning tasks
- support vector machine
- supervised learning
- statistical mechanics
- stochastic gradient descent
- training set
- data sets
- Markov random field
- information theory