Stability & Generalisation of Gradient Descent for Shallow Neural Networks without the Neural Tangent Kernel.
Dominic Richards, Ilja Kuzborskij. Published in: NeurIPS (2021)
Keyphrases
- neural network
- learning rules
- network architecture
- cost function
- Lyapunov function
- neural network model
- pattern recognition
- neural model
- support vector
- objective function
- neural fuzzy
- loss function
- artificial neural networks
- neural learning
- neural computation
- fault diagnosis
- question answering
- connectionist models
- backpropagation
- associative memory
- feature space
- kernel methods
- multilayer perceptron
- kernel function
- activation function
- tangent distance
- neural models
- Hebbian learning
- stability analysis
- fuzzy systems
- neural nets
- feedforward
- fuzzy logic
- information extraction
- fuzzy neural network
- feedforward neural networks
- reproducing kernel Hilbert space
- Gaussian processes
- stochastic gradient descent
- self-organizing maps
- natural language processing
- asymptotic stability
- learning algorithm
- genetic algorithm