Stability & Generalisation of Gradient Descent for Shallow Neural Networks without the Neural Tangent Kernel
Dominic Richards, Ilja Kuzborskij. Published in: CoRR (2021)
Keyphrases
- neural network
- learning rules
- network architecture
- lyapunov function
- neural model
- back propagation
- cost function
- fuzzy logic
- tangent distance
- neural learning
- artificial neural networks
- pattern recognition
- kernel function
- stability analysis
- training algorithm
- associative memory
- fault diagnosis
- neural network model
- neural computation
- recurrent neural networks
- support vector
- loss function
- feed forward
- neural fuzzy
- multilayer perceptron
- gaussian processes
- genetic algorithm
- self organizing maps
- reproducing kernel hilbert space
- kernel methods
- deep knowledge
- hopfield neural network
- connectionist models
- hebbian learning
- feature space
- lyapunov theory
- radial basis function
- fuzzy systems
- spiking neurons
- asymptotic stability
- question answering
- error function
- activation function
- bio inspired