Sharp Bounds on the Approximation Rates, Metric Entropy, and n-Widths of Shallow Neural Networks.
Jonathan W. Siegel, Jinchao Xu
Published in: Found. Comput. Math. (2024)
Keyphrases
- neural network
- artificial neural networks
- approximation error
- approximation methods
- approximation algorithms
- approximation guarantees
- error bounds
- error estimates
- upper bound
- lower bound
- upper and lower bounds
- worst case
- constant factor
- metric space
- information theory
- rate distortion theory