Lower bounds for artificial neural network approximations: A proof that shallow neural networks fail to overcome the curse of dimensionality.
Philipp Grohs, Shokhrukh Ibragimov, Arnulf Jentzen, Sarah Koppensteiner
Published in: CoRR (2021)
Keyphrases
- artificial neural networks
- neural network
- lower bound
- back propagation
- upper bound
- neural network model
- multilayer perceptron
- feed forward
- objective function
- activation function
- branch and bound algorithm
- genetic algorithm
- recurrent neural networks
- worst case
- question answering
- backpropagation neural networks
- natural language processing
- closed form
- upper and lower bounds
- input variables
- learning rules
- neural nets
- linear logic
- np hard
- fuzzy logic
- ann models
- theorem prover
- pattern recognition
- information extraction
- optimal solution
- vc dimension
- radial basis function
- theorem proving
- data structure
- feedforward neural networks
- fault diagnosis
- considerable increase
- heavy traffic limit