A Universal Approximation Theorem of Deep Neural Networks for Expressing Probability Distributions
Yulong Lu, Jianfeng Lu. Published in: NeurIPS (2020)
Keyphrases
- probability distribution
- neural network
- pattern recognition
- artificial neural networks
- multi layer
- random variables
- back propagation
- genetic algorithm
- fuzzy logic
- approximation algorithms
- neural nets
- knowledge representation
- network architecture
- bayesian networks
- relative error
- training process
- feedforward neural networks
- error bounds
- conditional probabilities
- closed form
- evolutionary algorithm
- recurrent neural networks
- data sets
- training algorithm
- search algorithm
- activation function
- normal distribution
- queueing networks
- training data
- competitive learning
- machine learning
- approximation error
- hopfield neural network