Sharp asymptotics on the compression of two-layer neural networks.
Mohammad Hossein Amani, Simone Bombari, Marco Mondelli, Rattana Pukdee, Stefano Rini
Published in: CoRR (2022)
Keyphrases
- neural network
- multi-layer
- single-layer
- sufficient conditions
- pattern recognition
- compression algorithm
- compression scheme
- data compression
- backpropagation
- compression ratio
- fuzzy logic
- image compression
- genetic algorithm
- neural network model
- neural nets
- feed-forward neural networks
- artificial neural networks
- Markov chain
- activation function
- synaptic weights
- machine learning
- high quality
- image sequences
- self-organizing maps
- data sets
- fault diagnosis
- image processing
- fuzzy systems
- feed-forward
- rule extraction
- multilayer perceptron
- competitive learning
- lossy compression
- heavy traffic
- application layer
- multiple layers