Training Deep Neural Networks with Joint Quantization and Pruning of Weights and Activations.
Xinyu Zhang, Ian Colbert, Kenneth Kreutz-Delgado, Srinjoy Das
Published in: CoRR (2021)
Keyphrases
- neural network
- training process
- training algorithm
- feed forward neural networks
- neural network training
- feedforward neural networks
- back propagation
- artificial neural networks
- avoid overfitting
- backpropagation algorithm
- training set
- neural network structure
- genetic algorithm
- weighted sum
- error back propagation
- multi layer
- multilayer perceptron
- feed forward
- deep architectures
- training data
- hidden neurons
- activation function
- training phase
- recurrent neural networks
- linearly combined
- training examples
- feedforward artificial neural networks
- fuzzy logic
- computational complexity
- pattern recognition
- number of hidden units
- connection weights
- pruning method
- radial basis function network
- network architecture
- neural network model
- training samples
- supervised learning
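
This index entry gives only the title and keyphrases, with no details of the method. As a purely illustrative sketch of the technique named in the title (joint quantization and pruning of weights and activations during training), the Python fragment below combines magnitude pruning, uniform symmetric fake quantization, and a straight-through gradient estimator. All names and design choices here are assumptions for illustration, not the authors' algorithm.

```python
import numpy as np

def quantize(x, num_bits=8):
    """Uniform symmetric fake quantization: snap values to a signed
    integer grid but keep them in floating point. (Illustrative choice,
    not necessarily the paper's quantizer.)"""
    qmax = 2 ** (num_bits - 1) - 1
    scale = np.max(np.abs(x)) / qmax + 1e-12
    return np.clip(np.round(x / scale), -qmax, qmax) * scale

def prune_mask(w, sparsity=0.5):
    """Magnitude pruning: zero out the smallest-magnitude fraction of weights."""
    threshold = np.quantile(np.abs(w), sparsity)
    return (np.abs(w) > threshold).astype(w.dtype)

# One training step for a single linear layer y = x @ W with
# quantized-and-pruned weights and quantized activations (MSE loss).
rng = np.random.default_rng(0)
W = rng.normal(size=(16, 4))
x = rng.normal(size=(8, 16))
target = rng.normal(size=(8, 4))
lr = 1e-2

mask = prune_mask(W, sparsity=0.5)    # joint step 1: prune the weights
W_q = quantize(W * mask)              # joint step 2: quantize the survivors
x_q = quantize(x)                     # quantize the activations
y = x_q @ W_q
grad_y = 2.0 * (y - target) / y.size  # d(MSE)/dy
# Straight-through estimator: treat the quantizer as the identity in the
# backward pass; the pruning mask gates the weight update so pruned
# connections stay at zero across updates.
grad_W = (x_q.T @ grad_y) * mask
W -= lr * grad_W
```

The point this sketch makes is the interaction of the two compressions: the pruning mask is applied before quantization and also gates the weight gradient, so sparsity and the quantization grid are maintained jointly throughout training rather than applied once after it.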