Direct Quantization for Training Highly Accurate Low Bit-width Deep Neural Networks
Tuan Hoang, Thanh-Toan Do, Tam V. Nguyen, Ngai-Man Cheung. Published in: IJCAI (2020)
Keyphrases
- highly accurate
- neural network
- training process
- training algorithm
- feedforward neural networks
- high accuracy
- high quality
- feed forward neural networks
- neural network training
- capable of producing
- uniform quantization
- genetic algorithm
- recurrent networks
- accurate models
- multi layer perceptron
- backpropagation algorithm
- recurrent neural networks
- training patterns
- deep architectures
- training phase
- neural network model
- back propagation
- deep learning
- pattern recognition
- successive approximation
- adaptive quantization
- radial basis function network
- self organizing maps
- online learning
- fuzzy logic
- artificial neural networks
- data sets
- neural nets
- feed forward
- neural network structure
- active learning
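Among the keyphrases above, "uniform quantization" names the basic operation behind low bit-width networks: mapping full-precision values onto a small, evenly spaced set of levels. The sketch below is a minimal, generic illustration of uniform quantization on values clipped to [-1, 1]; it is an assumption-laden example, not the specific method proposed in the paper.

```python
import numpy as np

def uniform_quantize(x, num_bits=2):
    """Uniformly quantize values in [-1, 1] onto 2**num_bits evenly spaced levels.

    A generic illustration of uniform quantization; the function name and
    the [-1, 1] clipping range are illustrative choices, not from the paper.
    """
    levels = 2 ** num_bits - 1
    x = np.clip(x, -1.0, 1.0)
    # Map [-1, 1] -> [0, 1], snap to the nearest of the discrete levels, map back
    q = np.round((x + 1.0) / 2.0 * levels) / levels
    return q * 2.0 - 1.0

w = np.array([-0.9, -0.2, 0.1, 0.7])
print(uniform_quantize(w, num_bits=2))  # four 2-bit levels: -1, -1/3, 1/3, 1
```

With 2 bits there are only four representable values, which is why training highly accurate networks at such bit-widths is nontrivial.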