Direct Quantization for Training Highly Accurate Low Bit-width Deep Neural Networks
Tuan Hoang, Thanh-Toan Do, Tam V. Nguyen, Ngai-Man Cheung. Published in: CoRR (2020)
Keyphrases
- highly accurate
- neural network
- training process
- training algorithm
- feed forward neural networks
- high quality
- feedforward neural networks
- multi layer perceptron
- backpropagation algorithm
- back propagation
- capable of producing
- uniform quantization
- accurate models
- artificial neural networks
- neural network training
- training patterns
- high accuracy
- fuzzy logic
- deep architectures
- error back propagation
- pattern recognition
- feed forward
- supervised learning
- recurrent networks
- multi layer
- neural network structure
- adaptive quantization
- successive approximation
- training phase
- recurrent neural networks
- self organizing maps
- training examples
- training set
- training data
- quantization error
- neural nets
- radial basis function
- fault diagnosis
- test set