BinaryRelax: A Relaxation Approach For Training Deep Neural Networks With Quantized Weights
Penghang Yin, Shuai Zhang, Jiancheng Lyu, Stanley J. Osher, Yingyong Qi, Jack Xin
Published in: CoRR (2018)
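For context, "quantized weights" in the title refers to constraining a network's connection weights to a small discrete set (in the binary case, two values per layer). A minimal sketch of scaled sign binarization, plus a blended "relaxed" variant, is shown below; the `relaxed_quantize` blend and its `lam` parameter are illustrative assumptions, not the paper's exact BinaryRelax update rule.

```python
import numpy as np

def binarize(w):
    # Scaled sign binarization: map each weight to {-a, +a}, where a is
    # the layer's mean absolute weight (a common scaling choice).
    a = np.abs(w).mean()
    return a * np.sign(w)

def relaxed_quantize(w, lam):
    # Illustrative relaxation (an assumption, not the paper's exact scheme):
    # blend float weights with their binarized projection. lam = 0 keeps the
    # float weights; lam -> infinity recovers hard binarization.
    return (w + lam * binarize(w)) / (1.0 + lam)

w = np.array([0.3, -1.2, 0.05, -0.4])
print(binarize(w))             # hard binarization of the layer
print(relaxed_quantize(w, 1))  # partially relaxed weights
```

Increasing `lam` over the course of training gradually tightens the relaxation toward fully quantized weights.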
Keyphrases
- weighting scheme
- neural network
- training process
- training algorithm
- multi layer perceptron
- feed forward neural networks
- feedforward neural networks
- backpropagation algorithm
- pattern recognition
- artificial neural networks
- test set
- neural network training
- back propagation
- training set
- error back propagation
- training phase
- recurrent neural networks
- deep architectures
- feed forward
- multi layer
- activation function
- recurrent networks
- connection weights
- weighted sum
- probabilistic relaxation
- multilayer neural network
- extreme learning machine
- iterative algorithms
- radial basis function network
- hidden layer
- self organizing maps
- linear combination
- training data
- decision trees
- feature selection