BinaryRelax: A Relaxation Approach for Training Deep Neural Networks with Quantized Weights
Penghang Yin, Shuai Zhang, Jiancheng Lyu, Stanley J. Osher, Yingyong Qi, Jack Xin. Published in: SIAM J. Imaging Sci. (2018)
Keyphrases
- neural network
- training process
- training algorithm
- feedforward neural networks
- multi layer perceptron
- neural network training
- backpropagation algorithm
- artificial neural networks
- back propagation
- pattern recognition
- feed forward neural networks
- feedforward artificial neural networks
- hidden neurons
- fuzzy logic
- error back propagation
- test set
- deep architectures
- probabilistic relaxation
- multilayer perceptron
- linearly combined
- supervised learning
- linear combination
- neural network model
- deep learning
- recurrent networks
- connection weights
- objective function
- training set
- radial basis function network
- data sets
- feed forward
- multi layer
- synaptic weights
- network architecture
- weight update
- neural network structure
- training phase
- online learning
- activation function
- nearest neighbour
- training samples
- semidefinite
- iterative algorithms
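The title refers to training with quantized (e.g., binary) weights via a relaxation of the hard quantization constraint. As a rough illustration of that idea, the sketch below blends full-precision weights with their binary projection, with the blend weight `lam` intended to grow during training so the weights are driven toward the quantized set. This is a generic hedged sketch, not the paper's exact algorithm; the sign-with-mean-magnitude scaling is a common binarization choice, and the function names are mine.

```python
import numpy as np

def binarize(w):
    # Project onto {-alpha, +alpha}: sign pattern scaled by the
    # mean absolute value of the tensor (a common closed-form scale).
    alpha = np.abs(w).mean()
    return alpha * np.sign(w)

def relaxed_quantize(w, lam):
    # Relaxed projection: a weighted average of the float weights and
    # their binary projection. lam = 0 leaves w unchanged; as lam grows,
    # the result approaches the hard binary projection.
    return (w + lam * binarize(w)) / (1.0 + lam)
```

In a training loop, one would apply `relaxed_quantize` to the weights after each gradient step while increasing `lam` on a schedule, eventually switching to the hard projection `binarize` once the relaxation is tight.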