RPR: Random Partition Relaxation for Training Binary and Ternary Weight Neural Networks
Lukas Cavigelli, Luca Benini · Published in: CoRR (2020)
Keyphrases
- neural network
- training process
- training algorithm
- feedforward neural networks
- pattern recognition
- backpropagation algorithm
- multi layer perceptron
- weight update
- training patterns
- test set
- neural network training
- neural network structure
- training set
- training phase
- artificial neural networks
- binary tree
- fuzzy logic
- feed forward
- iterative algorithms
- radial basis function network
- error back propagation
- hamming distance
- training data
- convex relaxation
- weight vector
- non binary
- weighting scheme
- fuzzy systems
- multi layer
- recurrent neural networks
- online learning