HSB-GDM: a Hybrid Stochastic-Binary Circuit for Gradient Descent with Momentum in the Training of Neural Networks.
Han Li, Heng Shi, Honglan Jiang, Siting Liu
Published in: NANOARCH (2022)
Keyphrases
- neural network
- training algorithm
- training process
- multi layer perceptron
- feedforward neural networks
- back propagation
- artificial neural networks
- neural network training
- feed forward neural networks
- learning rate
- multilayer neural network
- pattern recognition
- cost function
- loss function
- high speed
- training samples
- training examples
- hybrid intelligent
- training phase
- backpropagation algorithm
- electronic circuits
- feed forward
- fault diagnosis
- objective function
- fuzzy logic
- monte carlo
- training set
- supervised learning
- error function
- hopfield neural network
- analog circuits
- hybrid models
- decision making
- multi layer
- support vector machine
- recurrent networks
- image quality
- online learning
- hidden layer
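The paper's circuit implements gradient descent with momentum (GDM) in hybrid stochastic-binary hardware. As a point of reference, a minimal software sketch of the standard GDM update rule (not the paper's circuit; the function and parameter names here are illustrative assumptions):

```python
def gdm_step(theta, velocity, grad, lr=0.1, momentum=0.9):
    """One gradient-descent-with-momentum update.

    The velocity term accumulates an exponentially decaying
    average of past gradients, which damps oscillations and
    speeds convergence along consistent descent directions.
    """
    velocity = momentum * velocity - lr * grad
    theta = theta + velocity
    return theta, velocity

# Toy demonstration: minimize f(theta) = theta**2, whose gradient is 2*theta.
theta, v = 5.0, 0.0
for _ in range(200):
    theta, v = gdm_step(theta, v, 2.0 * theta)
# theta converges toward the minimum at 0.
```

The hardware contribution of the paper lies in realizing this update with stochastic (bitstream) arithmetic combined with binary logic, trading exactness for circuit area and power; the update rule itself is unchanged.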