Hybrid In-memory Computing Architecture for the Training of Deep Neural Networks.
Vinay Joshi, Wangxin He, Jae-sun Seo, Bipin Rajendran. Published in: CoRR (2021)
Keyphrases
- neural network
- training process
- associative memory
- multi layer
- training algorithm
- multilayer neural network
- neural network structure
- network architecture
- feedforward neural networks
- polynomial neural networks
- back propagation
- error back propagation
- fuzzy logic
- auto associative
- hybrid intelligent
- neural network training
- neural nets
- pattern recognition
- training examples
- software architecture
- training set
- artificial neural networks
- real time
- training samples
- deep architectures
- computing power
- feed forward
- neural network model
- feedforward artificial neural networks
- backpropagation algorithm
- memory management
- memory requirements
- recurrent neural networks
- fuzzy systems
- recurrent networks
- processing elements
- design considerations
- learning capabilities
- multilayer perceptron
- fuzzy neural network
- genetic algorithm