Hybrid In-Memory Computing Architecture for the Training of Deep Neural Networks
Vinay Joshi, Wangxin He, Jae-sun Seo, Bipin Rajendran. Published in: ISCAS (2021)
Keyphrases
- neural network
- training process
- multilayer neural network
- associative memory
- network architecture
- training algorithm
- multi-layer
- back-propagation
- neural network structure
- feed-forward neural networks
- error back-propagation
- feedforward neural networks
- learning capabilities
- polynomial neural networks
- auto-associative
- memory management
- hybrid intelligent
- fuzzy logic
- management system
- genetic algorithm
- backpropagation algorithm
- multi-layer perceptron
- deep architectures
- fuzzy systems
- neural network model
- fuzzy neural network
- hybrid models
- training set
- pattern recognition
- neural network training
- real time
- memory hierarchy
- feedforward artificial neural networks
- recurrent networks
- multithreading
- hardware architecture
- deep learning
- radial basis function network
- design considerations
- activation function
- training phase
- feed-forward
- radial basis function
- fuzzy rules
- training examples
- support vector