TIME: A Training-in-Memory Architecture for RRAM-Based Deep Neural Networks
Ming Cheng, Lixue Xia, Zhenhua Zhu, Yi Cai, Yuan Xie, Yu Wang, Huazhong Yang
Published in: IEEE Trans. Comput. Aided Des. Integr. Circuits Syst. (2019)
Keyphrases
- neural network
- training process
- associative memory
- training algorithm
- network architecture
- neural network structure
- feedforward neural networks
- feed forward neural networks
- auto associative
- multi layer
- error back propagation
- deep architectures
- memory hierarchy
- multi layer perceptron
- backpropagation algorithm
- fuzzy logic
- pattern recognition
- training samples
- test set
- management system
- training examples
- neural network model
- real time
- computing power
- training phase
- memory usage
- software architecture
- training set
- neural network training
- processing elements
- memory management
- operating system
- back propagation
- processing units
- self organizing maps
- memory size
- memory access
- hardware architecture
- random access
- main memory
- memory requirements
- fuzzy systems