Training Neural Networks With In-Memory-Computing Hardware and Multi-Level Radix-4 Inputs.
Christopher Grimm, Jinseok Lee, Naveen Verma
Published in: IEEE Trans. Circuits Syst. I Regul. Pap. (2024)
Keyphrases
- neural network
- training process
- computing power
- training algorithm
- auto associative
- multi layer
- multi layer perceptron
- feedforward neural networks
- low cost
- genetic algorithm
- neural network training
- computational power
- parallel hardware
- backpropagation algorithm
- real time
- memory requirements
- neural network model
- internal memory
- pattern recognition
- neural network structure
- virtual memory
- fault diagnosis
- back propagation
- recurrent networks
- floating point
- artificial neural networks
- main memory
- recurrent neural networks
- memory management
- training set
- associative memory
- fuzzy logic
- activation function
- training phase
- hardware and software
- neural nets
- random access
- hardware architecture
- computer systems
- training data
- fixed point
- learning algorithm
- fourier transform