Enabling Resistive-RAM-based Activation Functions for Deep Neural Network Acceleration.
Zihan Zhang, Taozhong Li, Ning Guan, Qin Wang, Guanghui He, Weiguang Sheng, Zhigang Mao, Naifeng Jing
Published in: ACM Great Lakes Symposium on VLSI (2020)
Keyphrases
- activation function
- neural network
- feed forward neural networks
- artificial neural networks
- feed forward
- back propagation
- hidden layer
- hidden neurons
- neural architecture
- connection weights
- neural nets
- learning rate
- hidden nodes
- multilayer perceptron
- feedforward neural networks
- network architecture
- neural network model
- radial basis function
- multi layer perceptron
- basis functions
- training phase
- genetic algorithm
- sigmoid function
- multi layer
- fuzzy neural network
- fuzzy logic
- input space
- input output
- small number
- machine learning