b-tanh: Low Hardware Complexity Activation Functions for LSTM.
Yuan Zhang, Lele Peng, Lianghua Quan, Shubin Zheng, Qiufeng Feng, Yonggang Zhang, Hui Chen
Published in: ISOCC (2022)
Keyphrases
- activation function
- network size
- neural architecture
- feed forward
- neural network
- feed forward neural networks
- hidden layer
- radial basis function
- neural nets
- learning rate
- real time
- hardware implementation
- recurrent neural networks
- evolutionary algorithm
- network architecture
- support vector
- artificial intelligence
- learning algorithm
- hidden nodes
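The paper's own b-tanh formulation is not given on this page, so it is not reproduced here. As a hedged illustration of the general idea the title points at (replacing the exponential-based tanh with something cheap to build in hardware), the sketch below shows a standard piecewise-linear tanh approximation, sometimes called hard-tanh. It needs only comparisons and pass-through wiring, no exponentials or dividers, which is why this family of approximations is popular for LSTM accelerators.

```python
import math

def hard_tanh(x: float) -> float:
    """Piecewise-linear stand-in for tanh (a generic low-complexity
    approximation, NOT the paper's b-tanh): saturate to +/-1 outside
    [-1, 1], identity inside. In hardware this reduces to two
    comparators and a multiplexer."""
    if x >= 1.0:
        return 1.0
    if x <= -1.0:
        return -1.0
    return x

# Worst-case error of this crude approximation against the true tanh,
# sampled on a grid; finer piecewise segmentations trade a few more
# comparators for lower error.
grid = [i / 100.0 for i in range(-300, 301)]
max_err = max(abs(hard_tanh(x) - math.tanh(x)) for x in grid)
```

The single-segment version above is the cheapest member of the family; multi-segment variants subdivide [-1, 1] into a few linear pieces with power-of-two slopes so each piece costs only shifts and adds.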