SparseNN: An energy-efficient neural network accelerator exploiting input and output sparsity.
Jingyang Zhu, Jingbo Jiang, Xizi Chen, Chi-Ying Tsui. Published in: DATE (2018)
Keyphrases
- desired output
- neural network
- hidden units
- input data
- input variables
- artificial neural networks
- multiple output
- rbf network
- wireless sensor networks
- bayes rule
- back propagation
- energy efficient
- neural network model
- input pattern
- pattern recognition
- control signals
- parallel implementation
- network architecture
- connection weights
- multiple input
- hidden layer
- number of hidden units
- nonlinear functions
- output layer
- training algorithm
- sensor networks
- high dimensional
- neural network is trained
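
The core idea named in the title, exploiting input and output sparsity, generally means skipping multiply-accumulate work for zero-valued input activations and for output activations that a ReLU will zero out anyway. As a rough, hedged illustration only (this is not the paper's hardware design, and the function name and data are assumptions), a minimal Python sketch of a fully connected layer that skips zero inputs might look like this:

```python
import numpy as np

def sparse_fc_layer(x, W, b):
    """Fully connected layer that skips zero-valued inputs (input sparsity).

    Illustrative sketch only: an accelerator performs this skipping in
    hardware, per processing element, not with NumPy loops.
    """
    y = b.copy()
    # Accumulate contributions only from nonzero inputs; zero inputs cost nothing.
    for i in np.flatnonzero(x):
        y += x[i] * W[i]           # W has shape (in_features, out_features)
    # ReLU: any output that lands at zero represents work that output-sparsity
    # techniques would try to predict and skip up front.
    return np.maximum(y, 0.0)

# Hypothetical usage with random, ReLU-style sparse input.
rng = np.random.default_rng(0)
x = np.maximum(rng.standard_normal(64), 0.0)
W = rng.standard_normal((64, 32))
b = np.zeros(32)
print(sparse_fc_layer(x, W, b).shape)   # (32,)
```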