PIE: A Pipeline Energy-Efficient Accelerator for Inference Process in Deep Neural Networks.
Yangyang Zhao, Qi Yu, Xuda Zhou, Xuehai Zhou, Xi Li, Chao Wang
Published in: ICPADS (2016)
Keyphrases
- energy efficient
- inference process
- neural network
- wireless sensor networks
- energy consumption
- sensor networks
- data dissemination
- bayesian inference
- base station
- energy efficiency
- multi hop
- multi core architecture
- parallel implementation
- data gathering
- data aggregation
- low overhead
- data transmission
- multi layer
- routing protocol
- routing algorithm
- field programmable gate array
- data collection