Towards Fast and Energy-Efficient Binarized Neural Network Inference on FPGA.
Cheng Fu, Shilin Zhu, Hao Su, Ching-En Lee, Jishen Zhao
Published in: FPGA (2019)
Keyphrases
- energy efficient
- neural network
- wireless sensor networks
- energy consumption
- sensor networks
- base station
- data dissemination
- multi core architecture
- multi hop
- routing protocol
- high speed
- data aggregation
- hardware implementation
- energy efficiency
- routing algorithm
- low overhead
- field programmable gate array
- real time
- data processing
- data transmission
- low cost
- data gathering
- data sets
- hardware architecture