Minimizing Latency for Multi-DNN Inference on Resource-Limited CPU-Only Edge Devices
Tao Wang, Tuo Shi, Xiulong Liu, Jianping Wang, Bin Liu, Yingshu Li, Yechao She. Published in: INFOCOM (2024)
Keyphrases
- resource limited
- embedded systems
- edge devices
- data transfer
- power management
- low cost
- low latency
- real time
- mobile devices
- inference process
- heterogeneous computing
- processing power
- general purpose
- case study