Throughput Maximization of Delay-Aware DNN Inference in Edge Computing by Exploring DNN Model Partitioning and Inference Parallelism.
Jing Li
Weifa Liang
Yuchen Li
Zichuan Xu
Xiaohua Jia
Song Guo
Published in: IEEE Trans. Mob. Comput. (2023)
Keyphrases
objective function
bayesian inference
probabilistic model
bayesian networks
bayesian model
computational model
management system
statistical model
inference engine
artificial neural networks
em algorithm
mathematical model
decision theoretic
dynamic bayesian networks
inference process