Distributed Inference Acceleration with Adaptive DNN Partitioning and Offloading
Thaha Mohammed, Carlee Joe-Wong, Rohit Babbar, Mario Di Francesco. Published in: INFOCOM (2020)
Keyphrases
- distributed systems
- load balance
- distributed environment
- inference engine
- cooperative
- bayesian inference
- inference process
- data sets
- bayesian networks
- distributed architecture
- probabilistic inference
- distributed data
- distributed network
- master slave
- bayesian model
- agent technology
- training process
- computing environments
- multi agent
- genetic algorithm
- information retrieval