A Joint Model Provisioning and Request Dispatch Solution for Low-Latency Inference Services on Edge.

Anish Prasad, Carl Mofjeld, Yang Peng
Published in: Sensors (2021)
Keyphrases
  • low latency
  • real time
  • data sets
  • context aware