Demand Layering for Real-Time DNN Inference with Minimized Memory Usage.

Mingoo Ji, Saehanseul Yi, Changjin Koo, Sol Ahn, Dongjoo Seo, Nikil D. Dutt, Jong-Chan Kim
Published in: RTSS (2022)
Keyphrases
  • memory usage
  • real time
  • memory requirements
  • memory footprint
  • bayesian networks
  • high speed
  • vision system
  • real time systems
  • low cost
  • probabilistic inference
  • lead time
  • bayesian model
  • inference mechanism