Hawkes Process Multi-armed Bandits for Search and Rescue.

Wen-Hao Chiang, George O. Mohler
Published in: ICMLA (2022)
Keyphrases
  • search and rescue
  • multi-armed bandits
  • dynamic programming
  • machine learning
  • multi-agent
  • mixture model
  • search and rescue operations