Hawkes Process Multi-armed Bandits for Search and Rescue.
Wen-Hao Chiang
George O. Mohler
Published in: ICMLA (2022)
Keyphrases
search and rescue
multi-armed bandits
dynamic programming
machine learning
multi-agent
mixture model
search and rescue operations