Knowledge-Based Hierarchical POMDPs for Task Planning.
Sergio A. Serrano, Elizabeth Santiago, José Martínez-Carranza, Eduardo F. Morales, Luis Enrique Sucar
Published in: CoRR (2021)
Keyphrases
- partially observable Markov decision processes
- stochastic domains
- belief state
- partially observable
- belief space
- predictive state representations
- planning problems
- reinforcement learning
- heuristic search
- expert systems
- planning under uncertainty
- Markov decision problems
- finite state
- sequential decision-making problems
- hierarchical structure
- continuous state
- hierarchical model
- action selection
- planning domains
- Markov decision processes
- state space
- partially observable stochastic games
- partial observability
- planning process
- belief revision
- optimal policy
- multi-agent