Active Exploration via Experiment Design in Markov Chains
Mojmir Mutny
Tadeusz Janik
Andreas Krause
Published in: AISTATS (2023)
Keyphrases
markov chain
steady state
case study
state space
random walk
active exploration
markov model
markov process
markov processes
transition probabilities
stochastic process
machine learning
support vector
design process
stationary distribution