Kullback-Leibler Divergence and Akaike Information Criterion in General Hidden Markov Models.
Cheng-Der Fuh
Chu-Lan Michael Kao
Tianxiao Pang
Published in: IEEE Trans. Inf. Theory (2024)
Keyphrases
hidden Markov models
Kullback-Leibler divergence
mutual information
conditional random fields
information-theoretic
Akaike information criterion
machine learning
information theory
support vector
probability density function
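For orientation, the two quantities in the paper's title can be illustrated in a few lines of Python. The sketch below is a generic textbook computation, not the paper's method for general hidden Markov models: `kl_divergence` evaluates the Kullback-Leibler divergence between two discrete distributions, and `aic` applies the standard Akaike Information Criterion formula to an assumed log-likelihood and parameter count.

```python
import numpy as np

def kl_divergence(p, q):
    """D(p || q) = sum_x p(x) * log(p(x) / q(x)) for discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p(x) = 0 contribute nothing to the sum
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def aic(log_likelihood, num_params):
    """AIC = 2k - 2 log L; lower values indicate a better fit/complexity trade-off."""
    return 2.0 * num_params - 2.0 * log_likelihood

# Hypothetical example values, purely for illustration.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))                        # ~0.0253
print(aic(log_likelihood=-123.4, num_params=5))   # 256.8
```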