Log Likelihood Spectral Distance, Entropy Rate Power, and Mutual Information with Applications to Speech Coding.
Jerry D. Gibson, Preethi Mahadevan. Published in: Entropy (2017)
Keyphrases
- log likelihood
- mutual information
- information theoretic
- linear prediction
- Kullback-Leibler divergence
- information theory
- relative entropy
- linear predictive coding
- code length
- spectral features
- image registration
- exponential family
- feature selection
- speech signal
- distance measure
- similarity measure
- maximum likelihood
- distance function
- information gain
- density estimation
- data mining
- normalized mutual information
- Shannon entropy
- scoring function
- feature space
- search engine