Type 1 and 2 mixtures of Kullback-Leibler divergences as cost functions in dimensionality reduction based on similarity preservation.
John Aldo Lee, Emilie Renard, Guillaume Bernard, Pierre Dupont, Michel Verleysen
Published in: Neurocomputing (2013)
Keyphrases
- kullback-leibler
- distance measure
- dimensionality reduction
- cost function
- kl divergence
- euclidean distance
- kullback-leibler divergence
- similarity measure
- cross entropy
- gaussian mixture
- low dimensional
- high dimensional
- distance function
- distance metric
- principal component analysis
- edit distance
- high dimensional data
- data points
- pattern recognition
- information theoretic
- information theory
- feature selection
- dissimilarity measure
- mixture model
- mahalanobis distance
- objective function
- feature extraction
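
As a pointer to what the title refers to, here is a minimal sketch of the two mixture families of Kullback-Leibler divergences, assuming discrete similarity distributions p (high-dimensional neighbourhoods) and q (their low-dimensional counterparts), as in SNE-type similarity preservation. The function names, the lam/beta weighting parameters, and the exact form of the type 2 (Jensen-Shannon-style) blend are illustrative assumptions, not the paper's definitive formulation.

```python
import numpy as np

def kl(p, q, eps=1e-12):
    """Discrete Kullback-Leibler divergence KL(p || q)."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def type1_mixture(p, q, lam=0.5):
    """Type 1: a linear blend of the two dual KL divergences."""
    return lam * kl(p, q) + (1.0 - lam) * kl(q, p)

def type2_mixture(p, q, beta=0.5):
    """Type 2 (Jensen-Shannon style): KL divergences taken against
    the blended distribution z = beta*p + (1-beta)*q."""
    z = beta * np.asarray(p, dtype=float) + (1.0 - beta) * np.asarray(q, dtype=float)
    return beta * kl(p, z) + (1.0 - beta) * kl(q, z)

# p: high-dimensional neighbour similarities, q: low-dimensional ones
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
print(type1_mixture(p, q, lam=0.7), type2_mixture(p, q, beta=0.7))
```

Setting lam or beta to 0 or 1 recovers one of the two plain KL divergences, which is what makes such mixtures usable as tunable cost functions in similarity-preserving dimensionality reduction.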