Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence
Yung-Kyun Noh, Masashi Sugiyama, Song Liu, Marthinus Christoffel du Plessis, Frank Chongwoo Park, Daniel D. Lee
Published in: AISTATS (2014)
Keyphrases
- metric learning
- Kullback-Leibler divergence
- nearest neighbor
- distance function
- probability density function
- distance metric
- distance measure
- k-NN
- information theoretic
- mutual information
- pairwise
- dimensionality reduction
- high dimensional
- data points
- Mahalanobis distance
- learning tasks
- semi-supervised
- density estimation
- k-nearest neighbor
- information theory
- multi-task
- high dimensional data
- feature space
- probability density
- training set
- diffusion tensor
- machine learning
- Euclidean distance
- neural network
- data sets
- low dimensional
- marginal distributions
- feature selection
- similarity measure
- learning algorithm
- image processing
- training data
- unlabeled data
- Bayesian networks
- supervised learning
- probability distribution
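For context, the quantity the paper studies can be illustrated with the classic k-nearest-neighbor estimator of Kullback-Leibler divergence (the standard Wang-Kulkarni-Verdú form). This is a minimal sketch of that baseline estimator only, not the bias-reduction or metric-learning method of the paper itself; the function name `knn_kl_divergence` is our own, and the implementation assumes i.i.d. samples `x ~ p` and `y ~ q` with no duplicate points.

```python
import numpy as np
from scipy.spatial import cKDTree


def knn_kl_divergence(x, y, k=1):
    """k-NN estimate of D_KL(p || q) from samples x ~ p and y ~ q.

    Baseline estimator (Wang, Kulkarni, Verdu, 2009):
        D_hat = (d / n) * sum_i log(nu_k(x_i) / rho_k(x_i)) + log(m / (n - 1))
    where rho_k(x_i) is the distance from x_i to its k-th nearest neighbor
    within x (excluding x_i itself), and nu_k(x_i) is the distance from x_i
    to its k-th nearest neighbor within y. Assumes no duplicated points,
    so all distances are strictly positive.
    """
    x = np.atleast_2d(np.asarray(x, dtype=float))
    y = np.atleast_2d(np.asarray(y, dtype=float))
    n, d = x.shape
    m = y.shape[0]

    # Query k+1 neighbors within x: the nearest neighbor of x_i in x is
    # x_i itself (distance 0), so the k-th genuine neighbor is column -1.
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]

    # k-th nearest-neighbor distance from each x_i to the sample y.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]

    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))
```

The metric-learning angle of the paper corresponds, in this sketch, to replacing the Euclidean neighbor search with a Mahalanobis distance, which is equivalent to applying a linear transform to both samples before building the KD-trees.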