A Unifying Mutual Information View of Metric Learning: Cross-Entropy vs. Pairwise Losses
Malik Boudiaf, Jérôme Rony, Imtiaz Masud Ziko, Eric Granger, Marco Pedersoli, Pablo Piantanida, Ismail Ben Ayed
Published in: ECCV (6) (2020)
Keyphrases
- data mining
- metric learning
- cross entropy
- pairwise
- mutual information
- similarity measure
- information theoretic
- log likelihood
- semi supervised
- maximum likelihood
- image registration
- distance metric
- multi class
- learning tasks
- dimensionality reduction
- language modeling
- information gain
- loss function
- markov random field
- error function
- feature space
- pairwise constraints
- feature selection
- multi task
- text mining
- distance function
- pattern recognition
- search engine
- computer vision
- neural network
- evaluation metrics
- learning problems
- euclidean distance
- unsupervised learning
- feature vectors
- feature extraction
- clustering algorithm