Dimensionality reduction via the Johnson-Lindenstrauss Lemma: theoretical and empirical bounds on embedding dimension.
John Fedoruk, Byron Schmuland, Julia Ann Johnson, Giseon Heo
Published in: J. Supercomput. (2018)
Keyphrases
- johnson-lindenstrauss
- dimensionality reduction
- high dimensional
- nonlinear dimensionality reduction
- fat-shattering
- graph embedding
- structure preserving
- theoretical analysis
- low dimensional
- convex combinations
- embedding space
- high dimensional data
- locality preserving projections
- principal component analysis
- multidimensional scaling
- manifold learning
- low dimensional spaces
- neighborhood preserving
- dimensional data
- upper bound
- empirical risk minimization
- vapnik-chervonenkis dimension
- confidence bounds
- high dimensionality
- covering numbers
- lower bound
- rademacher complexity
- upper and lower bounds
- data representation
- subspace learning
- dimensionality reduction methods
- finite sample
- locally linear embedding
- vector space
- vc dimension
- data points
- variance reduction
- theoretical considerations
- statistical learning theory
- pattern recognition and machine learning
- lower and upper bounds
- lower dimensional
- principal components
- error bounds
- linear discriminant analysis
- feature space
- pattern recognition
- feature extraction
- computer vision
- metric learning
- euclidean space
- input space
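The paper's subject, the Johnson-Lindenstrauss embedding-dimension bound, can be sketched with a short example. The classical bound states that projecting n points into k ≥ 4 ln(n) / (ε²/2 − ε³/3) dimensions via a scaled Gaussian random matrix preserves all pairwise distances within a factor of (1 ± ε). The following is a minimal illustrative sketch, not the authors' implementation; the function name `jl_min_dim` and the specific parameters are assumptions chosen for demonstration.

```python
import numpy as np

def jl_min_dim(n_samples: int, eps: float) -> int:
    # Classical JL bound: k >= 4 ln(n) / (eps^2/2 - eps^3/3) suffices for a
    # random projection to preserve pairwise distances within (1 +/- eps).
    return int(np.ceil(4 * np.log(n_samples) / (eps**2 / 2 - eps**3 / 3)))

rng = np.random.default_rng(0)
n, d, eps = 500, 10_000, 0.25
k = jl_min_dim(n, eps)  # target embedding dimension from the bound

# n points in a high-dimensional Euclidean space
X = rng.standard_normal((n, d))

# Gaussian random projection, scaled by 1/sqrt(k) so that squared
# norms are preserved in expectation
R = rng.standard_normal((d, k)) / np.sqrt(k)
Y = X @ R

# Empirical distortion for one pair of points: the ratio of the projected
# distance to the original distance should be close to 1.
orig = np.linalg.norm(X[0] - X[1])
proj = np.linalg.norm(Y[0] - Y[1])
ratio = proj / orig
```

For n = 500 and ε = 0.25 the bound already asks for k in the hundreds, independent of the original dimension d; this gap between the theoretical requirement and what suffices in practice is the kind of theoretical-versus-empirical comparison the paper's title refers to.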