A theory of high dimensional regression with arbitrary correlations between input features and target functions: sample complexity, multiple descent curves and a hierarchy of phase transitions.
Gabriel Mel, Surya Ganguli. Published in: ICML (2021)
Keyphrases
- sample complexity
- input features
- phase transition
- high dimensional
- feature values
- vc dimension
- learning problems
- learning algorithm
- machine learning algorithms
- active learning
- theoretical analysis
- supervised learning
- special case
- upper bound
- training examples
- generalization error
- classification algorithm
- dimensionality reduction
- low dimensional
- lower bound
- genetic programming
- model selection
- data sets
- nearest neighbor
- high dimensional data
- data points
- sample size
- training set
- feature space
- support vector
- optimal solution
- training data
- feature extraction
- machine learning