Nearly Optimal VC-Dimension and Pseudo-Dimension Bounds for Deep Neural Network Derivatives.
Yahong Yang, Haizhao Yang, Yang Xiang. Published in: CoRR (2023)
Keyphrases
- vc dimension
- vapnik chervonenkis
- neural network
- vapnik chervonenkis dimension
- worst case
- upper bound
- generalization bounds
- lower bound
- covering numbers
- sample size
- distribution free
- sample complexity
- concept classes
- statistical learning theory
- uniform convergence
- inductive inference
- learning machines
- compression scheme
- empirical risk minimization
- function classes
- optimal solution
- euclidean space
- concept class
- learning theory
- upper and lower bounds
- pac learning
- learning algorithm
- large deviations
- active learning
- feature space
- computational complexity