Nearly Optimal VC-Dimension and Pseudo-Dimension Bounds for Deep Neural Network Derivatives.
Yahong Yang, Haizhao Yang, Yang Xiang
Published in: NeurIPS (2023)
Keyphrases
- VC dimension
- Vapnik-Chervonenkis
- neural network
- Vapnik-Chervonenkis dimension
- worst case
- upper bound
- generalization bounds
- lower bound
- sample complexity
- covering numbers
- statistical learning theory
- concept classes
- sample size
- empirical risk minimization
- distribution free
- inductive inference
- learning machines
- uniform convergence
- optimal solution
- PAC learning
- compression scheme
- Euclidean space
- function classes
- concept class
- data dependent
- back propagation
- theoretical analysis
- special case
- generalization ability
- large deviations
- small number
- NP-hard