Information Theoretic Sample Complexity Lower Bound for Feed-Forward Fully-Connected Deep Networks.
Xiaochen Yang, Jean Honorio
Published in: CoRR (2020)
Keyphrases
- information theoretic
- fully connected
- sample complexity
- feed forward
- lower bound
- activation function
- upper bound
- mutual information
- back propagation
- neural nets
- neural network
- artificial neural networks
- hidden layer
- theoretical analysis
- vc dimension
- generalization error
- pac learning
- conditional random fields
- learning problems
- special case
- scale free
- recurrent neural networks
- learning algorithm
- active learning
- supervised learning
- optimal solution
- training examples
- worst case
- sample size
- social networks
- data mining
- image processing
- feature selection
- learning rate
- model selection
- higher order
- information extraction
- objective function
- similarity measure