Depth Degeneracy in Neural Networks: Vanishing Angles in Fully Connected ReLU Networks on Initialization.
Cameron Jakub, Mihai Nica. Published in: CoRR (2023)
Keyphrases
- fully connected
- neural network
- activation function
- scale free
- pattern recognition
- artificial neural networks
- feed forward
- network model
- multilayer perceptron
- back propagation
- recurrent neural networks
- neural nets
- computer vision
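The phenomenon the title refers to can be observed directly: in a deep fully connected ReLU network at random initialization, the angle between the hidden representations of two inputs shrinks toward zero as depth grows. The following is an illustrative sketch of that effect, not code from the paper; the width, depth, and He-style initialization scale are assumptions chosen for demonstration.

```python
import numpy as np

# Illustrative sketch (not the paper's code): track the angle between the
# hidden representations of two inputs as they pass through a randomly
# initialized fully connected ReLU network.

rng = np.random.default_rng(0)
width = 500   # assumed layer width
depth = 50    # assumed number of hidden layers

def angle(u, v):
    """Angle in radians between two vectors."""
    cos = u @ v / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos, -1.0, 1.0))

x = rng.standard_normal(width)
y = rng.standard_normal(width)  # nearly orthogonal to x in high dimension

angles = [angle(x, y)]
for _ in range(depth):
    # He initialization (variance 2/width) keeps ReLU activation norms stable
    W = rng.standard_normal((width, width)) * np.sqrt(2.0 / width)
    x = np.maximum(W @ x, 0.0)
    y = np.maximum(W @ y, 0.0)
    angles.append(angle(x, y))

print(f"input angle: {angles[0]:.3f} rad, after {depth} layers: {angles[-1]:.3f} rad")
```

Running the sketch shows the angle collapsing from roughly pi/2 at the input toward zero deep in the network, which is the depth degeneracy the paper analyzes.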