A case where a spindly two-layer linear network decisively outperforms any neural network with a fully connected input layer.
Manfred K. Warmuth, Wojciech Kotlowski, Ehsan Amid
Published in: ALT (2021)
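The title contrasts a "spindly" two-layer linear network, in which each input reaches the output through its own private two-edge path, with networks whose first layer is fully connected. A minimal sketch of that architecture (my own construction for illustration, not the authors' code): the effective weight on input $i$ is the product $w_i = u_i v_i$, and plain gradient descent on $(u, v)$ updates $w$ multiplicatively, which is well suited to sparse targets.

```python
import numpy as np

# Hedged sketch: a "spindly" two-layer linear network computes
# y = sum_i u_i * v_i * x_i, so the effective weight is w_i = u_i * v_i.
# Gradient descent on (u, v) scales each update by the current weights,
# giving multiplicative-style dynamics that favor sparse solutions.
rng = np.random.default_rng(0)
n, d = 200, 50
X = rng.standard_normal((n, d))
w_star = np.zeros(d)
w_star[0] = 1.0                 # sparse target: a single relevant input
y = X @ w_star

def train_spindly(X, y, steps=5000, lr=0.05):
    d = X.shape[1]
    u = np.full(d, 0.1)         # small positive initialization
    v = np.full(d, 0.1)
    for _ in range(steps):
        r = X @ (u * v) - y     # residual of the effective linear map
        g = X.T @ r / len(y)    # gradient w.r.t. the effective weights w
        u_new = u - lr * g * v  # chain rule through w = u * v
        v_new = v - lr * g * u
        u, v = u_new, v_new
    return u * v

w_hat = train_spindly(X, y)
print(float(np.mean((X @ w_hat - y) ** 2)))  # training loss, near zero
```

The factored parameterization is only a sketch of the idea; the paper's point is a sample-complexity separation between such sparse architectures and any network with a dense first layer.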
Keyphrases
- fully connected
- output layer
- neural network
- hidden layer
- single layer
- activation function
- multi-layer
- feed-forward
- scale-free
- input-output
- backpropagation
- artificial neural networks
- network model
- RBF neural network
- multi-layer perceptron
- conditional random fields
- synaptic weights
- learning rate
- input data
- Bayesian networks