A case where a spindly two-layer linear network whips any neural network with a fully connected input layer.
Manfred K. Warmuth, Wojciech Kotlowski, Ehsan Amid. Published in: CoRR (2020)
Keyphrases
- fully connected
- output layer
- neural network
- single layer
- hidden layer
- multi layer
- activation function
- feed forward
- artificial neural networks
- scale free
- network model
- rbf network
- rbf neural network
- radial basis function
- multi layer perceptron
- back propagation
- network architecture
- recurrent neural networks
- input output
- genetic algorithm
- multilayer perceptron
- high throughput
- fuzzy neural network
- conditional random fields
- markov random field
- semi supervised
- bayesian networks
- information retrieval