Training Fully Connected Neural Networks is ∃R-Complete.
Daniel Bertschinger, Christoph Hertrich, Paul Jungeblut, Tillmann Miltzow, Simon Weber. Published in: NeurIPS (2023)
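As context for the complexity class in the title (a sketch added here, not taken from this listing): ∃R is the class of decision problems polynomial-time reducible to the existential theory of the reals (ETR), and it satisfies NP ⊆ ∃R ⊆ PSPACE. A generic ETR instance has the form below; the exact formalization of the training decision problem used in the paper (choice of activation function, error measure, etc.) is not reproduced in this listing and is only hedged in the comments.

```latex
% A generic instance of the existential theory of the reals (ETR),
% the canonical \exists\mathbb{R}-complete problem: decide whether
% real numbers x_1, ..., x_n exist satisfying a quantifier-free
% formula \Phi built from polynomial equations and inequalities
% with integer coefficients.
\exists x_1 \in \mathbb{R} \;\dots\; \exists x_n \in \mathbb{R} :\;
  \Phi(x_1, \dots, x_n)
% Example atom: x_1^2 + x_2^2 - 1 = 0, combined with \land, \lor, \lnot.
% \exists\mathbb{R}-completeness of training then means (assumed
% formalization; see the paper for the precise setup) that deciding
% "do weights exist achieving at most a given error on the training
% set?" is polynomial-time interreducible with ETR.
```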
Keyphrases
- fully connected
- neural network
- training process
- activation function
- feedforward neural networks
- multi-layer perceptron
- artificial neural networks
- training phase
- scale-free
- conditional random fields
- supervised learning
- training set
- pattern recognition
- backpropagation algorithm
- back propagation
- training examples
- radial basis function
- feed forward
- training data