Training Neural Networks is NP-Hard in Fixed Dimension.
Vincent Froese, Christoph Hertrich. Published in: CoRR (2023)
Keyphrases
- neural network
- NP-hard
- training algorithm
- training process
- pattern recognition
- feedforward neural networks
- backpropagation algorithm
- optimal solution
- neural network training
- scheduling problem
- linear programming
- integer programming
- neural network model
- fixed number
- artificial neural networks
- lower bound
- computational complexity
- minimum cost
- approximation algorithms
- online learning
- worst case
- fuzzy logic
- data sets
- network architecture
- multilayer perceptron
- test set
- training phase
- activation function
- greedy heuristic
- NP-hardness
- training patterns
- recurrent networks
- genetic algorithm
- machine learning