Training Graph Neural Networks Subject to a Tight Lipschitz Constraint.
Simona Ioana Juvina, Ana Antonia Neacsu, Jérôme Rony, Jean-Christophe Pesquet, Corneliu Burileanu, Ismail Ben Ayed
Published in: Trans. Mach. Learn. Res. (2024)
Keyphrases
- neural network
- training process
- training algorithm
- feedforward neural networks
- lower bound
- multi layer perceptron
- neural network training
- graph representation
- pattern recognition
- directed graph
- upper bound
- graph theoretic
- graph structure
- graph theory
- structured data
- artificial neural networks
- test set
- fuzzy logic
- radial basis function network
- neural network structure
- backpropagation algorithm
- multi layer
- recurrent networks
- weighted graph
- back propagation
- neural nets
- recurrent neural networks
- feed forward neural networks
- training phase
- graph matching
- global consistency
- pointwise
- linear constraints
- error back propagation
- graph partitioning
- graph model
- multilayer perceptron
- connected components
- self organizing maps
- training examples
- supervised learning
- training data
- learning algorithm
- genetic algorithm
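
To illustrate the general idea behind the paper's title (this is not the authors' algorithm, just a minimal sketch of Lipschitz-constrained training), the snippet below trains a single graph-convolution layer in PyTorch and, after each optimizer step, projects the weight matrix onto a spectral-norm ball of radius `theta`. With the symmetrically normalized adjacency (spectral norm at most 1) and a 1-Lipschitz activation, the layer is then `theta`-Lipschitz in its input features. The graph, features, targets, and the bound `theta` are made-up placeholders.

```python
# Minimal, hypothetical sketch of Lipschitz-constrained GNN training
# (not the method from the paper). Assumes PyTorch.
import torch
import torch.nn as nn

class GraphConv(nn.Module):
    """y = relu(A_hat @ x @ W); ReLU is 1-Lipschitz."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(in_dim, out_dim))
        nn.init.xavier_uniform_(self.weight)

    def forward(self, a_hat, x):
        return torch.relu(a_hat @ x @ self.weight)

def project_spectral_norm_(weight, theta):
    """In-place projection of `weight` so that ||weight||_2 <= theta."""
    with torch.no_grad():
        u, s, vh = torch.linalg.svd(weight, full_matrices=False)
        weight.copy_(u @ torch.diag(s.clamp(max=theta)) @ vh)

# Toy data: a 4-node cycle graph with random features/targets (placeholders).
n, in_dim, out_dim, theta = 4, 8, 3, 1.0
adj = torch.tensor([[0, 1, 0, 1],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [1, 0, 1, 0]], dtype=torch.float32)
a_hat = adj + torch.eye(n)                  # add self-loops
deg_inv_sqrt = a_hat.sum(1).rsqrt().diag()  # symmetric normalization
a_hat = deg_inv_sqrt @ a_hat @ deg_inv_sqrt

x = torch.randn(n, in_dim)
y = torch.randn(n, out_dim)

layer = GraphConv(in_dim, out_dim)
opt = torch.optim.SGD(layer.parameters(), lr=0.1)

for step in range(100):
    opt.zero_grad()
    loss = torch.nn.functional.mse_loss(layer(a_hat, x), y)
    loss.backward()
    opt.step()
    # Enforce the bound ||W||_2 <= theta after every update.
    project_spectral_norm_(layer.weight, theta)

print("spectral norm:", torch.linalg.matrix_norm(layer.weight, ord=2).item())
```

The projection here is a generic hard constraint via SVD clipping; the paper itself studies tighter Lipschitz bounds and constrained training for graph neural networks, for which this sketch is only a rough stand-in.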