Training Graph Neural Networks with 1000 Layers
Guohao Li, Matthias Müller, Bernard Ghanem, Vladlen Koltun. Published in: ICML (2021)
Keyphrases
- neural network
- training process
- feed forward neural networks
- multi layer
- training algorithm
- feedforward neural networks
- backpropagation algorithm
- multi layer perceptron
- error back propagation
- single layer
- pattern recognition
- weighted graph
- training set
- activation function
- training phase
- back propagation
- graph theory
- structured data
- test set
- random walk
- fuzzy logic
- recurrent neural networks
- neural network model
- neural network structure
- radial basis function
- neural network training
- directed graph
- supervised learning
- graph theoretic
- bipartite graph
- artificial neural networks
- neural nets
- data sets
- recurrent networks
- graph partitioning
- training examples
- graph structure
- feed forward
- connected components