On Provable Benefits of Depth in Training Graph Convolutional Networks.
Weilin Cong, Morteza Ramezani, Mehrdad Mahdavi. Published in: NeurIPS (2021)
Keyphrases
- dynamic networks
- average degree
- graph theory
- training examples
- restricted boltzmann machine
- directed graph
- small world
- recurrent networks
- random walk
- fully connected
- graph model
- training process
- graph structure
- complex networks
- network structure
- structured data
- supervised learning
- graph structures
- citation networks
- social networks
- highly connected
- graph representation
- network analysis
- connected components
- neural network
- heterogeneous networks
- undirected graph
- spanning tree
- weighted graph
- training set
- echo state networks
- betweenness centrality
- training data
- convolutional neural networks