Incremental Training of Graph Neural Networks on Temporal Graphs under Distribution Shift.
Lukas Galke, Iacopo Vagliano, Ansgar Scherp
Published in: CoRR (2020)
Keyphrases
- neural network
- graph representation
- training process
- graph theory
- directed graph
- graph mining
- graph matching
- graph structure
- graph theoretic
- weighted graph
- training algorithm
- labeled graphs
- graph databases
- adjacency matrix
- graph model
- graph classification
- random graphs
- graph construction
- graph clustering
- bipartite graph
- structural pattern recognition
- pattern recognition
- series parallel
- graph properties
- subgraph isomorphism
- graph partitioning
- undirected graph
- graph isomorphism
- feedforward neural networks
- graph kernels
- graph search
- spanning tree
- graph data
- power law
- real world graphs
- edge weights
- web graph
- degree distribution
- dynamic graph
- maximum clique
- minimum spanning tree
- temporal reasoning
- multi layer perceptron
- graph patterns
- connected graphs
- structured data
- maximum common subgraph
- query graph
- community discovery
- back propagation
- random variables
- incremental learning
- directed acyclic graph
- bounded treewidth
- recurrent neural networks
- training set
- finding the shortest path
- temporal information
- artificial neural networks
- random walk
- dense subgraphs
- graph layout
- training data
- massive graphs
- densely connected
- input pattern
- adjacency graph
- connected components
- neural network model
- attributed graphs