Accelerating Training and Inference of Graph Neural Networks with Fast Sampling and Pipelining.
Tim Kaler, Nickolas Stathas, Anne Ouyang, Alexandros-Stavros Iliopoulos, Tao B. Schardl, Charles E. Leiserson, Jie Chen
Published in: MLSys (2022)
Keyphrases
- neural network
- training process
- training algorithm
- feedforward neural networks
- structured prediction
- pattern recognition
- gibbs sampler
- backpropagation algorithm
- bayesian inference
- markov chain monte carlo
- neural network training
- back propagation
- random walk
- svm training
- multilayer perceptron
- graph representation
- fuzzy logic
- graph model
- error back propagation
- graph theory
- neural nets
- directed graph
- structured data
- bipartite graph
- probabilistic inference
- spanning tree
- neural network structure
- training set
- graph structure
- recurrent networks
- monte carlo
- genetic algorithm
- artificial neural networks
- probability distribution
- supervised learning
- generative model
- recurrent neural networks
- belief networks
- random sampling