RSC: Accelerating Graph Neural Networks Training via Randomized Sparse Computations
Zirui Liu, Shengyuan Chen, Kaixiong Zhou, Daochen Zha, Xiao Huang, Xia Hu
Published in: ICML (2023)
Keyphrases
- neural network
- training process
- training algorithm
- feedforward neural networks
- feed forward neural networks
- neural network training
- multi layer perceptron
- back propagation
- pattern recognition
- backpropagation algorithm
- graph theory
- graph representation
- neural network structure
- recurrent networks
- graph theoretic
- fuzzy logic
- recurrent neural networks
- gaussian graphical models
- graph mining
- signal recovery
- graph structure
- training data
- artificial neural networks
- structured data
- test set
- genetic algorithm
- learning algorithm
- avoid overfitting
- high dimensional
- supervised learning
- random walk
- training samples
- training examples
- bipartite graph
- spanning tree
- graph model
- neural network model
- sparse coding
- directed acyclic graph
- neural nets
- multi layer
- weighted graph