GNNear: Accelerating Full-Batch Training of Graph Neural Networks with Near-Memory Processing
Zhe Zhou, Cong Li, Xuechao Wei, Xiaoyang Wang, Guangyu Sun
Published in: PACT (2022)
Keyphrases
- neural network
- training process
- training algorithm
- feedforward neural networks
- graph theory
- batch processing
- random access
- data processing
- neural network training
- batch mode
- computational power
- real time
- auto-associative
- multi-layer perceptron
- error back propagation
- backpropagation algorithm
- graph representation
- training examples
- recurrent neural networks
- pattern recognition
- graph structure
- weighted graph
- training set
- training phase
- supervised learning
- fuzzy logic
- back propagation
- information processing
- random walk
- memory space
- genetic algorithm
- processing elements
- neural network structure
- memory usage
- associative memory
- graph-theoretic
- graph partitioning
- test set
- spanning tree
- graph databases
- graph search
- fault diagnosis
- connected components
- artificial neural networks
- multi-layer
- recurrent networks
- neural network model
- social networks
- feed-forward