DropGNN: Random Dropouts Increase the Expressiveness of Graph Neural Networks.
Pál András Papp, Karolis Martinkus, Lukas Faber, Roger Wattenhofer. Published in: CoRR (2021)
Keyphrases
- neural network
- random walk
- pattern recognition
- graph structure
- dependency graph
- genetic algorithm
- training process
- expressive power
- feed forward
- graph theory
- artificial neural networks
- spanning tree
- back propagation
- fuzzy logic
- connected components
- directed graph
- graph representation
- clustering algorithm
- graph construction
- graph partitioning
- decision trees
- bipartite graph
- graph matching
- self organizing maps
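The title's core idea can be illustrated with a small sketch: run a message-passing GNN several times, independently dropping each node with some probability in every run, and aggregate the per-run outputs. This is a hedged, minimal toy illustration of that idea only; the function names (`message_passing`, `drop_gnn`), the sum-aggregation layer, the dropout probability, and the averaging step are all illustrative assumptions, not the paper's actual architecture.

```python
import random

def message_passing(adj, feats, rounds=2):
    """Toy GNN layer (illustrative assumption): each node repeatedly
    adds the sum of its neighbours' features to its own."""
    n = len(adj)
    for _ in range(rounds):
        feats = [feats[v] + sum(feats[u] for u in adj[v]) for v in range(n)]
    return feats

def drop_gnn(adj, feats, p=0.3, runs=10, seed=0):
    """Sketch of the random-dropout idea from the title: average the
    GNN output over several runs, dropping each node w.p. p per run."""
    rng = random.Random(seed)
    n = len(adj)
    totals = [0.0] * n
    for _ in range(runs):
        keep = [rng.random() >= p for _ in range(n)]
        # Dropped nodes send no messages and contribute zero features.
        masked_adj = [[u for u in adj[v] if keep[u]] for v in range(n)]
        masked_feats = [feats[v] if keep[v] else 0.0 for v in range(n)]
        out = message_passing(masked_adj, masked_feats)
        totals = [t + o for t, o in zip(totals, out)]
    return [t / runs for t in totals]

# Example: a triangle graph with unit features.
triangle = [[1, 2], [0, 2], [0, 1]]
embedding = drop_gnn(triangle, [1.0, 1.0, 1.0])
```

With `p=0` and a single run, `drop_gnn` reduces to the plain `message_passing` pass; the randomized runs only change the output distribution when dropout is active, which is what lets dropout-averaged embeddings separate some graphs a deterministic pass cannot.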