Pre-Training a Graph Recurrent Network for Language Representation
Yile Wang, Linyi Yang, Zhiyang Teng, Ming Zhou, Yue Zhang. Published in: CoRR (2022)
Keyphrases
- recurrent networks
- recurrent neural networks
- graph representation
- biologically inspired
- feed forward
- representation language
- relational structures
- predicate calculus
- vector representation
- graph theory
- weighted graph
- natural language
- neural network
- connected components
- image representation
- programming language
- bipartite graph
- data sets
- training samples
- multi modal
- training process
- first order logic
- graph structures
- spatio temporal
- structured representation