Synthesizing Context-free Grammars from Recurrent Neural Networks
Daniel M. Yellin, Gail Weiss
Published in: TACAS (1) (2021)
Keyphrases
- recurrent neural networks
- context-free grammars
- grammatical inference
- context-free languages
- neural network
- feed-forward
- recurrent networks
- context-free
- reservoir computing
- regular expressions
- grammar induction
- attribute grammars
- echo state networks
- artificial neural networks
- cascade correlation
- predicate invention
- covering arrays
- neural model
- regular languages
- production rules
- lexical semantics
- xml schema
- nonlinear dynamic systems
- backpropagation
- expert systems
- tree-adjoining
- databases
- tree automata
- finite automata
- nonlinear systems
- data mining