Transformer Working Memory Enables Regular Language Reasoning and Natural Language Length Extrapolation.
Ta-Chung Chi, Ting-Han Fan, Alexander I. Rudnicky, Peter J. Ramadge
Published in: CoRR (2023)
Keyphrases
- natural language
- working memory
- knowledge representation
- cognitive load
- long term memory
- computational model
- information processing
- language processing
- cognitive architecture
- focus of attention
- language understanding
- short term memory
- individual differences
- working memory capacity
- dialogue system
- natural language understanding
- machine learning
- data mining
- knowledge acquisition
- learning environment
- knowledge base
- expert systems
- functional connectivity