Transformer Working Memory Enables Regular Language Reasoning and Natural Language Length Extrapolation
Ta-Chung Chi, Ting-Han Fan, Alexander Rudnicky, Peter J. Ramadge
Published in: EMNLP (Findings), 2023
Keyphrases
- working memory
- natural language
- cognitive load
- knowledge representation
- computational model
- long term memory
- information processing
- language processing
- cognitive architecture
- focus of attention
- individual differences
- language understanding
- short term memory
- knowledge base
- machine learning
- dialogue system
- working memory capacity
- multimedia
- short term
- functional connectivity
- databases
- prefrontal cortex
- natural language understanding
- expert systems
- reinforcement learning
- artificial intelligence
- neural network