Attention weights accurately predict language representations in the brain.
Mathis Lamarre, Catherine Chen, Fatma Deniz
Published in: EMNLP (Findings) (2022)
Keyphrases
- language processing
- relative importance
- programming language
- human brain
- machine learning
- linear combination
- higher level
- natural language
- object oriented
- language learning
- visual attention
- knowledge base
- magnetic resonance imaging
- information retrieval
- neural network
- object oriented programming
- physiological parameters