How Contextual are Contextualized Word Representations? Comparing the Geometry of BERT, ELMo, and GPT-2 Embeddings.
Kawin Ethayarajh. Published in: EMNLP/IJCNLP (1) (2019)
Keyphrases
- contextual information
- co-occurrence
- n-gram
- vector space
- three-dimensional
- word clouds
- higher-level
- low-dimensional
- context-sensitive
- symbolic representation
- context-aware
- dimensionality reduction
- Euclidean space
- geometric information
- word segmentation
- geometric models
- multiple representations
- related words
- Euclidean geometry
- machine learning