Fluid Transformers and Creative Analogies: Exploring Large Language Models' Capacity for Augmenting Cross-Domain Analogical Creativity
Zijian Ding, Arvind Srinivasan, Stephen MacNeil, Joel Chan. Published in: CoRR (2023)
Keyphrases
- language model
- biologically inspired design
- cross-domain
- biological systems
- language modeling
- n-gram
- probabilistic model
- document retrieval
- information retrieval
- retrieval model
- knowledge transfer
- test collection
- query expansion
- language modelling
- statistical language models
- transfer learning
- sentiment classification
- language models for information retrieval
- target domain
- pseudo relevance feedback
- smoothing methods
- complex systems
- query terms
- biologically inspired
- computational models
- text categorization
- e-government
- supervised learning