Federated Full-Parameter Tuning of Billion-Sized Language Models with Communication Cost under 18 Kilobytes
Zhen Qin, Daoyuan Chen, Bingchen Qian, Bolin Ding, Yaliang Li, Shuiguang Deng
Published in: CoRR (2023)
Keyphrases
- parameter tuning
- communication cost
- language model
- distributed data
- language modeling
- probabilistic model
- n gram
- parameter settings
- data distribution
- reduce communication cost
- context sensitive
- statistical language models
- data sources