Save It All: Enabling Full Parameter Tuning for Federated Large Language Models via Cycle Block Gradient Descent
Lin Wang, Zhichao Wang, Xiaoying Tang
Published in: CoRR (2024)
Keyphrases
- parameter tuning
- language model
- language modeling
- ink bleed
- n gram
- probabilistic model
- document retrieval
- speech recognition
- language modelling
- statistical language models
- parameter settings
- information retrieval
- query expansion
- retrieval model
- smoothing methods
- digital libraries
- ad hoc information retrieval
- term dependencies
- vector space model
- language models for information retrieval
- relevance model
- context sensitive
- machine learning
- evolutionary algorithm
- model selection
- data mining
- retrieval systems
- test collection
- search space
- maximum likelihood
- language model for information retrieval
- simulated annealing