AdaGDA: Faster Adaptive Gradient Descent Ascent Methods for Minimax Optimization.
Feihu Huang, Xidong Wu, Zhengmian Hu
Published in: AISTATS (2023)
Keyphrases
- neural network
- optimization methods
- cost function
- significant improvement
- global optimization
- computational cost
- computationally expensive
- empirical studies
- optimization approaches
- iterative methods
- qualitative and quantitative
- machine learning methods
- loss function
- machine learning algorithms
- optimization algorithm
- optimization problems
- multi objective
- pairwise
- support vector
- image processing
- data mining
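
For context on the minimax setting named in the title, below is a minimal sketch of plain (non-adaptive) gradient descent ascent on a toy strongly-convex-strongly-concave problem. It is an illustration of the generic descent-ascent update only, not the paper's AdaGDA algorithm, and all names and parameters in it are hypothetical.

```python
import numpy as np

# Toy minimax problem (hypothetical, not from the paper):
#   min_x max_y  f(x, y) = 0.5*||x||^2 + x.T @ y - 0.5*||y||^2
# Its unique saddle point is (x, y) = (0, 0).

def grad_x(x, y):
    return x + y            # gradient of f w.r.t. the minimization variable x

def grad_y(x, y):
    return x - y            # gradient of f w.r.t. the maximization variable y

rng = np.random.default_rng(0)
x = rng.standard_normal(5)
y = rng.standard_normal(5)
eta = 0.1                   # small constant step size; tuning is problem-dependent

for _ in range(500):
    gx, gy = grad_x(x, y), grad_y(x, y)
    x = x - eta * gx        # descent step for the min player
    y = y + eta * gy        # ascent step for the max player

print("distance to saddle point:", np.linalg.norm(np.concatenate([x, y])))
```

Adaptive variants such as those studied in the paper replace the fixed step size with per-iteration adaptive learning rates; this sketch omits that machinery.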