SHED: A Newton-type algorithm for federated learning based on incremental Hessian eigenvector sharing.
Nicolò Dal Fabbro, Subhrakanti Dey, Michele Rossi, Luca Schenato
Published in: Automatica (2024)
Keyphrases
- learning algorithm
- incremental learning
- single pass
- computational complexity
- learning phase
- cost function
- optimal solution
- optimization algorithm
- supervised learning
- learning speed
- detection algorithm
- segmentation algorithm
- computational cost
- np hard
- online learning
- dynamic programming
- significant improvement
- clustering method
- k means
- preprocessing
- objective function
- machine learning
- neural network
- worst case
- principal component analysis
- convergence rate
- version space