A Lyapunov-type approach to convergence of the Douglas-Rachford algorithm for a nonconvex setting.
Minh N. Dao, Matthew K. Tam
Published in: J. Glob. Optim. (2019)
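For context, the title refers to the Douglas-Rachford splitting method applied to a two-set feasibility problem. The display below is a minimal sketch of the generic, textbook Douglas-Rachford iteration (with projections P_A and P_B onto sets A and B); it is not the paper's specific Lyapunov-type analysis or its nonconvex assumptions.

\[
  x_{n+1} = T_{A,B}\, x_n,
  \qquad
  T_{A,B} = \operatorname{Id} - P_A + P_B\bigl(2P_A - \operatorname{Id}\bigr),
\]

where candidate solutions to the feasibility problem are typically monitored through the shadow sequence \(P_A x_n\).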
Keyphrases
- learning algorithm
- dynamic programming
- cost function
- preprocessing
- iterative algorithms
- objective function
- computational cost
- detection algorithm
- convergence rate
- search space
- computational complexity
- faster convergence
- simulated annealing
- expectation maximization
- recognition algorithm
- neural network
- globally convergent
- convergence property
- stochastic approximation
- segmentation algorithm
- linear programming
- worst case
- probabilistic model
- significant improvement
- optimal solution