A modified Adam algorithm for deep neural network optimization.
Mohamed Reyad, Amany M. Sarhan, M. Arafa
Published in: Neural Comput. Appl. (2023)
Keyphrases
- neural network
- optimization algorithm
- stochastic gradient
- learning algorithm
- experimental evaluation
- optimal solution
- significant improvement
- NP-hard
- high accuracy
- worst case
- genetic algorithm
- constrained optimization
- improved algorithm
- times faster
- matching algorithm
- detection algorithm
- dynamic programming
- search space
- k-means
- expectation maximization
- cost function
- optimization problems
- back propagation
- neural network model
- segmentation algorithm
- ant colony optimization
- computational cost
- objective function
- reinforcement learning
- optimization process
- similarity measure
- Levenberg-Marquardt
- image segmentation
- neural network training
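For context, the sketch below shows the standard Adam update rule (Kingma & Ba, 2015), which is the baseline this paper modifies; the paper's specific modification is not reproduced here, and the function name and hyperparameter defaults are the conventional ones rather than values taken from the paper.

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One standard Adam update step (not the modified variant from this paper).

    param : current parameter vector
    grad  : gradient of the cost function w.r.t. param
    m, v  : running first- and second-moment estimates
    t     : 1-based timestep, used for bias correction
    """
    m = beta1 * m + (1.0 - beta1) * grad           # biased first-moment estimate
    v = beta2 * v + (1.0 - beta2) * grad ** 2      # biased second-moment estimate
    m_hat = m / (1.0 - beta1 ** t)                 # bias-corrected first moment
    v_hat = v / (1.0 - beta2 ** t)                 # bias-corrected second moment
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v

# Toy usage: minimize f(w) = ||w||^2, whose gradient is 2w.
w = np.array([1.0, -2.0, 3.0])
m = np.zeros_like(w)
v = np.zeros_like(w)
for t in range(1, 501):
    grad = 2.0 * w
    w, m, v = adam_step(w, grad, m, v, t, lr=0.05)
print(w)  # converges toward the minimizer at the origin
```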