Fixed-point optimization of deep neural networks with adaptive step size retraining.
Sungho Shin, Yoonho Boo, Wonyong Sung. Published in: ICASSP (2017)
Keyphrases
- fixed point
- step size
- variable step size
- neural network
- faster convergence
- steepest descent method
- stochastic gradient descent
- line search
- convergence rate
- convergence speed
- cost function
- sufficient conditions
- temporal difference
- dynamical systems
- floating point
- variational inequalities
- gradient method
- global optimization
- back propagation
- optimization algorithm
- optimization problems
- artificial neural networks
- global convergence
- belief propagation
- boundary conditions
- wavelet coefficients
- higher order
- wavelet transform
- state space
- fixed point theorem
- multiresolution