Effects of Depth, Width, and Initialization: A Convergence Analysis of Layer-wise Training for Deep Linear Neural Networks.
Yeonjong Shin
Published in: CoRR (2019)
Keyphrases
convergence analysis
neural network
training process
training algorithm
multi layer
global convergence
training set
pairwise
linear svm
optimality conditions
back propagation
convergence rate
dynamic programming
linear constraints