Training Two-Layer ReLU Networks with Gradient Descent is Inconsistent
David Holzmüller
Ingo Steinwart
Published in: J. Mach. Learn. Res. (2022)
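To make the object of study concrete: the paper concerns two-layer (one hidden layer) ReLU networks trained by gradient descent. Below is a minimal, self-contained sketch of that setup — not the paper's exact initialization or data model, just an illustration of full-batch gradient descent on the squared loss for a network f(x) = Σₖ aₖ·relu(wₖx + bₖ) in one input dimension. All constants (network width, learning rate, target function |x|) are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m = 64, 16                        # training points, hidden neurons
X = rng.uniform(-1, 1, size=n)       # 1-D inputs
y = np.abs(X)                        # target |x|, exactly representable by two ReLUs

w = rng.normal(size=m)               # hidden-layer weights
b = rng.normal(size=m) * 0.1         # hidden-layer biases
a = rng.normal(size=m) / np.sqrt(m)  # output weights

def predict(X, w, b, a):
    # f(x_i) = sum_k a_k * relu(w_k * x_i + b_k)
    return np.maximum(np.outer(X, w) + b, 0.0) @ a

mse_before = float(np.mean((predict(X, w, b, a) - y) ** 2))

lr = 0.1
for _ in range(2000):
    z = np.outer(X, w) + b           # pre-activations, shape (n, m)
    h = np.maximum(z, 0.0)           # ReLU activations
    r = h @ a - y                    # residuals, shape (n,)
    mask = (z > 0).astype(float)     # ReLU derivative (subgradient 0 at the kink)
    # Gradients of the loss (1/2n) * sum_i r_i^2
    grad_a = h.T @ r / n
    grad_w = ((r[:, None] * mask * a) * X[:, None]).sum(axis=0) / n
    grad_b = (r[:, None] * mask * a).sum(axis=0) / n
    a -= lr * grad_a
    w -= lr * grad_w
    b -= lr * grad_b

mse_after = float(np.mean((predict(X, w, b, a) - y) ** 2))
```

With this toy target the training loss drops substantially; the paper's inconsistency result is a statement about the statistical behavior of such trained networks, not about whether the training loss itself decreases.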
Keyphrases
recurrent networks
training set
social networks
multi layer
objective function
cost function
stochastic gradient descent
training phase
network structure
error function
complex networks
online learning
end to end
network analysis
data sets
loss function
training samples
network size
machine learning