How (Implicit) Regularization of ReLU Neural Networks Characterizes the Learned Function - Part II: the Multi-D Case of Two Layers with Random First Layer.

Jakob Heiss, Josef Teichmann, Hanna Wutte
Published in: CoRR (2023)