Neural Networks can Learn Representations with Gradient Descent
Alex Damian, Jason D. Lee, Mahdi Soltanolkotabi
Published in: CoRR (2022)
Keyphrases
- neural network
- learning rules
- pattern recognition
- fuzzy logic
- back propagation
- loss function
- cost function
- multiple representations
- neural network model
- multi layer perceptron
- training process
- distributed representations
- function approximators
- efficient learning
- recurrent neural networks
- feed forward
- case study
- artificial intelligence
- machine learning
- data mining
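Several of the keyphrases above (gradient descent is implied by the title; loss function, cost function, back propagation appear in the list) revolve around the same basic procedure: iteratively updating parameters against the gradient of a loss. As a generic illustration only, and not the algorithm analyzed in the paper, here is a minimal sketch of gradient descent on a one-dimensional quadratic loss:

```python
# Minimal gradient descent on the quadratic loss L(w) = (w - 3)^2.
# The gradient is dL/dw = 2 * (w - 3); repeated steps against the
# gradient drive w toward the minimizer w = 3.
# This is a textbook illustration, not the paper's method.

def gradient_descent(lr=0.1, steps=100, w0=0.0):
    w = w0
    for _ in range(steps):
        grad = 2.0 * (w - 3.0)  # gradient of the loss at w
        w -= lr * grad          # step against the gradient
    return w

w_star = gradient_descent()
print(round(w_star, 4))  # close to the minimizer 3.0
```

With the step size 0.1, each iteration shrinks the error `w - 3` by a factor of 0.8, so a hundred steps suffice for convergence to machine precision here.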