Adaptive activation functions accelerate convergence in deep and physics-informed neural networks.
Ameya D. Jagtap, Kenji Kawaguchi, George Em Karniadakis. Published in: J. Comput. Phys. (2020)
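The paper's central idea, as named in the title, is to make the activation function itself partly trainable so that gradient descent can adapt its slope during training. Below is a minimal sketch of that idea, not the authors' released code: a feed-forward layer whose tanh activation is scaled by a fixed factor n and a trainable parameter a; the particular value of n and the initialisation of a are assumptions made here for illustration.

```python
import torch
import torch.nn as nn

class AdaptiveTanhLayer(nn.Module):
    """Linear layer followed by tanh(n * a * x), where `a` is trainable."""
    def __init__(self, in_features, out_features, n=10.0):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        self.n = n                                # fixed scale factor (assumed value)
        self.a = nn.Parameter(torch.tensor(0.1))  # trainable slope, initialised so n * a = 1

    def forward(self, x):
        return torch.tanh(self.n * self.a * self.linear(x))

# Usage: stack such layers; the optimiser updates `a` alongside the weights.
model = nn.Sequential(
    AdaptiveTanhLayer(1, 20),
    AdaptiveTanhLayer(20, 20),
    nn.Linear(20, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
```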
Keyphrases
- activation function
- neural network
- feed forward
- artificial neural networks
- back propagation
- neural architecture
- hidden layer
- complex valued
- hidden neurons
- connection weights
- neural nets
- multilayer perceptron
- network architecture
- feedforward neural networks
- backpropagation algorithm
- learning rate
- hidden nodes
- radial basis function
- fuzzy logic
- training phase
- convergence speed
- recurrent neural networks
- pattern recognition
- machine learning
- neural network model
- genetic algorithm
- convergence rate
- artificial intelligence