APTx: better activation function than MISH, SWISH, and ReLU's variants used in deep learning.
Ravin Kumar
Published in: CoRR (2022)
Keyphrases
- deep learning
- activation function
- neural network
- artificial neural networks
- unsupervised learning
- hidden layer
- feed forward
- learning rate
- neural nets
- back propagation
- basis functions
- machine learning
- mental models
- network architecture
- weakly supervised
- radial basis function
- multilayer perceptron
- fuzzy logic
- dimensionality reduction
- artificial intelligence
- pattern recognition
- supervised learning
- model selection
- image processing
- rbf neural network
- fuzzy neural network
- object detection
- multi layer perceptron
- genetic algorithm
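The entry only names the APTx activation function, so the following is a minimal sketch of it, not the authors' reference code. It assumes the parameterized form APTx(x) = (α + tanh(βx)) · γx with defaults α = 1, β = 1, γ = 0.5, which the paper uses as a cheaper approximation of MISH; the parameter names, defaults, and comparison below should be checked against the paper itself.

```python
import numpy as np

def aptx(x, alpha=1.0, beta=1.0, gamma=0.5):
    """Sketch of the APTx activation (assumed form):
    APTx(x) = (alpha + tanh(beta * x)) * gamma * x.
    With alpha=1, beta=1, gamma=0.5 it is intended to track MISH
    while using fewer elementwise operations per call."""
    return (alpha + np.tanh(beta * x)) * gamma * x

# Informal comparison against MISH(x) = x * tanh(softplus(x))
x = np.linspace(-4.0, 4.0, 9)
mish = x * np.tanh(np.log1p(np.exp(x)))  # softplus via log1p(exp(x))
print("APTx:", np.round(aptx(x), 4))
print("MISH:", np.round(mish, 4))
```

The appeal of this form is computational: one tanh, one multiply-add, and one scaling per element, versus the softplus-plus-tanh composition inside MISH.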