APALU: A Trainable, Adaptive Activation Function for Deep Learning Networks.
Barathi Subramanian, Rathinaraja Jeyaraj, Akhrorjon Akhmadjon Ugli Rakhmonov, Jeonghong Kim. Published in: CoRR (2024)
Keyphrases
- deep learning
- activation function
- network size
- neural network
- hidden layer
- feed forward
- back propagation
- artificial neural networks
- unsupervised learning
- machine learning
- neural nets
- multilayer perceptron
- basis functions
- network structure
- radial basis function
- mental models
- weakly supervised
- image segmentation
- supervised learning
- network architecture