TaLU: A Hybrid Activation Function Combining Tanh and Rectified Linear Unit to Enhance Neural Networks.
Md. Mehedi Hasan, Md. Ali Hossain, Azmain Yakin Srizon, Abu Sayeed
Published in: CoRR (2023)
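The entry gives no formula, but the title describes TaLU as a hybrid of tanh and ReLU. Below is a minimal sketch of one plausible reading, assuming (hypothetically) that TaLU passes positive inputs through unchanged, as ReLU does, and applies tanh to negative inputs; the paper's actual definition, including any thresholds or scaling, may differ.

```python
import numpy as np

def talu(x: np.ndarray) -> np.ndarray:
    """Hypothetical TaLU sketch (an assumption from the title, not the
    paper's verified definition): identity for x >= 0 (ReLU-like),
    tanh for x < 0 (bounded, with a nonzero gradient on the negative side).
    """
    return np.where(x >= 0.0, x, np.tanh(x))

# Example: negatives are squashed into (-1, 0), positives pass through.
x = np.array([-3.0, -0.5, 0.0, 2.0])
print(talu(x))  # -> [-0.99505475 -0.46211716  0.          2.        ]
```

Compared with plain ReLU, a blend of this kind keeps a nonzero gradient for negative inputs (mitigating the "dying ReLU" problem) while remaining linear and unbounded for positive ones.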
Keyphrases
- activation function
- neural network
- artificial neural networks
- feed-forward
- backpropagation
- hidden layer
- multilayer perceptron
- feed-forward neural networks
- hidden neurons
- backpropagation algorithm
- neural nets
- feedforward neural networks
- nonlinear functions
- chaotic neural network
- learning rate
- radial basis function
- hidden nodes
- single hidden layer
- fuzzy neural network
- pattern recognition
- sigmoid function
- network architecture
- hidden units
- multi-layer perceptron
- basis functions
- extreme learning machine
- neural network model
- fault diagnosis
- genetic algorithm
- training phase
- single layer
- multi-layer
- fuzzy clustering