MorphoActivation: Generalizing ReLU Activation Function by Mathematical Morphology.
Santiago Velasco-Forero, Jesús Angulo. Published in: DGMM (2022)
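The title points to the view that ReLU, max(x, 0), is a special case of a morphological (max-plus) operation with a trivial structuring element. As a minimal, hedged sketch of that general idea only (not a reproduction of the paper's MorphoActivation layer, whose exact construction is not given here), the NumPy snippet below implements a generic max-plus, dilation-style activation y_j = max_i (x_i + w_ij) and checks that it reduces to ReLU in a degenerate case; the function names and array shapes are assumptions chosen for illustration.

```python
# Hedged sketch: a generic max-plus ("dilation") activation in NumPy.
# ReLU(x) = max(x, 0) can be read as a max-plus operation with the trivial
# structuring element {0}; allowing a learnable additive weight matrix w
# gives y[j] = max_i (x[i] + w[i, j]).  This illustrates the general idea
# named in the title, NOT the paper's exact MorphoActivation construction.

import numpy as np


def relu(x):
    """Standard ReLU, shown for comparison: elementwise max with 0."""
    return np.maximum(x, 0.0)


def maxplus_activation(x, w):
    """Max-plus (dilation-like) layer.

    x : (n,) input vector
    w : (n, m) additive "structuring element" / weight matrix (assumed shape)
    returns y with y[j] = max_i (x[i] + w[i, j])
    """
    return np.max(x[:, None] + w, axis=0)


if __name__ == "__main__":
    x = np.array([-1.0, 0.5, 2.0])

    # Degenerate case: append a constant 0 coordinate to the input and use a
    # zero weight matrix, so y = max(x_i + 0, 0 + 0) = max(x_i, 0) = ReLU(x_i).
    for xi in x:
        y = maxplus_activation(np.array([xi, 0.0]), np.zeros((2, 1)))
        assert np.isclose(y[0], relu(xi))
    print("max-plus layer reduces to ReLU in the degenerate case")
```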
Keyphrases
- mathematical morphology
- activation function
- neural network
- hidden layer
- gray scale
- feed forward
- artificial neural networks
- back propagation
- image analysis
- morphological operators
- image and signal processing
- binary images
- neural nets
- image processing
- edge detection
- network architecture
- learning rate
- multilayer perceptron
- feedforward neural networks
- basis functions
- fuzzy neural network
- morphological image processing
- spatially variant
- structuring elements
- training phase
- radial basis function
- adaptive neighborhood
- morphological filters
- multi-layer perceptron
- efficient implementation
- watershed transform
- image denoising
- image enhancement
- co-occurrence
- support vector machine
- multiscale
- learning algorithm
- genetic algorithm