Norm-preserving Orthogonal Permutation Linear Unit Activation Functions (OPLU).
Artem N. Chernodub
Dimitri Nowicki
Published in: CoRR (2016)
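Only the title and venue are given here; for context, below is a minimal NumPy sketch of the OPLU activation as the title describes it — pre-activations grouped in pairs, each pair mapped to (max, min). Since this only permutes coordinates within each pair, the mapping is orthogonal and preserves the vector norm. The function name, the adjacent-pairing scheme, and the example vector are illustrative assumptions, not taken from the paper's code.

```python
import numpy as np

def oplu(x):
    # Sketch of OPLU (assumed pairing of adjacent neurons):
    # each pair (a, b) is mapped to (max(a, b), min(a, b)).
    # This sorts every pair, i.e. applies a data-dependent
    # permutation, so the Euclidean norm is unchanged.
    x = np.asarray(x, dtype=float)
    if x.size % 2 != 0:
        raise ValueError("OPLU requires an even number of units")
    pairs = x.reshape(-1, 2)
    hi = pairs.max(axis=1)   # larger element of each pair
    lo = pairs.min(axis=1)   # smaller element of each pair
    return np.stack([hi, lo], axis=1).reshape(x.shape)

v = np.array([3.0, -1.0, 0.5, 2.0])
out = oplu(v)                          # -> [ 3.  -1.   2.   0.5]
# Norm preservation: ||oplu(v)|| == ||v||
print(np.linalg.norm(out), np.linalg.norm(v))
```

Because the Jacobian of a (locally constant) permutation is orthogonal, gradients passing backward through this unit keep their norm as well, which is the property the title emphasizes.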
Keyphrases
activation function
neural network
feedforward
artificial neural networks
complex-valued
backpropagation
neural architecture
hidden layer
basis functions
hidden nodes
pattern recognition
linear combination
radial basis function
network architecture
training phase
multi-layer perceptron