Verifying Equivalence Properties of Neural Networks with ReLU Activation Functions.
Marko Kleine Büning, Philipp Kern, Carsten Sinz. Published in: CP (2020)
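As a rough illustration of the paper's topic, the sketch below shows two tiny feed-forward ReLU networks (hypothetical weights, not taken from the paper) that compute the same function, and a naive sampling-based equivalence check. Note that the paper itself addresses *formal* verification of such equivalence properties, which must hold for all inputs, not just sampled ones.

```python
def relu(x):
    # Rectified linear unit: max(0, x)
    return max(0.0, x)

# Two one-neuron ReLU networks with hypothetical weights.
# They compute the same function because ReLU is positively
# homogeneous: relu(a * x) = a * relu(x) for a > 0, so halving
# the input weights and doubling the output weight cancels out.
def net1(x):
    return 3.0 * relu(2.0 * x + 1.0)

def net2(x):
    return 6.0 * relu(x + 0.5)

# Sampling-based check: evidence of equivalence, not a proof.
# A formal approach would encode both networks as constraints
# and prove the property for every input.
xs = [i / 100.0 for i in range(-500, 501)]
max_gap = max(abs(net1(x) - net2(x)) for x in xs)
print(max_gap)
```

The gap stays at (floating-point) zero on every sample, but a sound verifier would need to certify this over the whole input domain, which is what constraint-based techniques such as the one in this paper target.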
Keyphrases
- activation function
- neural network
- feed forward
- back propagation
- hidden layer
- neural architecture
- artificial neural networks
- feed forward neural networks
- multilayer perceptron
- hidden neurons
- radial basis function
- neural nets
- feedforward neural networks
- network architecture
- learning rate
- connection weights
- hidden nodes
- genetic algorithm
- fuzzy neural network
- pattern recognition
- bp neural network
- recurrent neural networks
- neural network model
- training phase
- multi layer perceptron
- multi layer
- fault diagnosis
- training data
- decision making