Optimizing Objective Functions from Trained ReLU Neural Networks via Sampling.
Georgia Perakis, Asterios Tsiourvas. Published in: CoRR (2022)
Keyphrases
- neural network
- objective function
- multilayer perceptron
- trained neural networks
- training process
- auto associative
- multi layer perceptron
- trained neural network
- back propagation
- artificial neural networks
- backpropagation algorithm
- pattern recognition
- neural learning
- rule extraction
- random sampling
- neural nets
- Elman network
- multi objective
- Monte Carlo
- parameter space
- hidden layer
- optimization problems
- feedforward neural networks
- genetic algorithm
- neural network model
- fault diagnosis
- radial basis function
- sample size
- sampling methods
- back propagation neural network
- sampling strategy
- fuzzy logic
- cost function
- training set
- recurrent neural networks
- activation function
- feed forward neural networks
- hidden units
- fuzzy neural network