LinSyn: Synthesizing Tight Linear Bounds for Arbitrary Neural Network Activation Functions.
Brandon Paulsen, Chao Wang. Published in: TACAS (1) (2022)
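The paper's topic is bounding a nonlinear activation function from above and below by linear functions over an input interval. As a minimal illustration only (this is not LinSyn's synthesis algorithm, which produces formally verified bounds), the sketch below fits a pair of parallel lines around sigmoid on an interval: the slope is the chord slope, and the intercepts are shifted so the lines enclose the function at a set of sample points. The function names and the sampling density are hypothetical choices for the example.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def linear_bounds(f, lo, hi, n=1000):
    """Fit parallel linear lower/upper bounds for f on [lo, hi].

    Illustrative sketch only: soundness is checked at n sample
    points, not proved symbolically as LinSyn's bounds are.
    Returns ((a, b_low), (a, b_up)) for lines a*x + b.
    """
    slope = (f(hi) - f(lo)) / (hi - lo)
    xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
    # Deviation of f from the chord-slope line at each sample.
    devs = [f(x) - slope * x for x in xs]
    return (slope, min(devs)), (slope, max(devs))

(lw_a, lw_b), (up_a, up_b) = linear_bounds(sigmoid, -2.0, 3.0)
for x in [-2.0, -0.5, 0.0, 1.7, 3.0]:
    y = sigmoid(x)
    # Small slack covers points that fall between samples.
    assert lw_a * x + lw_b - 1e-6 <= y <= up_a * x + up_b + 1e-6
```

The tightness of such bounds matters in neural network verification: looser bounds compound across hidden layers and make the final output interval too wide to prove useful properties.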
Keyphrases
- activation function
- neural network
- upper bound
- lower bound
- hidden layer
- feed forward neural networks
- feed forward
- artificial neural networks
- neural architecture
- back propagation
- hidden neurons
- multilayer perceptron
- nonlinear functions
- connection weights
- neural nets
- feedforward neural networks
- network architecture
- learning rate
- sigmoid function
- hidden nodes
- fuzzy neural network
- training phase
- neural network model
- radial basis function
- learning algorithm
- machine learning
- multi layer perceptron
- basis functions
- input output
- fault diagnosis
- support vector machine svm
- genetic algorithm