Tabula: Efficiently Computing Nonlinear Activation Functions for Secure Neural Network Inference.
Maximilian Lam, Michael Mitzenmacher, Vijay Janapa Reddi, Gu-Yeon Wei, David Brooks
Published in: CoRR (2022)
Keyphrases
- activation function
- efficiently computing
- neural network
- artificial neural networks
- back propagation
- nonlinear functions
- feed forward
- feed forward neural networks
- hidden layer
- neural architecture
- neural nets
- connection weights
- multilayer perceptron
- hidden neurons
- network architecture
- learning rate
- sigmoid function
- hidden nodes
- fuzzy neural network
- multi layer perceptron
- feedforward neural networks
- neural model
- pattern recognition
- recurrent neural networks
- basis functions
- random fields
- parametric models
- training phase
- neural network model
- radial basis function
- data analysis
- bayesian networks
- genetic algorithm