Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer.
Noam Shazeer, Azalia Mirhoseini, Krzysztof Maziarz, Andy Davis, Quoc V. Le, Geoffrey E. Hinton, Jeff Dean
Published in: ICLR (Poster) (2017)
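To make the title concrete: the paper's layer routes each input to a small subset of "expert" sub-networks chosen by a trainable gate, so only k of n experts run per example. Below is a minimal NumPy sketch of that top-k gating idea. All names (`moe_layer`, `W_g`, the toy ReLU experts) are illustrative, not from the paper, and the paper's noise term and load-balancing losses are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    # numerically stable softmax over a 1-D array
    e = np.exp(x - x.max())
    return e / e.sum()

def moe_layer(x, W_g, experts, k=2):
    """Sparsely-gated MoE sketch: route input x to its top-k experts.

    Gating: logits = x @ W_g; keep the k largest, softmax over only
    those, give the rest weight 0, then sum the weighted expert outputs.
    (Illustrative only; the paper adds noise to the logits and
    auxiliary losses to balance expert load.)
    """
    logits = x @ W_g                   # one gating logit per expert
    topk = np.argsort(logits)[-k:]     # indices of the k largest logits
    weights = softmax(logits[topk])    # renormalize over kept experts
    out = np.zeros(x.shape[0])
    for w, i in zip(weights, topk):
        out += w * experts[i](x)       # only k experts are evaluated
    return out

# toy experts: each a small linear map with ReLU (hypothetical choice)
d, n_experts = 8, 4
Ws = [rng.standard_normal((d, d)) for _ in range(n_experts)]
experts = [lambda x, W=W: np.maximum(x @ W, 0.0) for W in Ws]
W_g = rng.standard_normal((d, n_experts))   # gating weights

x = rng.standard_normal(d)
y = moe_layer(x, W_g, experts, k=2)
print(y.shape)
```

Because the gate zeroes out all but k experts, the cost per example scales with k rather than with the total number of experts, which is what lets the full model be "outrageously large".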
Keyphrases
- neural network
- multi layer
- mixture model
- back propagation
- neural network model
- single layer
- sparse representation
- pattern recognition
- activation function
- neural nets
- recurrent neural networks
- fault diagnosis
- artificial neural networks
- expert advice
- fuzzy logic
- artificial intelligence
- cellular neural networks
- feedforward neural networks
- hidden nodes
- application layer
- upper layer
- network architecture
- gaussian mixture
- multilayer perceptron
- self organizing maps
- generative model
- domain specific
- bayesian networks