Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer
Noam Shazeer, Azalia Mirhoseini, Krzysztof Maziarz, Andy Davis, Quoc V. Le, Geoffrey E. Hinton, Jeff Dean
Published in: CoRR (2017)
Keyphrases
- neural network
- multi-layer
- mixture model
- single layer
- pattern recognition
- neural network model
- artificial neural networks
- fuzzy logic
- expert finding
- artificial intelligence
- back propagation
- domain experts
- feed-forward neural networks
- genetic algorithm
- application layer
- neural nets
- logit model
- feed-forward
- Gaussian mixture model
- expectation maximization
- human experts
- self organizing maps
- sparse representation
- knowledge acquisition
- expert advice
- data sets