Variational Inference-Based Dropout in Recurrent Neural Networks for Slot Filling in Spoken Language Understanding.
Jun Qi, Xu Liu, Javier Tejedor. Published in: CoRR (2020)
Keyphrases
- language understanding
- recurrent neural networks
- variational inference
- bayesian inference
- natural language understanding
- topic models
- posterior distribution
- probabilistic graphical models
- gaussian process
- mixture model
- latent dirichlet allocation
- probabilistic model
- variational methods
- neural network
- closed form
- language processing
- feed forward
- graphical models
- spoken dialogue systems
- dialogue system
- artificial neural networks
- exact inference
- echo state networks
- approximate inference
- general knowledge
- exponential family
- natural language
- semantic interpretation
- hyperparameters
- latent variables
- bayesian framework
- active learning
- cognitive psychology
- machine learning
- generative model
- first order logic
- language model
- parameter estimation
- knowledge representation
- prior information
- gaussian processes
- semantic analysis
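The paper's title refers to variational-inference-based dropout for recurrent networks. A minimal sketch of the core idea, in the style of Gal and Ghahramani's variational RNN dropout, is to sample one dropout mask per sequence and reuse it at every timestep, rather than resampling per step. The function below is an illustrative NumPy implementation of this masking scheme for a plain tanh RNN; the function name, shapes, and parameters are assumptions for illustration, not the paper's actual code.

```python
import numpy as np

def variational_dropout_rnn(x, W_xh, W_hh, p=0.5, seed=None):
    """Run a simple tanh RNN over x of shape (T, D) with variational dropout:
    the input and recurrent dropout masks are sampled ONCE per sequence and
    reused at every timestep (illustrative sketch, not the paper's code)."""
    rng = np.random.default_rng(seed)
    H = W_hh.shape[0]
    # Sample inverted-dropout masks once, shared across all timesteps.
    mask_x = rng.binomial(1, 1.0 - p, size=x.shape[1]) / (1.0 - p)
    mask_h = rng.binomial(1, 1.0 - p, size=H) / (1.0 - p)
    h = np.zeros(H)
    outputs = []
    for t in range(x.shape[0]):
        # Same masks applied at each step: this is what makes it "variational".
        h = np.tanh((x[t] * mask_x) @ W_xh + (h * mask_h) @ W_hh)
        outputs.append(h)
    return np.stack(outputs)
```

Reusing the mask across timesteps corresponds to integrating out a single Bernoulli-distributed weight perturbation per sequence, which is what gives the method its variational-inference interpretation.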