Generalized Knowledge Distillation for Topic Models.
Kohei Watanabe, Koji Eguchi
Published in: PRICAI (2) (2023)
Keyphrases
- topic models
- knowledge base
- latent Dirichlet allocation
- topic modeling
- knowledge representation
- text mining
- text documents
- generative model
- hierarchical Bayesian model
- probabilistic model
- databases
- co-occurrence
- prior knowledge
- expert systems
- Gibbs sampling
- latent variables
- variational inference
- artificial intelligence
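The keyphrases above name standard topic-modeling background such as latent Dirichlet allocation, latent variables, and variational inference. The sketch below is only a minimal illustration of that background using scikit-learn; it is not the generalized knowledge distillation method of the cited paper, and the toy corpus and parameter choices are illustrative assumptions.

```python
# Minimal LDA topic-modeling sketch (background illustration only, not the
# paper's distillation method). Uses scikit-learn's online variational Bayes
# implementation rather than collapsed Gibbs sampling.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy corpus of text documents (hypothetical example data).
docs = [
    "topic models learn latent topics from text documents",
    "knowledge bases store structured prior knowledge",
    "Gibbs sampling and variational inference estimate latent variables",
    "generative probabilistic models describe word co-occurrence",
]

# Bag-of-words document-term matrix.
vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)

# Fit LDA with a small number of topics.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(X)  # per-document topic proportions

# Show the top words per topic.
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = weights.argsort()[::-1][:5]
    print(f"topic {k}:", ", ".join(terms[i] for i in top))
```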