WETM: A word embedding-based topic model with modified collapsed Gibbs sampling for short text.
Junaid Rashid, Jungeun Kim, Amir Hussain, Usman Naseem
Published in: Pattern Recognit. Lett. (2023)
Keyphrases
- latent topics
- short text
- topic models
- collapsed Gibbs sampling
- latent Dirichlet allocation
- variational Bayesian inference
- topic modeling
- Gibbs sampling
- text documents
- variational inference
- text mining
- co-occurrence
- generative model
- latent variables
- probabilistic model
- probabilistic topic models
- Bayesian framework
- machine learning
- active learning
- pairwise
- information retrieval
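Several of the keyphrases above (latent Dirichlet allocation, collapsed Gibbs sampling, latent topics) refer to the standard inference routine that WETM modifies. As background, a minimal sketch of a collapsed Gibbs sampler for vanilla LDA is shown below; this is illustrative only and does not reproduce the paper's word-embedding-based modification. All function and variable names are my own, and the hyperparameter defaults are arbitrary.

```python
import numpy as np

def collapsed_gibbs_lda(docs, num_topics, vocab_size,
                        alpha=0.1, beta=0.01, iters=100, seed=0):
    """Sketch of collapsed Gibbs sampling for standard LDA.

    docs: list of documents, each a list of word ids in [0, vocab_size).
    Returns posterior-mean estimates of doc-topic (theta) and
    topic-word (phi) distributions. Note: WETM's modified sampler
    additionally incorporates word embeddings, which this sketch omits.
    """
    rng = np.random.default_rng(seed)
    D = len(docs)
    ndk = np.zeros((D, num_topics))           # doc-topic counts
    nkw = np.zeros((num_topics, vocab_size))  # topic-word counts
    nk = np.zeros(num_topics)                 # tokens per topic
    z = []                                    # topic assignment per token

    # Random initialization of topic assignments.
    for d, doc in enumerate(docs):
        zd = rng.integers(num_topics, size=len(doc))
        z.append(zd)
        for w, k in zip(doc, zd):
            ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1

    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                # Remove the current assignment from the counts.
                ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                # Full conditional p(z_i = k | z_-i, w), up to a constant:
                # (n_dk + alpha) * (n_kw + beta) / (n_k + V * beta)
                p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + vocab_size * beta)
                p /= p.sum()
                k = rng.choice(num_topics, p=p)
                z[d][i] = k
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1

    # Posterior-mean point estimates.
    theta = (ndk + alpha) / (ndk.sum(axis=1, keepdims=True) + num_topics * alpha)
    phi = (nkw + beta) / (nk[:, None] + vocab_size * beta)
    return theta, phi
```

The collapsed sampler integrates out theta and phi analytically and resamples only the per-token topic indicators, which is why it mixes well on the sparse count data typical of short texts.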