Keywords
Latent Dirichlet Allocation
Computer science
Word embedding
Artificial intelligence
Natural language processing
Topic model
Gibbs sampling
Embedding
Linguistics
Bayesian probability
Authors
Sorawan Limwattana, Santitham Prom-on
Source
Venue: International Joint Conference on Computer Science and Software Engineering
Date: 2021-06-30
Citations: 2
Identifier
DOI: 10.1109/jcsse53117.2021.9493816
Abstract
Latent Dirichlet Allocation (LDA) is a powerful technique for extracting topics from documents. The original LDA takes a Bag-of-Words representation as input and produces per-document topic distributions as output. The drawback of Bag-of-Words is that it represents each word as a plain one-hot encoding, which carries no word-level information. Later research in Natural Language Processing (NLP) demonstrated that word embedding techniques such as the Skip-gram model provide representations that capture the relationships and semantic information between words, and many NLP tasks have achieved better performance by using word embeddings to represent words. In this paper, we propose Deep Word-Topic Latent Dirichlet Allocation (DWT-LDA), a new process for training LDA with word embeddings. A neural network with word embeddings is applied in the Collapsed Gibbs Sampling process as an alternative mechanism for word-topic assignment. To evaluate our model quantitatively, we compare our approach with the original LDA using topic coherence and topic diversity metrics. The experimental results show that our method generates more coherent and diverse topics.
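The paper builds on Collapsed Gibbs Sampling for LDA, replacing one step of it with a neural word-topic assignment. As background for that baseline step, here is a minimal sketch of a vanilla collapsed Gibbs sampler for standard LDA; the function name, hyperparameter defaults, and toy corpus are illustrative and not taken from the paper.

```python
import random

def collapsed_gibbs_lda(docs, vocab_size, n_topics, alpha=0.1, beta=0.01,
                        n_iters=50, seed=0):
    """Minimal collapsed Gibbs sampler for vanilla (Bag-of-Words) LDA.

    docs: list of documents, each a list of integer word ids.
    Returns per-token topic assignments plus topic-word and doc-topic counts.
    """
    rng = random.Random(seed)
    # Count tables: topic-word, document-topic, and per-topic totals.
    n_tw = [[0] * vocab_size for _ in range(n_topics)]
    n_dt = [[0] * n_topics for _ in range(len(docs))]
    n_t = [0] * n_topics
    z = []  # z[d][i] = topic assigned to token i of document d
    # Random initialization of all token-topic assignments.
    for d, doc in enumerate(docs):
        z_d = []
        for w in doc:
            t = rng.randrange(n_topics)
            z_d.append(t)
            n_tw[t][w] += 1
            n_dt[d][t] += 1
            n_t[t] += 1
        z.append(z_d)
    for _ in range(n_iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                t = z[d][i]
                # Remove this token's current assignment from the counts.
                n_tw[t][w] -= 1; n_dt[d][t] -= 1; n_t[t] -= 1
                # Collapsed conditional:
                # p(t | rest) ∝ (n_dt + alpha) * (n_tw + beta) / (n_t + V*beta)
                weights = [(n_dt[d][k] + alpha) *
                           (n_tw[k][w] + beta) / (n_t[k] + vocab_size * beta)
                           for k in range(n_topics)]
                t = rng.choices(range(n_topics), weights=weights)[0]
                # Restore counts with the newly sampled topic.
                z[d][i] = t
                n_tw[t][w] += 1; n_dt[d][t] += 1; n_t[t] += 1
    return z, n_tw, n_dt
```

DWT-LDA, as the abstract describes it, swaps the categorical draw above for a choice produced by a neural network over word embeddings, so that word-topic assignment uses semantic rather than purely count-based evidence; the surrounding count bookkeeping stays the same.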