Full metadata record
DC Field | Value | Language
dc.contributor.author | Chien, Jen-Tzung | en_US
dc.contributor.author | Lee, Chao-Hsi | en_US
dc.contributor.author | Tan, Zheng-Hua | en_US
dc.date.accessioned | 2018-08-21T05:53:15Z | -
dc.date.available | 2018-08-21T05:53:15Z | -
dc.date.issued | 2018-02-22 | en_US
dc.identifier.issn | 0925-2312 | en_US
dc.identifier.uri | http://dx.doi.org/10.1016/j.neucom.2017.08.029 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/144459 | -
dc.description.abstract | Text representation based on a latent topic model is a non-Gaussian problem in which the observed words and latent topics are multinomial variables and the topic proportions are Dirichlet variables. A traditional topic model is built by introducing a single Dirichlet prior to characterize the topic proportions, so that the words in a text document are represented as a random mixture of semantic topics. In the real world, however, a single Dirichlet distribution may not faithfully reflect the variation of topic proportions estimated from heterogeneous documents. To address this variation, we propose a new latent variable model in which the latent topics and their proportions are learned under a prior based on a Dirichlet mixture model. The resulting latent Dirichlet mixture model (LDMM) is constructed for topic clustering as well as document clustering. Multiple Dirichlet distributions provide a way to build structured latent variables for learning representations over a variety of topics. This study carries out inference for LDMM via variational Bayes and collapsed variational Bayes. The unsupervised LDMM is further extended to a supervised LDMM for text classification. Experiments on document representation, summarization and classification show the merit of the structured prior in LDMM topic models. (C) 2017 Elsevier B.V. All rights reserved. | en_US
dc.language.iso | en_US | en_US
dc.subject | Bayesian learning | en_US
dc.subject | Topic model | en_US
dc.subject | Dirichlet mixture model | en_US
dc.title | Latent Dirichlet mixture model | en_US
dc.type | Article | en_US
dc.identifier.doi | 10.1016/j.neucom.2017.08.029 | en_US
dc.identifier.journal | NEUROCOMPUTING | en_US
dc.citation.volume | 278 | en_US
dc.citation.spage | 12 | en_US
dc.citation.epage | 22 | en_US
dc.contributor.department | 電機工程學系 (Department of Electrical and Computer Engineering) | zh_TW
dc.contributor.department | Department of Electrical and Computer Engineering | en_US
dc.identifier.wosnumber | WOS:000423965000003 | en_US
Appears in Collections: Articles
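
The abstract above hinges on one change to the standard topic-model generative story: each document's topic proportions are drawn from a mixture of Dirichlet distributions rather than a single Dirichlet. As a minimal sketch of that generative process only (not of the paper's variational Bayes inference), the following Python snippet uses illustrative dimensions and hyperparameter values that are assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the paper):
# K topics, V vocabulary words, M Dirichlet mixture components,
# N documents, L words per document.
K, V, M, N, L = 4, 20, 3, 5, 30

# Topic-word distributions: each topic is a multinomial over the vocabulary.
beta = rng.dirichlet(np.full(V, 0.1), size=K)      # shape (K, V)

# Dirichlet mixture prior on topic proportions: mixture weights pi and
# one concentration vector alpha[m] per component, instead of a single alpha.
pi = rng.dirichlet(np.ones(M))                     # shape (M,)
alpha = rng.gamma(2.0, 1.0, size=(M, K))           # shape (M, K)

docs = []
for _ in range(N):
    m = rng.choice(M, p=pi)                 # pick a Dirichlet component
    theta = rng.dirichlet(alpha[m])         # topic proportions for this doc
    z = rng.choice(K, size=L, p=theta)      # topic assignment for each word
    words = [int(rng.choice(V, p=beta[k])) for k in z]  # draw each word
    docs.append(words)
```

With M = 1 this collapses to the single-Dirichlet prior of a conventional topic model; with M > 1, heterogeneous groups of documents can carry different typical topic proportions, which is the variation the abstract argues a single Dirichlet cannot capture.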