Full metadata record
DC Field | Value | Language
dc.contributor.author | Chien, Jen-Tzung | en_US
dc.contributor.author | Lee, Chao-Hsi | en_US
dc.contributor.author | Tan, Zheng-Hua | en_US
dc.date.accessioned | 2017-04-21T06:49:22Z | -
dc.date.available | 2017-04-21T06:49:22Z | -
dc.date.issued | 2016 | en_US
dc.identifier.isbn | 978-1-5090-0746-2 | en_US
dc.identifier.issn | 2161-0363 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/134552 | -
dc.description.abstract | The topic model based on latent Dirichlet allocation relies on the prior statistics of topic proportions for multinomial words. The words in a document are modeled as a random mixture of latent topics which are drawn from a single Dirichlet prior. However, a single Dirichlet distribution may not sufficiently characterize the variations of topic proportions estimated from heterogeneous documents. To deal with this concern, we present a Dirichlet mixture allocation (DMA) model which learns latent topics and their proportions for topic and document clustering by using a prior based on a Dirichlet mixture model. Multiple Dirichlets pave the way to capture the structure of latent variables in learning representations from real-world documents covering a variety of topics. This paper builds a new latent variable model and develops a variational Bayesian inference procedure to learn model parameters consisting of mixture weights, Dirichlet parameters, and word multinomials. Experiments on document representation show the merit of the proposed structural learning by increasing the number of Dirichlets in a DMA topic model. (See the generative-process sketch following this record.) | en_US
dc.language.iso | en_US | en_US
dc.subject | Bayesian learning | en_US
dc.subject | topic model | en_US
dc.subject | structural learning | en_US
dc.subject | Dirichlet mixture model | en_US
dc.title | DIRICHLET MIXTURE ALLOCATION | en_US
dc.type | Proceedings Paper | en_US
dc.identifier.journal | 2016 IEEE 26TH INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP) | en_US
dc.contributor.department | 電機工程學系 | zh_TW
dc.contributor.department | Department of Electrical and Computer Engineering | en_US
dc.identifier.wosnumber | WOS:000392177200058 | en_US
dc.citation.woscount | 0 | en_US
Appears in Collections: Conference Papers
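The abstract above describes the DMA generative story: a document's topic proportions are drawn from one component of a Dirichlet mixture rather than from a single Dirichlet. Below is a minimal sketch of that process, reconstructed from the abstract alone; the notation (mixture weights pi, component indicator c_d, Dirichlet parameters alpha_k, topic proportions theta_d, topic assignments z_{dn}, and topic-word multinomials beta_t) is assumed here for illustration and is not taken from the record or the paper.

% Sketch of the DMA generative process for document d with words indexed by n,
% assuming K Dirichlet mixture components; all symbols are assumed notation.
\begin{align*}
  c_d &\sim \mathrm{Categorical}(\pi_1,\dots,\pi_K)           && \text{pick one of the $K$ Dirichlet mixture components}\\
  \theta_d \mid c_d = k &\sim \mathrm{Dirichlet}(\alpha_k)    && \text{topic proportions from the selected component}\\
  z_{dn} \mid \theta_d &\sim \mathrm{Categorical}(\theta_d)   && \text{latent topic of the $n$-th word}\\
  w_{dn} \mid z_{dn} = t &\sim \mathrm{Categorical}(\beta_t)  && \text{word drawn from that topic's multinomial}
\end{align*}

Under this reading, setting K = 1 recovers standard latent Dirichlet allocation with its single Dirichlet prior, and the variational Bayesian procedure mentioned in the abstract would estimate the mixture weights, the Dirichlet parameters, and the word multinomials from the document collection.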