Full metadata record

DC Field | Value | Language
dc.contributor.author | Chien, Jen-Tzung | en_US
dc.date.accessioned | 2019-04-02T06:04:46Z | -
dc.date.available | 2019-04-02T06:04:46Z | -
dc.date.issued | 2015-01-01 | en_US
dc.identifier.issn | 1520-6149 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/150694 | -
dc.description.abstract | We present a new full Bayesian approach to language modeling based on shared Dirichlet priors. The model is constructed by introducing the Dirichlet distribution to represent the uncertainty of n-gram parameters in the training phase as well as at test time. Given a set of training data, the marginal likelihood over n-gram probabilities takes the form of linearly interpolated n-grams. The hyperparameters of the Dirichlet distributions are interpreted as prior backoff information shared across a group of n-gram histories. This study estimates the shared hyperparameters by maximizing the marginal distribution of the n-grams given the training data. This Bayesian language model is thereby connected to the smoothed language model. Experimental results show the superiority of the proposed method over other methods in terms of perplexity and word error rate. (An illustrative sketch of this formulation follows the record below.) | en_US
dc.language.iso | en_US | en_US
dc.subject | Bayesian learning | en_US
dc.subject | language model | en_US
dc.subject | model smoothing | en_US
dc.subject | optimal hyperparameter | en_US
dc.title | THE SHARED DIRICHLET PRIORS FOR BAYESIAN LANGUAGE MODELING | en_US
dc.type | Proceedings Paper | en_US
dc.identifier.journal | 2015 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING (ICASSP) | en_US
dc.citation.spage | 2081 | en_US
dc.citation.epage | 2085 | en_US
dc.contributor.department | 電機工程學系 | zh_TW
dc.contributor.department | Department of Electrical and Computer Engineering | en_US
dc.identifier.wosnumber | WOS:000427402902038 | en_US
dc.citation.woscount | 1 | en_US
Appears in Collections: Conference Papers
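
The abstract's central idea — a Dirichlet prior over n-gram parameters whose posterior predictive probability reduces to a linear interpolation between the maximum-likelihood n-gram estimate and shared backoff pseudo-counts, with hyperparameters chosen by maximizing the marginal likelihood — can be sketched in a few lines. This is a minimal illustration under simplifying assumptions (a bigram model, one hyperparameter vector alpha shared by all histories), not the authors' implementation; the names train_counts, predictive_prob, and log_marginal_likelihood are hypothetical.

    import math
    from collections import Counter

    def train_counts(corpus):
        """Collect bigram and history counts from tokenized sentences."""
        bigrams, histories = Counter(), Counter()
        for sent in corpus:
            for h, w in zip(sent, sent[1:]):
                bigrams[(h, w)] += 1
                histories[h] += 1
        return bigrams, histories

    def predictive_prob(w, h, bigrams, histories, alpha):
        """Posterior predictive p(w | h) under a Dirichlet prior.

        p(w | h) = (c(h, w) + alpha_w) / (c(h) + sum_w alpha_w),
        i.e. a linear interpolation between the ML bigram estimate
        c(h, w) / c(h) and the normalized prior counts, which play
        the role of the shared backoff information in the abstract.
        alpha: dict mapping each vocabulary word to its pseudo-count,
        shared across the group of histories (an assumption here).
        """
        alpha_sum = sum(alpha.values())
        return (bigrams[(h, w)] + alpha[w]) / (histories[h] + alpha_sum)

    def log_marginal_likelihood(bigrams, histories, alpha):
        """log p(data | alpha): the Dirichlet-multinomial evidence.

        This is the quantity one would maximize with respect to the
        shared hyperparameters, per the abstract's estimation scheme.
        """
        alpha_sum = sum(alpha.values())
        ll = 0.0
        for h, n_h in histories.items():
            ll += math.lgamma(alpha_sum) - math.lgamma(alpha_sum + n_h)
            for w, a_w in alpha.items():
                c = bigrams[(h, w)]
                ll += math.lgamma(a_w + c) - math.lgamma(a_w)
        return ll

    if __name__ == "__main__":
        # Toy corpus; vocabulary and alpha values are illustrative only.
        corpus = [["<s>", "a", "b", "a", "</s>"], ["<s>", "b", "a", "</s>"]]
        alpha = {w: 0.5 for w in ("a", "b", "</s>")}
        bigrams, histories = train_counts(corpus)
        print(predictive_prob("a", "<s>", bigrams, histories, alpha))
        print(log_marginal_likelihood(bigrams, histories, alpha))

In this toy run, p("a" | "<s>") = (1 + 0.5) / (2 + 1.5): the observed count is smoothed toward the shared pseudo-counts, exactly the linearly interpolated form the abstract describes.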