Title: The Shared Dirichlet Priors for Bayesian Language Modeling
Authors: Chien, Jen-Tzung
College of Electrical and Computer Engineering
Keywords: Bayesian learning;language model;model smoothing;optimal hyperparameter
Issue Date: 2015
Abstract: We present a new fully Bayesian approach to language modeling based on shared Dirichlet priors. The model is constructed by introducing a Dirichlet distribution to represent the uncertainty of n-gram parameters in the training phase as well as at test time. Given a set of training data, the marginal likelihood over n-gram probabilities takes the form of a linearly-interpolated n-gram model. The hyperparameters of the Dirichlet distributions are interpreted as prior backoff information shared across a group of n-gram histories. This study estimates the shared hyperparameters by maximizing the marginal distribution of the n-grams given the training data. This Bayesian language model is thereby connected to the smoothed language model. Experimental results show the superiority of the proposed method over competing methods in terms of perplexity and word error rate.
URI: http://hdl.handle.net/11536/135715
ISBN: 978-1-4673-6997-8
ISSN: 1520-6149
Journal: 2015 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING (ICASSP)
Begin Page: 2081
End Page: 2085
Appears in Collections: Conference Papers
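The abstract notes that the marginal likelihood under a Dirichlet prior takes the form of a linearly-interpolated n-gram model. A minimal sketch of that idea for bigrams is below; the function name, toy counts, and concentration value `alpha0` are illustrative assumptions, not taken from the paper. The prior mean is set to a unigram (backoff) distribution, so the Dirichlet-multinomial predictive probability works out to a linear interpolation between the maximum-likelihood bigram estimate and the backoff distribution.

```python
def dirichlet_ngram_prob(word, history, bigram_counts, backoff_probs, alpha0=100.0):
    """Predictive bigram probability under a Dirichlet prior with
    hyperparameters alpha0 * backoff_probs[w] (sketch, not the paper's code).

    (c(h,w) + alpha0 * p_bo(w)) / (c(h) + alpha0)
    is algebraically a linear interpolation:
    lam * p_ML(w|h) + (1 - lam) * p_bo(w),  with lam = c(h) / (c(h) + alpha0).
    """
    c_hw = bigram_counts.get((history, word), 0)
    # total count of the history h (sum over all continuations)
    c_h = sum(n for (h, _), n in bigram_counts.items() if h == history)
    return (c_hw + alpha0 * backoff_probs[word]) / (c_h + alpha0)


# Toy example: illustrative counts and a unigram backoff distribution.
counts = {("a", "b"): 2, ("a", "c"): 1, ("b", "a"): 3}
unigram = {"a": 0.5, "b": 0.3, "c": 0.2}
p_b_given_a = dirichlet_ngram_prob("b", "a", counts, unigram, alpha0=10.0)
```

In this sketch a single `alpha0` plays the role of shared prior backoff information for a group of histories; the paper's contribution is estimating such shared hyperparameters by maximizing the marginal likelihood of the training data, which the toy value above does not do.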