Title: Rapid Bayesian Learning for Recurrent Neural Network Language Model
Authors: Chien, Jen-Tzung
Ku, Yuan-Chu
Huang, Mou-Yue
Undergraduate Honors Program of Electrical Engineering and Computer Science
Keywords: Hessian matrix; Bayesian learning; recurrent neural network language model; speech recognition
Issue Date: 1-Jan-2014
Abstract: This paper presents Bayesian learning for the recurrent neural network language model (RNN-LM). Our goal is to regularize the RNN-LM by compensating for the randomness of the estimated model parameters, which is characterized by a Gaussian prior. The model is not only constructed by training the synaptic weight parameters according to the maximum a posteriori criterion but also regularized by estimating the Gaussian hyperparameter through type-2 maximum likelihood. However, a critical issue in Bayesian RNN-LM is the heavy computation of the Hessian matrix, which is formed as the sum of a large number of outer-products of high-dimensional gradient vectors. We present a rapid approximation that reduces the redundancy due to the curse of dimensionality and speeds up the calculation by summing only the salient outer-products (see the illustrative sketch after this record). Experiments on the 1B-Word Benchmark, Penn Treebank, and Wall Street Journal corpora show that the rapid Bayesian RNN-LM consistently improves perplexity and word error rate in comparison with the standard RNN-LM.
URI: http://hdl.handle.net/11536/125004
ISBN: 978-1-4799-4219-0
ISSN: 
Journal: 2014 9th International Symposium on Chinese Spoken Language Processing (ISCSLP)
Start page: 34
End page: 38
Appears in Collections: Conference Papers
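The abstract describes approximating the Hessian as a sum of outer-products of high-dimensional per-example gradient vectors, and accelerating the computation by summing only the salient terms. The following is a minimal NumPy sketch of that idea, assuming gradients are stacked row-wise in a matrix and that saliency is ranked by gradient L2 norm; the function name, toy dimensions, and norm-based selection rule are illustrative assumptions, not the authors' exact procedure.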
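```python
import numpy as np

def salient_outer_product_hessian(grads, k):
    """Approximate H ~= sum_t g_t g_t^T by keeping only the k most salient
    outer-products.

    grads : (T, D) array whose rows are per-example gradient vectors g_t
    k     : number of outer-products to keep (k << T)

    The saliency criterion used here (gradient L2 norm) is an assumption
    made for illustration; the paper's selection rule may differ.
    """
    norms = np.linalg.norm(grads, axis=1)   # size of each gradient's contribution
    keep = np.argsort(norms)[-k:]           # indices of the k largest-norm gradients
    G = grads[keep]                         # (k, D) selected gradients
    return G.T @ G                          # equals the sum of the k outer-products

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    T, D, k = 1000, 200, 50                 # toy sizes; real RNN-LM gradients are far larger
    grads = rng.standard_normal((T, D))
    H_full = grads.T @ grads                # full sum of T outer-products
    H_fast = salient_outer_product_hessian(grads, k)
    rel_err = np.linalg.norm(H_full - H_fast) / np.linalg.norm(H_full)
    print(f"relative Frobenius error with {k}/{T} terms: {rel_err:.3f}")
```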
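Note that `G.T @ G` accumulates the selected outer-products in a single matrix multiplication, so no individual D-by-D outer-product is ever materialized; this is where the saving over summing all T terms comes from.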