Full metadata record
DC Field | Value | Language
dc.contributor.author | Chien, Jen-Tzung | en_US
dc.contributor.author | Kuo, Che-Yu | en_US
dc.date.accessioned | 2020-07-01T05:21:48Z | -
dc.date.available | 2020-07-01T05:21:48Z | -
dc.date.issued | 2019-01-01 | en_US
dc.identifier.isbn | 978-1-7281-0306-8 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/154482 | -
dc.description.abstract | Recurrent neural networks (RNNs) have achieved great success in language modeling, where temporal information based on a deterministic state is continuously extracted and evolved through time. Such a simple deterministic transition function, using input-to-hidden and hidden-to-hidden weights, is usually insufficient to reflect the diversity and variation of the latent variable structure behind heterogeneous natural language. This paper presents a new stochastic Markov RNN (MRNN) to strengthen the learning capability of the language model, where the trajectory of word sequences is driven by a neural Markov process with Markov state transitions based on a K-state long short-term memory model. A latent state machine is constructed to characterize the complicated semantics in structured lexical patterns. Gumbel-softmax is introduced to implement the stochastic backpropagation algorithm with discrete states. Parallel computation for rapid realization of the MRNN is presented, and a variational Bayesian learning procedure is implemented. Experiments demonstrate the merits of the stochastic and diverse representation of the MRNN language model, where the overhead in parameters and computation is limited. | en_US
dc.language.iso | en_US | en_US
dc.subject | language model | en_US
dc.subject | neural Markov process | en_US
dc.subject | recurrent neural network | en_US
dc.title | MARKOV RECURRENT NEURAL NETWORK LANGUAGE MODEL | en_US
dc.type | Proceedings Paper | en_US
dc.identifier.journal | 2019 IEEE AUTOMATIC SPEECH RECOGNITION AND UNDERSTANDING WORKSHOP (ASRU 2019) | en_US
dc.citation.spage | 807 | en_US
dc.citation.epage | 813 | en_US
dc.contributor.department | 電機工程學系 (Department of Electrical and Computer Engineering) | zh_TW
dc.contributor.department | Department of Electrical and Computer Engineering | en_US
dc.identifier.wosnumber | WOS:000539883100108 | en_US
dc.citation.woscount | 0 | en_US

Appears in Collections: Conferences Paper
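The abstract names Gumbel-softmax as the mechanism for backpropagating through the discrete Markov states. The paper's MRNN itself is not reproduced here; the following is only a minimal NumPy sketch of the standard Gumbel-softmax relaxation, with an illustrative K = 4 state example and logit values that are assumptions, not taken from the paper.

```python
import numpy as np

def gumbel_softmax(logits, tau=1.0, rng=None):
    """Draw a relaxed (continuous) one-hot sample from a categorical distribution.

    Adds Gumbel(0, 1) noise to the logits and applies a temperature-scaled
    softmax. As tau -> 0 the samples approach one-hot vectors, while staying
    differentiable, which is what lets gradients flow through a discrete
    state choice.
    """
    rng = rng or np.random.default_rng()
    # Gumbel(0, 1) noise via the inverse-CDF trick; epsilons guard the logs.
    u = rng.uniform(size=logits.shape)
    gumbel = -np.log(-np.log(u + 1e-20) + 1e-20)
    y = (logits + gumbel) / tau
    y = y - y.max(axis=-1, keepdims=True)  # numerical stability before exp
    expy = np.exp(y)
    return expy / expy.sum(axis=-1, keepdims=True)

# Hypothetical logits over K = 4 latent states; low tau gives near one-hot samples.
logits = np.array([1.0, 0.5, -0.2, 0.1])
sample = gumbel_softmax(logits, tau=0.5)
print(sample)  # non-negative entries summing to 1
```

A lower temperature sharpens the samples toward hard state selections; in training one typically anneals `tau` downward so early gradients are smooth and later samples are nearly discrete.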