Title: SUPPORTIVE ATTENTION IN END-TO-END MEMORY NETWORKS
Authors: Chien, Jen-Tzung
Lin, Ting-An
Department: Department of Electrical and Computer Engineering
Keywords: Deep learning;end-to-end learning;memory network;attention mechanism;sequential learning
Issue Date: 1-Jan-2018
Abstract: Attention over natural language aims to spotlight the meaningful region with representative keywords, extracting the desirable meanings needed to accomplish a task of interest. The attention parameter is a latent variable that is conventionally estimated only indirectly, by minimizing the classification loss. For the task of question answering (QA), however, the classification loss may not sufficiently reflect the target answer. This paper proposes a direct solution that attends to the meaningful region by minimizing a reconstruction loss over auxiliary or supporting data, which are available in different scenarios. In particular, the classification and reconstruction losses are minimized jointly under an end-to-end memory network, so that memory-augmented question answering is realized. This supportive attention is implemented as a sequence-to-sequence model that reconstructs the supporting sentence to assure translation invariance. The merit of this method is demonstrated for sequential learning on the bAbI QA and dialog tasks.
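The joint objective sketched in the abstract, a classification loss plus a reconstruction loss tied to the supporting sentence, can be illustrated roughly as follows. This is a minimal NumPy sketch with hypothetical names and toy dimensions; a mean-squared term stands in for the paper's actual sequence-to-sequence reconstruction decoder, and the dot-product attention follows the standard end-to-end memory network (MemN2N) match step:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def supportive_attention_loss(query, memory, answer_idx, support_vec,
                              W_out, lam=0.5):
    """Joint loss: answer classification + supporting-sentence reconstruction.

    query       : (d,)   embedded question
    memory      : (n, d) embedded story sentences (memory slots)
    answer_idx  : int    index of the target answer word
    support_vec : (d,)   embedding of the supporting sentence
    W_out       : (v, d) output projection to the answer vocabulary
    lam         : weight on the reconstruction term (assumed hyperparameter)
    """
    # Attention over memory slots: dot-product match, then softmax
    p = softmax(memory @ query)
    # Attended memory summary (weighted sum of slots)
    o = p @ memory
    # Classification branch: predict the answer from summary + query
    probs = softmax(W_out @ (o + query))
    cls_loss = -np.log(probs[answer_idx] + 1e-12)
    # Reconstruction branch: stand-in for the seq2seq decoder, penalizing
    # distance between the attended summary and the supporting sentence
    rec_loss = np.mean((o - support_vec) ** 2)
    return cls_loss + lam * rec_loss
```

Minimizing the reconstruction term pushes the attention weights `p` toward the supporting sentence directly, rather than relying on the classification gradient alone to discover it.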
URI: http://hdl.handle.net/11536/150838
ISSN: 2161-0363
Journal: 2018 IEEE 28TH INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP)
Appears in Collections: Conference Papers