Full metadata record
DC Field | Value | Language
dc.contributor.author | Chien, Jen-Tzung | en_US
dc.contributor.author | Lin, Ting-An | en_US
dc.date.accessioned | 2019-04-02T06:04:20Z | -
dc.date.available | 2019-04-02T06:04:20Z | -
dc.date.issued | 2018-01-01 | en_US
dc.identifier.issn | 2161-0363 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/150838 | -
dc.description.abstract | Attention over natural language aims to spotlight on the meaningful region with representative keywords which can extract desirable meanings to accomplish a task of interest. The attention parameter is a latent variable which was indirectly estimated by minimizing the classification loss. For the task of question answering (QA), the classification loss may not sufficiently reflect the target answer. This paper proposes a direct solution which attends the meaningful region by minimizing the reconstruction loss due to auxiliary or supporting data which are available in different scenarios. In particular, minimizing the classification and reconstruction losses are carried out under the end-to-end memory network so that the memory-augmented question answering is realized. Such a supportive attention is implemented as a sequence-to-sequence model which reconstructs the supporting sentence to assure the translation invariance. The merit of this method is sufficiently demonstrated for sequential learning by using the bAbI QA and dialog tasks. | en_US
dc.language.iso | en_US | en_US
dc.subject | Deep learning | en_US
dc.subject | end-to-end learning | en_US
dc.subject | memory network | en_US
dc.subject | attention mechanism | en_US
dc.subject | sequential learning | en_US
dc.title | SUPPORTIVE ATTENTION IN END-TO-END MEMORY NETWORKS | en_US
dc.type | Proceedings Paper | en_US
dc.identifier.journal | 2018 IEEE 28TH INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP) | en_US
dc.contributor.department | 電機工程學系 | zh_TW
dc.contributor.department | Department of Electrical and Computer Engineering | en_US
dc.identifier.wosnumber | WOS:000450651000036 | en_US
dc.citation.woscount | 0 | en_US
Appears in Collections: Conference Papers
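
The abstract above describes training an end-to-end memory network with two objectives: a classification loss on the predicted answer and a sequence-to-sequence reconstruction loss on the supporting sentence. The following is a minimal PyTorch sketch of that idea; the one-hop memory network, the GRU decoder, the weight lambda_rec, and all class and variable names are illustrative assumptions, not the paper's exact architecture.

# Minimal sketch: one-hop end-to-end memory network whose attended readout
# both predicts the answer and seeds a seq2seq reconstruction of the
# supporting sentence. Names and hyperparameters are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SupportiveMemN2N(nn.Module):
    def __init__(self, vocab_size, embed_dim=64, lambda_rec=0.5):
        super().__init__()
        self.embed_A = nn.Embedding(vocab_size, embed_dim)  # memory input embedding
        self.embed_B = nn.Embedding(vocab_size, embed_dim)  # question embedding
        self.embed_C = nn.Embedding(vocab_size, embed_dim)  # memory output embedding
        self.answer = nn.Linear(embed_dim, vocab_size)      # answer classifier
        self.decoder = nn.GRU(embed_dim, embed_dim, batch_first=True)
        self.dec_out = nn.Linear(embed_dim, vocab_size)     # reconstruction head
        self.lambda_rec = lambda_rec

    def forward(self, story, question, support):
        # story: (B, S, W) word ids, question: (B, W), support: (B, T)
        m = self.embed_A(story).sum(dim=2)    # (B, S, D) memory vectors (bag of words)
        c = self.embed_C(story).sum(dim=2)    # (B, S, D) output vectors
        u = self.embed_B(question).sum(dim=1)                           # (B, D)
        p = F.softmax(torch.bmm(m, u.unsqueeze(2)).squeeze(2), dim=1)  # attention over sentences
        o = torch.bmm(p.unsqueeze(1), c).squeeze(1)                     # (B, D) attended readout
        logits = self.answer(o + u)                                     # answer prediction

        # Seq2seq branch: the readout initializes the decoder state, so the
        # reconstruction loss back-propagates through the attention weights.
        dec_in = self.embed_A(support[:, :-1])          # teacher-forcing inputs
        dec_h, _ = self.decoder(dec_in, o.unsqueeze(0))
        rec_logits = self.dec_out(dec_h)                # (B, T-1, V)
        return logits, rec_logits

def supportive_loss(model, story, question, support, answer_ids):
    logits, rec_logits = model(story, question, support)
    cls_loss = F.cross_entropy(logits, answer_ids)      # classification loss
    rec_loss = F.cross_entropy(                         # reconstruction loss
        rec_logits.reshape(-1, rec_logits.size(-1)), support[:, 1:].reshape(-1))
    return cls_loss + model.lambda_rec * rec_loss

Initializing the decoder from the attended readout is one way to let the reconstruction gradient flow back through the attention weights, so that the supporting data supervises attention directly rather than only through the answer label.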