Full Metadata Record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Chien, Jen-Tzung | en_US |
dc.contributor.author | Lin, Ting-An | en_US |
dc.date.accessioned | 2019-04-02T06:04:20Z | - |
dc.date.available | 2019-04-02T06:04:20Z | - |
dc.date.issued | 2018-01-01 | en_US |
dc.identifier.issn | 2161-0363 | en_US |
dc.identifier.uri | http://hdl.handle.net/11536/150838 | - |
dc.description.abstract | Attention over natural language aims to spotlight the meaningful region with representative keywords, from which desirable meanings can be extracted to accomplish a task of interest. The attention parameter is a latent variable that is conventionally estimated only indirectly, by minimizing the classification loss. For the task of question answering (QA), the classification loss may not sufficiently reflect the target answer. This paper proposes a direct solution that attends to the meaningful region by also minimizing a reconstruction loss over auxiliary or supporting data, which are available in different scenarios. In particular, the classification and reconstruction losses are minimized jointly under an end-to-end memory network, so that memory-augmented question answering is realized. This supportive attention is implemented as a sequence-to-sequence model that reconstructs the supporting sentence to assure translation invariance. The merit of the method is demonstrated on sequential learning using the bAbI QA and dialog tasks. | en_US |
dc.language.iso | en_US | en_US |
dc.subject | Deep learning | en_US |
dc.subject | end-to-end learning | en_US |
dc.subject | memory network | en_US |
dc.subject | attention mechanism | en_US |
dc.subject | sequential learning | en_US |
dc.title | SUPPORTIVE ATTENTION IN END-TO-END MEMORY NETWORKS | en_US |
dc.type | Proceedings Paper | en_US |
dc.identifier.journal | 2018 IEEE 28TH INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP) | en_US |
dc.contributor.department | 電機工程學系 | zh_TW |
dc.contributor.department | Department of Electrical and Computer Engineering | en_US |
dc.identifier.wosnumber | WOS:000450651000036 | en_US |
dc.citation.woscount | 0 | en_US |
Appears in Collections: | Conference Papers |
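The abstract describes a joint objective: the usual classification loss on the predicted answer plus a reconstruction loss over the supporting sentence decoded by a sequence-to-sequence model. The sketch below illustrates that joint loss in plain NumPy under stated assumptions; the function names, the per-token cross-entropy form of the reconstruction term, and the weighting parameter `alpha` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(logits, target):
    # negative log-probability of the target index
    return -np.log(softmax(logits)[target])

def supportive_attention_loss(answer_logits, answer_id,
                              recon_logits, support_ids, alpha=1.0):
    """Joint objective sketched from the abstract (illustrative only).

    answer_logits : scores over the answer vocabulary (classification head)
    recon_logits  : one logit vector per token of the supporting sentence,
                    as decoded by a hypothetical seq2seq reconstructor
    alpha         : assumed trade-off weight between the two losses
    """
    # classification loss on the predicted answer
    cls_loss = cross_entropy(answer_logits, answer_id)
    # reconstruction loss: token-wise cross-entropy over the
    # supporting sentence (the "supportive attention" signal)
    rec_loss = sum(cross_entropy(l, t)
                   for l, t in zip(recon_logits, support_ids))
    return cls_loss + alpha * rec_loss
```

With `alpha=0.0` the objective reduces to the indirect, classification-only estimation the abstract contrasts against; a positive `alpha` adds the direct supervision from reconstructing the supporting sentence.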