DC Field | Value | Language
dc.contributor | 資訊管理學系 | zh_TW
dc.creator | 陳俞琇 | zh_TW
dc.creator | Yu-Xiu Chen | en_US
dc.date.accessioned | 2019-07-19T07:39:07Z | |
dc.date.available | 2019-07-19T07:39:07Z | |
dc.date.issued | 2019 | |
dc.identifier.uri | http://ir.lib.ncu.edu.tw:444/thesis/view_etd.asp?URN=106423011 | |
dc.contributor.department | 資訊管理學系 | zh_TW |
dc.description | 國立中央大學 | zh_TW
dc.description | National Central University | en_US
dc.description.abstract | 自然語言處理的模型由於需要準備字典給模型做挑選,因此衍生出 Out Of Vocabulary(OOV) 這個問題,是指句子裡面有不存在於字典的用詞,過往有人嘗試在 Recurrent Neural Networks(RNN) 上加入複製機制,以改善這個問題。但 Transformer 是自然語言處理的新模型,不若過往的 RNN 或 Convolutional Neural Networks(CNN) 已經有許多改善機制,因此本研究將針對 Transformer 進行改良,添加額外的輸入和輸出的相互注意力,來完成複製機制的概念,讓 Transformer 能有更佳的表現。 | zh_TW
dc.description.abstract | In natural language processing, out-of-vocabulary (OOV) words have always been an issue: they limit the performance of summarization models. Past studies addressed this problem by adding a copy mechanism to Recurrent Neural Networks (RNN). However, recent work introduced a new model, the Transformer, which outperforms RNNs in many tasks. This work therefore improves the Transformer by adding a copy mechanism, realized as an additional attention between the model's input and output, so that the Transformer can achieve better performance (a minimal code sketch of the copy mechanism follows this record). | en_US
dc.subject | 自然語言處理 | zh_TW
dc.subject | 萃取式摘要 | zh_TW
dc.subject | 注意力機制 | zh_TW
dc.subject | Transformer | zh_TW
dc.subject | 複製機制 | zh_TW
dc.title | 具擷取及萃取能力的摘要模型 | zh_TW
dc.language.iso | zh-TW | zh-TW |
dc.type | 博碩士論文 | zh_TW
dc.type | thesis | en_US
dc.publisher | National Central University | en_US
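
The abstracts above describe the approach only in prose: a copy mechanism grafted onto the Transformer through an additional attention between input and output, so that out-of-vocabulary source words can be copied directly into the summary. Below is a minimal sketch of that idea in the pointer-generator style, assuming PyTorch; the module name CopyHead, the sigmoid gate, and all tensor shapes are this sketch's assumptions, not the thesis's published architecture.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CopyHead(nn.Module):
    # Pointer-generator style copy mechanism on top of a Transformer decoder:
    # mixes the vocabulary distribution with a copy distribution taken from
    # the decoder's cross-attention over the source, so OOV source tokens can
    # be emitted directly. Names and shapes are illustrative assumptions.
    def __init__(self, d_model, vocab_size):
        super().__init__()
        self.generator = nn.Linear(d_model, vocab_size)  # vocabulary logits
        self.gate = nn.Linear(2 * d_model, 1)            # generate-vs-copy gate

    def forward(self, dec_state, context, attn, src_ids, extended_vocab_size):
        # dec_state: (B, T, d)  decoder hidden states
        # context:   (B, T, d)  attention-weighted encoder summary
        # attn:      (B, T, S)  cross-attention weights, each row sums to 1
        # src_ids:   (B, S)     source token ids in the extended vocabulary
        p_vocab = F.softmax(self.generator(dec_state), dim=-1)       # (B, T, V)
        p_gen = torch.sigmoid(self.gate(torch.cat([dec_state, context], dim=-1)))

        # Pad the vocabulary distribution with slots for in-source OOV words
        # (extended_vocab_size must be >= vocab_size), then scatter-add the
        # copy probability mass onto the positions of the source token ids.
        extra = extended_vocab_size - p_vocab.size(-1)
        p_final = p_gen * F.pad(p_vocab, (0, extra))
        index = src_ids.unsqueeze(1).expand(-1, dec_state.size(1), -1)
        return p_final.scatter_add(-1, index, (1 - p_gen) * attn)

A quick check with random inputs confirms the output is still a valid probability distribution over the extended vocabulary, since the gate splits the mass between generation (which sums to 1) and copying (whose attention rows also sum to 1):

B, T, S, d, V = 2, 4, 6, 16, 50
head = CopyHead(d, V)
attn = torch.softmax(torch.randn(B, T, S), dim=-1)
src_ids = torch.randint(0, V + 3, (B, S))               # 3 in-source OOV slots
probs = head(torch.randn(B, T, d), torch.randn(B, T, d), attn, src_ids, V + 3)
assert torch.allclose(probs.sum(-1), torch.ones(B, T))  # still a distribution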