Master's/Doctoral Thesis 106423011: Complete Metadata Record

DC Field | Value | Language
dc.contributor | 資訊管理學系 (Department of Information Management) | zh_TW
dc.creator | 陳俞琇 | zh_TW
dc.creator | Yu-Xiu Chen | en_US
dc.date.accessioned | 2019-07-19T07:39:07Z |
dc.date.available | 2019-07-19T07:39:07Z |
dc.date.issued | 2019 |
dc.identifier.uri | http://ir.lib.ncu.edu.tw:444/thesis/view_etd.asp?URN=106423011 |
dc.contributor.department | 資訊管理學系 (Department of Information Management) | zh_TW
dc.description | 國立中央大學 (National Central University) | zh_TW
dc.description | National Central University | en_US
dc.description.abstract | Because natural language processing models must pick their outputs from a prepared vocabulary, the Out-of-Vocabulary (OOV) problem arises: a sentence may contain words that do not exist in the vocabulary. Previous work has tried to alleviate this problem by adding a copy mechanism to Recurrent Neural Networks (RNNs). The Transformer, however, is a newer model for natural language processing and, unlike RNNs or Convolutional Neural Networks (CNNs), does not yet have many such refinements. This study therefore improves the Transformer by adding an extra cross-attention between the input and the output to realize the idea of a copy mechanism, allowing the Transformer to perform better. | zh_TW
dc.description.abstract | In natural language processing, Out-of-Vocabulary (OOV) words have always been an issue, and they limit the performance of summarization models. Past studies resolved this problem by adding a copy mechanism to Recurrent Neural Networks (RNNs). However, recent work introduced a new model, the Transformer, which outperforms RNNs in many tasks. This work therefore improves the Transformer model by adding a copy mechanism in order to strengthen the relation between the model's input and its output. (A minimal code sketch of such a copy mechanism follows this record.) | en_US
dc.subject | 自然語言處理 (Natural Language Processing) | zh_TW
dc.subject | 萃取式摘要 (Abstractive Summarization) | zh_TW
dc.subject | 注意力機制 (Attention Mechanism) | zh_TW
dc.subject | Transformer | zh_TW
dc.subject | 複製機制 (Copy Mechanism) | zh_TW
dc.title | 具擷取及萃取能力的摘要模型 (A Summarization Model with Extractive and Abstractive Capabilities) | zh_TW
dc.language.iso | zh-TW | zh-TW
dc.type | 博碩士論文 (Master's/Doctoral Thesis) | zh_TW
dc.type | thesis | en_US
dc.publisher | National Central University | en_US
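The abstracts describe the approach only in prose. As a rough, non-authoritative illustration of a pointer-generator style copy mechanism of the kind they refer to, here is a minimal PyTorch sketch. Every name in it (the class CopyMechanism, the gate, the tensor shapes) is an assumption made for illustration, not the thesis's actual code; the thesis realizes the copy path through an extra cross-attention between the Transformer's input and output, which may differ in detail.

```python
import torch
import torch.nn as nn


class CopyMechanism(nn.Module):
    """Pointer-generator style copy layer (hypothetical sketch).

    Mixes the decoder's generation distribution over the vocabulary with a
    copy distribution obtained from the decoder's attention over the source
    tokens, so source words outside the vocabulary can still be produced.
    """

    def __init__(self, hidden_size: int, vocab_size: int):
        super().__init__()
        self.generate = nn.Linear(hidden_size, vocab_size)  # vocabulary logits
        self.gate = nn.Linear(hidden_size, 1)               # generate-vs-copy gate

    def forward(self, dec_hidden, attn_weights, src_ids):
        # dec_hidden:   (batch, tgt_len, hidden)  decoder states
        # attn_weights: (batch, tgt_len, src_len) cross-attention over the source
        # src_ids:      (batch, src_len)          source token ids (shared vocab)
        vocab_dist = torch.softmax(self.generate(dec_hidden), dim=-1)
        p_gen = torch.sigmoid(self.gate(dec_hidden))        # (batch, tgt_len, 1)

        # Scatter the attention mass onto the vocabulary axis so each source
        # token contributes probability to its own id; this is the copy path.
        index = src_ids.unsqueeze(1).expand(-1, dec_hidden.size(1), -1)
        copy_dist = torch.zeros_like(vocab_dist).scatter_add(-1, index, attn_weights)

        # Final distribution: weighted mixture of generating and copying.
        return p_gen * vocab_dist + (1.0 - p_gen) * copy_dist
```

When p_gen is close to zero, the model effectively points at a source position instead of generating from the vocabulary, which is how a word from the input that is missing from the vocabulary can still appear in the output summary. In the setting the abstracts describe, such a copy path would sit on top of the Transformer decoder's encoder-decoder attention rather than an RNN's attention.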
