

    Please use this identifier to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/81285


    Title: 具擷取及萃取能力的摘要模型 (A Summarization Model with Both Extractive and Abstractive Capabilities)
    Authors: 陳俞琇;Chen, Yu-Xiu
    Contributors: 資訊管理學系 (Department of Information Management)
    Keywords: natural language processing; extractive summarization; attention mechanism; Transformer; copy mechanism
    Date: 2019-07-19
    Issue Date: 2019-09-03 15:42:40 (UTC+8)
    Publisher: 國立中央大學 (National Central University)
    Abstract: Because natural language processing models must select their output words from a fixed, predefined vocabulary, they suffer from the Out-of-Vocabulary (OOV) problem: input sentences may contain words that do not appear in the vocabulary, which limits the performance of summarization models. Prior work mitigated this by adding a copy mechanism to Recurrent Neural Networks (RNNs). The Transformer, however, is a newer model for natural language processing and, unlike RNNs or Convolutional Neural Networks (CNNs), has not yet accumulated many such refinements. This study therefore improves the Transformer by adding an extra cross-attention between the input and the output to realize the copy mechanism, strengthening the relation between the model's input and its output and allowing the Transformer to perform better.
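    A copy mechanism of the kind the abstract describes is commonly realized in pointer-generator style: a generation probability p_gen mixes the decoder's distribution over the fixed vocabulary with the input-output attention weights, which are scattered back onto the source token ids so that OOV words can be copied directly from the input. The sketch below is an illustrative, framework-free reconstruction of that idea, not code from the thesis; all names and numbers are assumptions.

    ```python
    def copy_distribution(vocab_dist, attn_weights, src_ids, p_gen):
        """Mix a generation distribution with attention-based copy probabilities.

        vocab_dist   : probabilities over the fixed vocabulary (sums to 1)
        attn_weights : attention over source positions (sums to 1)
        src_ids      : vocabulary id of each source token; OOV tokens get
                       extended ids >= len(vocab_dist)
        p_gen        : probability of generating from the vocabulary
                       (1 - p_gen is the probability of copying)
        """
        vocab_size = len(vocab_dist)
        # Extend the vocabulary with ids for the OOV tokens seen in the source.
        extended_size = max([vocab_size] + [i + 1 for i in src_ids])
        final = [p_gen * p for p in vocab_dist] + [0.0] * (extended_size - vocab_size)
        # Scatter the copy probability mass back onto the source token ids.
        for pos, tok_id in enumerate(src_ids):
            final[tok_id] += (1.0 - p_gen) * attn_weights[pos]
        return final

    # Example: vocabulary of 5 words; the source holds one OOV token (extended id 5).
    dist = copy_distribution(
        vocab_dist=[0.4, 0.3, 0.1, 0.1, 0.1],
        attn_weights=[0.7, 0.3],
        src_ids=[5, 2],   # position 0 is OOV, position 1 is in-vocabulary
        p_gen=0.8,
    )
    print(round(sum(dist), 6))  # still a valid distribution -> 1.0
    print(dist[5] > 0)          # the OOV token can now be produced -> True
    ```

    Because the final distribution covers the extended vocabulary, an OOV word that appears in the input receives nonzero probability through the attention term, which is exactly what the plain Transformer's fixed-vocabulary softmax cannot provide.
    
    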
    Appears in Collections: [資訊管理研究所 (Graduate Institute of Information Management)] Master's and doctoral theses

    Files in This Item:

    File       | Description | Size | Format | Access
    index.html |             | 0Kb  | HTML   | View/Open


    All items in NCUIR are protected by copyright, with all rights reserved.
