Please use this permanent URL to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/86637


Title: 結合Selective Mechanism與多向注意力機制應用於自動文本摘要之研究 (Combining the Selective Mechanism with Multi-Head Attention for Automatic Text Summarization)
Authors: Tung, Tzu-Hsuan (董子瑄)
Contributors: Department of Information Management
Keywords: Transformer; Selective mechanism; Self-attention; Abstractive summarization; Chinese summarization
Date: 2021-08-02
Upload time: 2021-12-07 13:02:48 (UTC+8)
Publisher: National Central University
Abstract: Text summarization aims to re-present the original text in condensed form while retaining its key points and original semantics. This research combines a selective mechanism with the multi-head attention of the Transformer model to improve the quality of the summaries generated by an abstractive summarization model. A trainable selective gate network filters the multi-head attention outputs of the Transformer encoder, keeping the key information that should appear in the summary and discarding secondary information, to construct a second-level representation. This second-level representation is a refined sentence representation from which the decoder generates a better summary.
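The gating step above is described only at a high level in this record. Below is a minimal PyTorch-style sketch of one plausible reading, assuming a SEASS-style sigmoid gate conditioned on a mean-pooled sentence vector; the class name SelectiveGate, the pooling choice, and the layer shapes are illustrative assumptions, not the thesis's implementation.

```python
import torch
import torch.nn as nn


class SelectiveGate(nn.Module):
    """Illustrative selective gate over Transformer encoder outputs.

    Assumption: a SEASS-style gate, sigmoid(W*h_i + U*s + b), where h_i is a
    token representation and s is a mean-pooled sentence vector. The thesis
    may define the gate differently.
    """

    def __init__(self, d_model: int):
        super().__init__()
        self.token_proj = nn.Linear(d_model, d_model)
        self.sent_proj = nn.Linear(d_model, d_model)

    def forward(self, enc_out: torch.Tensor, pad_mask: torch.Tensor) -> torch.Tensor:
        # enc_out: (batch, seq_len, d_model) encoder / multi-head attention outputs
        # pad_mask: (batch, seq_len), True at real (non-padding) token positions
        mask = pad_mask.unsqueeze(-1).float()
        # Sentence-level context vector via masked mean pooling (assumption).
        sent = (enc_out * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
        gate = torch.sigmoid(self.token_proj(enc_out) + self.sent_proj(sent).unsqueeze(1))
        # Second-level representation: gated (filtered) encoder outputs.
        return enc_out * gate * mask


# Usage sketch: filter a standard encoder's output before feeding the decoder.
d_model, batch, seq_len = 512, 2, 10
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True), num_layers=2
)
gate = SelectiveGate(d_model)

x = torch.randn(batch, seq_len, d_model)                 # embedded source tokens
pad_mask = torch.ones(batch, seq_len, dtype=torch.bool)  # no padding in this toy batch
second_level = gate(encoder(x), pad_mask)                # decode from this instead of enc_out
print(second_level.shape)                                # torch.Size([2, 10, 512])
```

In this reading, the decoder attends over second_level rather than the raw encoder outputs, which is how the filtering would influence the generated summary.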
The model is applied to the Chinese text summarization task and evaluated with ROUGE. Experimental results show that it surpasses the baseline on ROUGE-1, ROUGE-2, and ROUGE-L, improving word-based ROUGE by about 7.3 to 12.7% and character-based ROUGE by about 4.9 to 7.9%. Moreover, combining word-to-character tokenization with a larger vocabulary further improves all ROUGE metrics: word-based ROUGE gains another 20.4 to 41.8% and character-based ROUGE another 21.5 to 31.1%.
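Word-based and character-based ROUGE differ only in the token unit used for overlap counting: segmented words versus individual characters. A minimal, self-contained illustration with a hand-segmented toy sentence pair and a plain unigram ROUGE-1 F1 (the thesis's exact ROUGE configuration and word segmenter are not specified in this record):

```python
from collections import Counter


def rouge1_f1(candidate_tokens, reference_tokens):
    """Unigram ROUGE-1 F1 from token-count overlap."""
    overlap = sum((Counter(candidate_tokens) & Counter(reference_tokens)).values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(candidate_tokens)
    recall = overlap / len(reference_tokens)
    return 2 * precision * recall / (precision + recall)


# Toy example; words are hand-segmented here instead of using a real segmenter.
reference = "模型 提升 摘要 品質"
candidate = "模型 提高 摘要 品質"

word_f1 = rouge1_f1(candidate.split(), reference.split())          # tokens = words
char_f1 = rouge1_f1(list(candidate.replace(" ", "")),
                    list(reference.replace(" ", "")))               # tokens = characters

print(f"word-based ROUGE-1 F1: {word_f1:.2f}")       # 0.75
print(f"character-based ROUGE-1 F1: {char_f1:.2f}")  # 0.88
```

Character-based scoring gives partial credit when different words share characters (提升 vs. 提高 above), which is why the two variants are reported separately for Chinese.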
Appears in Collections: [Graduate Institute of Information Management] Master's and Doctoral Theses

Files in This Item:
index.html (HTML, 0Kb, 81 views)


All items in NCUIR are protected by the original copyright.

