NCU Institutional Repository - Item 987654321/81262


    Please use this identifier to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/81262


    Title: 利用 Attentive 來改善端對端中文語篇剖析遞迴類神經網路系統;Using Attentive to improve Recursive LSTM End-to-End Chinese Discourse Parsing
    Authors: 王育任;Wang, Yu-Jen
    Contributors: Department of Computer Science and Information Engineering (資訊工程學系)
    Keywords: 深度學習;篇章剖析;注意力機制;遞迴類神經網路;Deep Learning;Discourse Parsing;Attention;Recursive neural network
    Date: 2019-08-08
    Issue Date: 2019-09-03 15:41:10 (UTC+8)
    Publisher: National Central University (國立中央大學)
    Abstract: Discourse parsing helps us understand the relationships and connections between sentences from different perspectives, but the tree-structured data it requires still relies on manual annotation, which keeps the technique from being applied directly to arbitrary texts. Many studies have therefore worked on having computers parse a text automatically and build a complete parse tree. For the Chinese corpus CDTB, building a complete discourse parser involves four main sub-problems: elementary discourse unit (EDU) segmentation, tree structure construction, sense (relation) labeling, and center labeling.
    Because deep learning has advanced rapidly in recent years, the construction of discourse parsers has also moved from traditional methods such as SVM and CRF to recursive neural networks. In this thesis we incorporate several state-of-the-art deep learning techniques, such as Attentive RvNN, self-attention, and BERT, to improve the model's accuracy.
    In the end, we raise the F1 score of every task by roughly 10%, reaching the best performance among the studies known to us.
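    The abstract only names the building blocks (attention-weighted recursive composition over EDU vectors, self-attention, BERT encodings). The following is a minimal PyTorch sketch of what an attentive recursive composition step and a greedy tree-building loop might look like; the class names, the 64-dimensional vectors, and the greedy merge strategy are illustrative assumptions, not the thesis's actual architecture.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class AttentiveComposition(nn.Module):
        """Compose two child span vectors into one parent vector,
        weighting the children with a learned attention score.
        (Illustrative sketch, not the thesis's exact formulation.)"""
        def __init__(self, dim):
            super().__init__()
            self.attn = nn.Linear(dim, 1)           # scores each child
            self.compose = nn.Linear(2 * dim, dim)  # merges the weighted pair

        def forward(self, left, right):             # left, right: (dim,)
            children = torch.stack([left, right])            # (2, dim)
            weights = F.softmax(self.attn(children), dim=0)  # (2, 1)
            weighted = children * weights                    # (2, dim)
            return torch.tanh(self.compose(weighted.reshape(-1)))

    class GreedyTreeBuilder(nn.Module):
        """Greedily merge the adjacent EDU pair with the highest score
        until a single root remains, yielding a binary parse tree."""
        def __init__(self, dim):
            super().__init__()
            self.cell = AttentiveComposition(dim)
            self.merge_score = nn.Linear(dim, 1)

        def forward(self, edu_vectors):              # list of (dim,) tensors
            spans = list(edu_vectors)
            merges = []                               # record of merge positions
            while len(spans) > 1:
                candidates = [self.cell(spans[i], spans[i + 1])
                              for i in range(len(spans) - 1)]
                scores = torch.stack([self.merge_score(c) for c in candidates])
                best = int(torch.argmax(scores))
                spans[best:best + 2] = [candidates[best]]
                merges.append(best)
            return spans[0], merges                   # root vector and merge order

    # Toy usage: random 64-dim vectors stand in for BERT-encoded EDUs.
    if __name__ == "__main__":
        torch.manual_seed(0)
        edus = [torch.randn(64) for _ in range(4)]
        root, merges = GreedyTreeBuilder(64)(edus)
        print(root.shape, merges)

    In an end-to-end parser, the same span vectors produced by such a composition cell could also feed the sense and center classifiers, but how the thesis actually couples the four sub-tasks is not specified in this record.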
    Appears in Collections: [Graduate Institute of Computer Science and Information Engineering] Electronic Thesis & Dissertation

    Files in This Item:

    File          Description    Size    Format
    index.html                   0Kb     HTML


    All items in NCUIR are protected by copyright, with all rights reserved.

