NCU Institutional Repository — theses and dissertations, past exam questions, journal articles, and research projects for download: Item 987654321/86529
RC Version 7.0 © Powered By DSPACE, MIT. Enhanced by NTU Library IR team.


    Please use the permanent URL to cite or link to this document: http://ir.lib.ncu.edu.tw/handle/987654321/86529


    Title: 構件聚焦多頭共同注意力網路在基於面向的情感分析; Aspect-based sentiment analysis with component focusing multi-head coattention networks
    Authors: 廖源昱; Liao, Yuan-Yu
    Contributors: Department of Information Management
    Keywords: Deep Learning; Neural Networks; Sentiment Analysis; BERT
    Date: 2021-06-29
    Date Uploaded: 2021-12-07 12:56:41 (UTC+8)
    Publisher: National Central University
    Abstract: Aspect-based Sentiment Analysis (ABSA) aims to predict the sentiment polarity of a specific target in a text. Most earlier work on this task used word embeddings encoded with RNNs; more recently, researchers have applied attention mechanisms to learn the relationship between the context and the target. However, multi-word targets and the use of average pooling remain problematic in many studies of this task. This paper proposes the Component Focusing Multi-head Coattention Networks (CF-MCAN) model, which contains three modules: extended context, component focusing, and multi-head coattention. The extended context allows BERT's capabilities to be better exploited on the ABSA task; component focusing increases the weights of adjectives and adverbs in the context, addressing the problem that average pooling treats every word as equally important; and the multi-head coattention network learns the important words in a multi-word target before learning the context representation, allowing one sequence to attend over another sequence. We compare against prior work on three datasets and demonstrate the effectiveness of the proposed model through experiments.
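The abstract describes coattention as attention applied between two sequences (context and target) rather than between a sequence and a pooled vector. As a rough illustration of that idea only — not the authors' CF-MCAN implementation, whose projections, masking, and fusion steps are not specified here — the following NumPy sketch computes a per-head affinity matrix between a context and a target and attends in both directions; the function name and `num_heads` parameter are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_coattention(context, target, num_heads=2):
    """Hypothetical sketch of multi-head coattention.

    context: (n_ctx, d) array of context token embeddings
    target:  (n_tgt, d) array of target token embeddings
    Splits the embedding dimension into heads, builds a scaled
    affinity matrix per head, and attends sequence-to-sequence
    in both directions (context->target and target->context).
    """
    d = context.shape[-1]
    assert d % num_heads == 0, "embedding dim must divide evenly into heads"
    dh = d // num_heads
    ctx_out, tgt_out = [], []
    for h in range(num_heads):
        C = context[:, h * dh:(h + 1) * dh]   # (n_ctx, dh) head slice
        T = target[:, h * dh:(h + 1) * dh]    # (n_tgt, dh) head slice
        affinity = C @ T.T / np.sqrt(dh)      # (n_ctx, n_tgt) scaled scores
        # Each context token attends over the (possibly multi-word) target,
        # so target words are weighted rather than averaged equally.
        ctx_out.append(softmax(affinity, axis=1) @ T)
        # Each target token attends over the context in the other direction.
        tgt_out.append(softmax(affinity.T, axis=1) @ C)
    # Concatenate the per-head outputs back to the full embedding dim.
    return np.concatenate(ctx_out, axis=1), np.concatenate(tgt_out, axis=1)
```

Because the attention weights over a multi-word target come from the affinity matrix rather than a uniform average, important target words receive larger weights — the problem with average pooling that the abstract highlights.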
    Appears in Collections: [Graduate Institute of Information Management] Theses and Dissertations

    Files in This Item:

    File        Description  Size  Format  Views
    index.html               0Kb   HTML    62


    All items in NCUIR are protected by copyright, with all rights reserved.
