Master's/Doctoral Thesis 108423053 — Detail Record




Author: Yuan-Yu Liao (廖源昱)    Department: Department of Information Management
Title: Aspect-based sentiment analysis with component focusing multi-head coattention networks
Related theses:
★ A Study of Business Intelligence in the Retail Industry
★ Construction of an Anomaly Detection System for Landline Telephone Calls
★ Applying Data Mining to the Analysis of School Grades and GSAT Results: A Vocational High School Food and Beverage Management Program Case
★ Using Data Mining to Improve Wealth Management Effectiveness: A Case Bank Study
★ Evaluation and Analysis of Wafer Fabrication Yield Models: A Domestic DRAM Fab Case
★ A Study of Business Intelligence Analysis Applied to Student Grades
★ Building a Prediction Model of Academic Achievement for Upper-Grade Elementary School Students with Data Mining
★ Building a Motorcycle Loan Risk Assessment Model with Data Mining: A Company A Case Study
★ Performance Indicator Evaluation for Improving R&D Design Quality Assurance
★ Improving Hiring Quality with Machine Learning Based on Text Résumés and Personality Traits
★ A General Framework Based on Relation-Oriented Genetic Algorithms for Set Partitioning Problems with Constraint Handling
★ Generalized Knowledge Mining in Relational Databases
★ Decision Tree Construction Considering Attribute-Value Acquisition Latency
★ Finding Preference Graphs from Sequence Data: Applied to Group Ranking Problems
★ Using Partitional Clustering to Find Consensus Groups for Group Decision Problems
★ A Novel Ordered Consensus Group Method for Group Decision Problems
  1. The author has agreed to make this electronic thesis openly available immediately.
  2. The authorized full text may be used only for personal, non-profit academic research: searching, reading, and printing.
  3. Please comply with the Copyright Act of the Republic of China (Taiwan); do not reproduce, distribute, adapt, repost, or broadcast the work without authorization.

Abstract: The purpose of Aspect-based Sentiment Analysis (ABSA) is to predict the sentiment polarity of a specific target in a text. Most earlier research used word embeddings encoded with an RNN; in recent years, researchers have begun to learn the relationship between the context and the target with attention mechanisms, but multi-word targets and the use of average pooling cause problems in many studies of this task. This paper proposes the Component Focusing Multi-head Coattention Networks (CF-MCAN) model, which contains three modules (extended context, component focusing, and multi-head coattention) to address these problems. The extended context lets BERT's capability be better exploited on the ABSA task. Component focusing raises the weights of adjectives and adverbs in the context, remedying the earlier practice of average pooling, which treats every word as equally important. The multi-head coattention network learns the important words in a multi-word target before learning the context representation, and allows one sequence to attend directly to another sequence. Comparisons with prior work on three datasets demonstrate the effectiveness of the proposed model through experiments.
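The two mechanisms the abstract contrasts with average pooling can be illustrated with a minimal numpy sketch. This is not the thesis's actual implementation: the function names, the fixed boost weight, and the single-head simplification of the multi-head coattention are all illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def component_focus_pool(H, pos_tags, focus=("ADJ", "ADV"), boost=2.0):
    """Weighted pooling over token vectors H (n, d): tokens whose POS tag
    is in `focus` (adjectives/adverbs) get `boost` weight, others get 1,
    instead of treating every token as equally important."""
    w = np.array([boost if t in focus else 1.0 for t in pos_tags])
    w = w / w.sum()          # normalize so the weights sum to 1
    return w @ H             # (d,) pooled representation

def coattention(C, T):
    """Single-head coattention between a context C (n, d) and a
    (possibly multi-word) target T (m, d) via one affinity matrix."""
    A = C @ T.T                        # (n, m) token-pair affinities
    ctx2tgt = softmax(A, axis=1) @ T   # each context token attends to the target
    tgt2ctx = softmax(A.T, axis=1) @ C # each target token attends to the context
    return ctx2tgt, tgt2ctx
```

A multi-head version would split the d-dimensional vectors into h slices, run the same coattention per slice, and concatenate the results; the sketch keeps one head to show only the sequence-to-sequence attention the abstract describes.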
Keywords:
★ deep learning
★ neural networks
★ sentiment analysis
★ BERT
Table of contents:
Abstract (Chinese)
Abstract (English)
List of Figures
List of Tables
1. Introduction
1-1 Contextual word embedding does not highlight the target information
1-2 Multi-word target issues and ignoring too much information
1-3 Context representation does not focus on important sentiment words
1-4 Contribution
2. Literature review
2-1 Literature review
2-1-1 Hand-crafted features
2-1-2 Word embedding
2-1-3 BERT
2-2 Research background
2-2-1 Component focusing
2-2-2 Coattention
2-2-3 Multi-head attention
3. Methodology
3-1 Task definition and notation
3-2 An overview of CF-MCAN
3-3 Extraction layer
3-3-1 Component focusing
3-3-2 Extended context
3-4 Embedding layer
3-5 Multi-head coattention layer
3-5-1 Multi-head attention (MHA)
3-5-2 Coattention
3-6 Sentiment classifier
4. Experiments
4-1 Dataset
4-2 Evaluation metric and parameters
4-3 Model comparisons
4-4 Experimental result
5. Conclusion and future work
References
Advisor: Yen-Liang Chen (陳彥良)    Date of approval: 2021-6-29
