    Please use this permanent URL to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/95731


    Title: Contextual Embeddings Enhanced Heterogeneous Graph Attention Networks for Multi-Label Classification of Psychological Counseling Texts (上下文嵌入增強異質圖注意力網路模型於心理諮詢文本多標籤分類)
    Authors: Tzeng, Yu-Wen (曾郁雯)
    Contributors: Department of Electrical Engineering
    Keywords: multi-label text classification; heterogeneous graph; graph attention networks; contextual embeddings; psychological counseling
    Date: 2024-07-26
    Date Uploaded: 2024-10-09 17:13:27 (UTC+8)
    Publisher: National Central University
    Abstract: Multi-Label Text Classification (MLTC) assigns one or more pre-defined labels to each text. Because labels carry implicit inter-dependencies that are difficult to exploit fully, existing methods often perform poorly on this task. This study combines Graph Neural Networks (GNNs) with a Transformer model, using a heterogeneous graph to encode the relationships between text words and labels. Leveraging the structural learning capability of GNNs and the self-attention mechanism of Transformers, we propose the Contextual Embeddings Enhanced Heterogeneous Graph Attention Networks (CE-HeterGAT) model, which strengthens text feature representations and improves multi-label classification performance. We construct a heterogeneous graph comprising content-word nodes, label nodes, and a virtual node, connected by five edge types: 1) sequential relationships between content words; 2) dependency-syntax relationships between content words; 3) semantic relationships between content words and label words; 4) co-occurrence relationships between label words; and 5) edges between the virtual node and all content words. Graph attention networks then learn node representations over this graph, while BERT (Bidirectional Encoder Representations from Transformers) captures the contextual relationships within the text. Finally, a cross-attention decoder of our design fuses the two feature sets into a document-level representation and predicts the final labels.
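    The graph-attention update at the core of the architecture described above can be illustrated with a small stdlib-only sketch. This is not the thesis implementation: the single-head attention scoring, the toy node features, and the merged edge list are all simplifying assumptions made here for illustration, following the general GAT recipe (LeakyReLU attention scores softmax-normalized over each node's neighborhood).

    ```python
    import math

    def gat_layer(h, edges, a_src=0.1, a_dst=0.1):
        """One simplified graph-attention update (single head, no learned
        linear transform): each node aggregates its neighbors' features,
        weighted by a softmax over LeakyReLU attention scores, as in GAT."""
        def leaky_relu(x, slope=0.2):
            return x if x > 0 else slope * x

        n, d = len(h), len(h[0])
        out = [[0.0] * d for _ in range(n)]
        # neighbors[i] = source nodes j with an edge j -> i, plus a self-loop
        neighbors = {i: [i] for i in range(n)}
        for j, i in edges:
            neighbors[i].append(j)
        for i in range(n):
            # unnormalized attention score for every neighbor j of node i
            scores = [leaky_relu(a_dst * sum(h[i]) + a_src * sum(h[j]))
                      for j in neighbors[i]]
            m = max(scores)                       # numerically stable softmax
            weights = [math.exp(s - m) for s in scores]
            z = sum(weights)
            for j, w in zip(neighbors[i], weights):
                for k in range(d):
                    out[i][k] += (w / z) * h[j][k]
        return out

    # Toy heterogeneous graph: nodes 0-2 are content words, node 3 a label
    # word, node 4 the virtual node; the five edge types are merged into one
    # edge list here for brevity.
    h = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5], [0.0, 0.0]]
    edges = [(0, 1), (1, 2), (0, 3), (4, 0), (4, 1), (4, 2)]
    h2 = gat_layer(h, edges)
    ```

    Because each output row is a convex combination of its neighbors' features, the updated representations stay within the range of the inputs; stacking such layers (and, in CE-HeterGAT, fusing them with BERT embeddings) is what propagates label-word information into content-word representations.
    
    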
    We built two Chinese multi-label datasets by collecting 4,473 online psychological counseling posts and manually annotating their topics and events: the Psycho-MLTopic dataset with 11 topic labels and the Psycho-MLEvent dataset with 52 event labels. Experimental results and performance evaluations show that CE-HeterGAT outperforms related models (TextCNN, Bi-LSTM, BERT, GCN, GAT, TextGCN, SAT, UGformer, Exphormers), with especially significant gains in Macro-F1 score, demonstrating that a heterogeneous graph structure combined with contextual information effectively improves multi-label text classification.
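    Macro-F1, the metric on which the abstract reports the largest gains, averages per-label F1 scores so that rare labels count as much as frequent ones, which matters for a long-tailed 52-label event set. A minimal sketch of the standard definition (the toy data below is invented for illustration):

    ```python
    def macro_f1(y_true, y_pred, n_labels):
        """Macro-averaged F1 for multi-label predictions.

        y_true, y_pred: lists of label-id sets, one set per sample.
        Each label's F1 is computed independently, then averaged, so
        infrequent labels weigh as much as frequent ones."""
        f1s = []
        for lbl in range(n_labels):
            tp = sum(1 for t, p in zip(y_true, y_pred) if lbl in t and lbl in p)
            fp = sum(1 for t, p in zip(y_true, y_pred) if lbl not in t and lbl in p)
            fn = sum(1 for t, p in zip(y_true, y_pred) if lbl in t and lbl not in p)
            denom = 2 * tp + fp + fn
            f1s.append(2 * tp / denom if denom else 0.0)
        return sum(f1s) / n_labels

    # Toy example: 3 labels over 4 samples.
    y_true = [{0, 1}, {1}, {2}, {0, 2}]
    y_pred = [{0}, {1}, {1, 2}, {0, 2}]
    print(round(macro_f1(y_true, y_pred, 3), 3))  # → 0.833
    ```

    A micro-averaged F1 over the same predictions would instead pool all true/false positives before computing a single score, which down-weights mistakes on rare labels.
    
    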
    Appears in Collections: [Graduate Institute of Electrical Engineering] Master's and Doctoral Theses

    All items in NCUIR are protected by copyright, with all rights reserved.
