Master's/Doctoral Thesis 110423045: Complete Metadata Record

DC Field | Value | Language
dc.contributor | Department of Information Management (資訊管理學系) | zh_TW
dc.creator | 李彥瑾 | zh_TW
dc.creator | Yan-Jin Lee | en_US
dc.date.accessioned | 2023-07-24T07:39:07Z
dc.date.available | 2023-07-24T07:39:07Z
dc.date.issued | 2023
dc.identifier.uri | http://ir.lib.ncu.edu.tw:88/thesis/view_etd.asp?URN=110423045
dc.contributor.department | Department of Information Management (資訊管理學系) | zh_TW
dc.description | National Central University (國立中央大學) | zh_TW
dc.description | National Central University | en_US
dc.description.abstract | Sentence embeddings play an important role in natural language understanding (NLU). Pretrained models such as BERT and RoBERTa convert raw sentences into sentence embeddings, and applying these embeddings yields significant gains on many NLU tasks; on the semantic textual similarity (STS) task, however, their performance falls short of expectations. Previous research found that BERT and RoBERTa are sensitive to word order. We therefore propose StructCSE, a method that combines learning word order with semantic information to enhance contrastive learning of sentence embeddings. In our experiments, we use STS and transfer learning tasks to validate the effectiveness of StructCSE. The results show that StructCSE is competitive with or outperforms the baseline models on most datasets; with BERT as the base model, it improves markedly on the STS tasks and performs outstandingly on the sentiment analysis subtasks of transfer learning. | zh_TW
dc.description.abstract | Sentence embeddings play an important role in natural language understanding (NLU). Pretrained models such as BERT and RoBERTa encode input sentences into embeddings that significantly improve performance across multiple NLU tasks, yet they underperform on semantic textual similarity (STS). Previous research has found that BERT and RoBERTa are sensitive to word order. In response, we propose StructCSE, a method that incorporates word order and semantic information to enhance contrastive learning of sentence embeddings. In our experiments, we evaluate the effectiveness of StructCSE on STS and transfer learning tasks. The results show that StructCSE performs competitively with or outperforms baseline models on most datasets. In particular, when fine-tuning BERT, StructCSE achieves better performance on the STS tasks and exhibits outstanding performance on the sentiment analysis subtasks of transfer learning (see the sketch following this record). | en_US
dc.subject | contrastive learning (對比學習) | zh_TW
dc.subject | sentence embeddings (句子嵌入) | zh_TW
dc.subject | natural language understanding (自然語言理解) | zh_TW
dc.subject | deep learning (深度學習) | zh_TW
dc.subject | contrastive learning | en_US
dc.subject | sentence embeddings | en_US
dc.subject | natural language understanding | en_US
dc.subject | deep neural network | en_US
dc.title | Incorporating Word Ordering Information into Contrastive Learning of Sentence Embeddings | en_US
dc.language.iso | en_US | en_US
dc.type | thesis (博碩士論文) | zh_TW
dc.type | thesis | en_US
dc.publisher | National Central University | en_US
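The abstract above does not spell out the StructCSE training objective. As a rough illustration of the family of techniques it builds on, the following is a minimal PyTorch sketch of contrastive learning of sentence embeddings in which a word-order-shuffled copy of each sentence serves as a hard negative. The SimCSE-style dropout positive, the mean pooling, the InfoNCE-style loss, and the helper names (embed, shuffle_words, contrastive_loss) are all illustrative assumptions, not the method actually defined in the thesis.

    # Illustrative sketch only; not the thesis's actual StructCSE objective.
    import random
    import torch
    import torch.nn.functional as F
    from transformers import AutoTokenizer, AutoModel

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    encoder = AutoModel.from_pretrained("bert-base-uncased")
    encoder.train()  # keep dropout active so two passes give two "views" of a sentence

    def embed(sentences):
        # Encode a batch and mean-pool the last hidden states over real tokens.
        batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
        hidden = encoder(**batch).last_hidden_state           # (B, T, H)
        mask = batch["attention_mask"].unsqueeze(-1)          # (B, T, 1)
        return (hidden * mask).sum(1) / mask.sum(1)           # (B, H)

    def shuffle_words(sentence):
        # Word-order hard negative: identical bag of words, permuted order.
        words = sentence.split()
        random.shuffle(words)
        return " ".join(words)

    def contrastive_loss(sentences, temperature=0.05):
        # Two dropout-noised encodings of the same sentence form the positive
        # pair (as in SimCSE); the shuffled sentence differs from the anchor
        # only in word order, so treating it as a negative pushes the encoder
        # to carry order information rather than a bag-of-words signal.
        z1 = embed(sentences)
        z2 = embed(sentences)
        zn = embed([shuffle_words(s) for s in sentences])
        pos = F.cosine_similarity(z1.unsqueeze(1), z2.unsqueeze(0), dim=-1)  # (B, B)
        neg = F.cosine_similarity(z1.unsqueeze(1), zn.unsqueeze(0), dim=-1)  # (B, B)
        logits = torch.cat([pos, neg], dim=1) / temperature                  # (B, 2B)
        labels = torch.arange(len(sentences))  # positives sit on the diagonal
        return F.cross_entropy(logits, labels)

    loss = contrastive_loss(["a man is playing guitar", "two dogs run in a park"])
    loss.backward()

Because shuffling preserves the bag of words while destroying their order, any similarity gap between the positive and shuffled views must come from order information; this mirrors the word-order sensitivity finding the abstract cites, though the loss actually used in the thesis may differ.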
