NCU Institutional Repository: Item 987654321/93251


    Please use this identifier to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/93251


    Title: Incorporating Word Ordering Information into Contrastive Learning of Sentence Embeddings
    Authors: 李彥瑾;Lee, Yan-Jin
    Contributors: Department of Information Management (資訊管理學系)
    Keywords: contrastive learning; sentence embeddings; natural language understanding; deep learning; deep neural network
    Date: 2023-07-24
    Issue Date: 2024-09-19 16:50:44 (UTC+8)
    Publisher: National Central University (國立中央大學)
    Abstract: Sentence embeddings play an important role in natural language understanding (NLU). Pretrained models such as BERT and RoBERTa encode input sentences into embeddings, and applying these embeddings yields significant performance gains across multiple NLU tasks; however, they underperform on semantic textual similarity (STS) tasks. Previous research has found that BERT and RoBERTa are sensitive to word order. In response, we propose StructCSE, a method that jointly learns word order and semantic information to enhance contrastive learning of sentence embeddings.
    In our experiments, we evaluate the effectiveness of StructCSE on STS and transfer learning tasks. The results demonstrate that StructCSE performs competitively with or outperforms baseline models on most datasets. In particular, with BERT as the base model, StructCSE achieves a marked improvement on the STS tasks and performs especially well on the sentiment analysis subtasks of transfer learning.
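    The thesis itself is not attached to this record, but the approach described in the abstract can be illustrated with a minimal sketch in PyTorch: a SimCSE-style contrastive (InfoNCE) objective in which word-shuffled copies of each sentence serve as hard negatives, so the encoder must separate sentences that share vocabulary but differ in word order. The helper names (shuffle_words, info_nce_with_order_negatives) and the shuffling augmentation are illustrative assumptions, not the actual StructCSE implementation.

    import random
    import torch
    import torch.nn.functional as F

    def shuffle_words(sentence):
        # Hypothetical augmentation: permute word order to build a hard negative
        # that keeps the vocabulary but destroys the original structure.
        words = sentence.split()
        random.shuffle(words)
        return " ".join(words)

    def info_nce_with_order_negatives(anchor, positive, order_neg, temperature=0.05):
        # anchor, positive, order_neg: (batch, dim) sentence embeddings.
        # positive = a second encoding of the same sentence (e.g. a second
        # dropout pass, as in SimCSE); order_neg = embedding of the shuffled copy.
        a = F.normalize(anchor, dim=-1)
        p = F.normalize(positive, dim=-1)
        n = F.normalize(order_neg, dim=-1)
        sim_pos = a @ p.T / temperature                    # (batch, batch) in-batch pairs
        sim_neg = a @ n.T / temperature                    # (batch, batch) order negatives
        logits = torch.cat([sim_pos, sim_neg], dim=1)      # (batch, 2*batch)
        labels = torch.arange(a.size(0), device=a.device)  # diagonal = true positive
        return F.cross_entropy(logits, labels)

    # Toy usage: random vectors stand in for encoder outputs such as BERT's [CLS].
    print(shuffle_words("the cat sat on the mat"))
    batch, dim = 8, 768
    loss = info_nce_with_order_negatives(
        torch.randn(batch, dim), torch.randn(batch, dim), torch.randn(batch, dim))
    print(loss.item())

    In a real pipeline the three embedding matrices would come from an encoder such as bert-base-uncased, and the word-order signal could equally be injected through an auxiliary ordering loss; the abstract does not specify which design StructCSE uses.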
    Appears in Collections: [Graduate Institute of Information Management] Electronic Thesis & Dissertation

    Files in This Item:

    File: index.html | Size: 0 KB | Format: HTML


    All items in NCUIR are protected by copyright, with all rights reserved.
