

Please use this permanent URL to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/93226


Title: History Aware Multi-Stage Prompting for Neural Chat Translation
Authors: 林佑錡; Lin, Yu-Chi
Contributors: Department of Information Management
Keywords: neural chat translation; machine translation; prompt tuning; deep learning
Date: 2023-07-20
Upload time: 2024-09-19 16:49:43 (UTC+8)
Publisher: National Central University
Abstract: Neural Chat Translation (NCT) is an emerging task in the field of machine translation. Unlike Neural Machine Translation (NMT), NCT also involves multi-turn dialogue, making it a challenging two-in-one task. Prior work has addressed it with context-aware models augmented by auxiliary tasks, but often at a high training cost.
As the cost of fine-tuning pre-trained language models continues to rise, prompt tuning has emerged as a parameter-efficient alternative with performance comparable to full fine-tuning. Prompt tuning has recently been applied to machine translation, but existing approaches consider only sentence-level translation and cannot effectively account for the conversational content that is central to NCT. In this study, we therefore propose a new prompt tuning method for this task, History Aware Multi-Stage Prompting (HAMSP), which incorporates chat-history information into the prompts to guide the pre-trained language model toward translations that are consistent with the conversational context.
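The record gives no implementation details beyond the abstract, but as a rough sketch of the general idea (not the authors' implementation), the snippet below shows one way a pooled representation of the chat history could be mapped to soft prompt vectors and prepended to the input embeddings of a frozen translation model. The module name HistoryAwarePrompt, the two-layer projection, the prompt length, and all dimensions are illustrative assumptions.

import torch
import torch.nn as nn

class HistoryAwarePrompt(nn.Module):
    """Maps a pooled encoding of the chat history to a sequence of soft
    prompt vectors to prepend to a frozen translation model's input.
    Sizes and the two-layer projection are illustrative assumptions."""

    def __init__(self, d_model: int = 768, prompt_len: int = 16):
        super().__init__()
        self.prompt_len = prompt_len
        # Project one pooled history vector into prompt_len prompt vectors.
        self.proj = nn.Sequential(
            nn.Linear(d_model, d_model),
            nn.Tanh(),
            nn.Linear(d_model, prompt_len * d_model),
        )

    def forward(self, history_repr: torch.Tensor) -> torch.Tensor:
        # history_repr: (batch, d_model), e.g. mean-pooled encoder states
        # over the previous dialogue turns.
        batch = history_repr.size(0)
        prompts = self.proj(history_repr)
        return prompts.view(batch, self.prompt_len, -1)

# Toy usage: prepend history-aware prompts to source token embeddings.
d_model, prompt_len = 768, 16
prompt_gen = HistoryAwarePrompt(d_model, prompt_len)

history_repr = torch.randn(2, d_model)         # stand-in for encoded history
src_embeds   = torch.randn(2, 30, d_model)     # stand-in for source embeddings

prompts = prompt_gen(history_repr)             # (2, 16, 768)
inputs  = torch.cat([prompts, src_embeds], 1)  # (2, 46, 768), fed to frozen PLM
print(inputs.shape)

In a setup like this, only the prompt generator would be trained while the pre-trained model stays frozen, which is consistent with the parameter-efficiency claim in the abstract.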
Our experiments show that the proposed HAMSP outperforms the baseline methods and is competitive with fine-tuning. Further intrinsic evaluation shows that our method is more robust and effectively improves the dialogue coherence of translations; it also improves training efficiency and reduces hardware costs, giving it the potential for broad application to real-world chat systems.
Appears in Collections: [Graduate Institute of Information Management] Master's and Doctoral Theses

Files in This Item:

File          Description    Size    Format    Views
index.html                   0Kb     HTML      15       View/Open


All items in NCUIR are protected by copyright, with all rights reserved.

