NCU Institutional Repository — theses, past exam papers, journal articles, and research projects: Item 987654321/93226


    Please use this identifier to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/93226


    Title: History Aware Multi-Stage Prompting for Neural Chat Translation
    Authors: 林佑錡;Lin, Yu-Chi
    Contributors: Graduate Institute of Information Management
    Keywords: neural chat translation; machine translation; prompt tuning; deep learning
    Date: 2023-07-20
    Issue Date: 2024-09-19 16:49:43 (UTC+8)
    Publisher: National Central University
    Abstract: Neural Chat Translation (NCT) is an emerging task in the field of machine translation. Unlike Neural Machine Translation (NMT), NCT also involves multi-turn conversations, making it a challenging two-in-one task. Previous research has addressed it with context-aware models augmented by auxiliary tasks, but often at a high training cost.
    As the cost of fine-tuning pre-trained language models continues to rise, prompt tuning has emerged as a promising alternative: it is parameter-efficient while achieving performance comparable to full fine-tuning. Prompt tuning has recently been applied to machine translation, but existing approaches consider only sentence-level translation and cannot incorporate the conversational content that is crucial to neural chat translation. In this study, we therefore propose a new prompt tuning method called History Aware Multi-Stage Prompting (HAMSP). By incorporating information from the chat history into the prompts, we guide the pre-trained language model to generate translations consistent with the conversational context.
    Our experimental results demonstrate that the proposed HAMSP outperforms the baseline methods and is competitive with fine-tuning. Through further intrinsic evaluation, we show that our method is more robust and effectively improves the dialogue coherence of translations. It also improves training efficiency and reduces hardware costs, giving it the potential for broad application to real-world chat systems.
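    The core idea described above — injecting chat-history information into the prompt so the model translates in context — can be illustrated with a minimal sketch. Note that HAMSP itself uses learned (soft) prompts inside a pre-trained language model; the textual prompt builder below, including the function name and staging scheme, is a hypothetical simplification for illustration only, not the thesis's actual implementation.

    ```python
    # Illustrative sketch only: HAMSP operates on learned prompt vectors, not
    # literal text. This hypothetical helper mimics the idea with a textual
    # prompt assembled in stages: task instruction, chat history, then the
    # current source utterance.

    def build_history_aware_prompt(history, source,
                                   src_lang="English", tgt_lang="Chinese"):
        """Compose a multi-stage prompt from optional chat history."""
        stages = [f"Translate the following {src_lang} chat message into {tgt_lang}."]
        if history:
            context = "\n".join(f"- {turn}" for turn in history)
            stages.append(f"Conversation so far:\n{context}")
        stages.append(f"Message: {source}")
        return "\n\n".join(stages)

    prompt = build_history_aware_prompt(
        history=["A: Are you free tonight?", "B: Yes, why?"],
        source="Let's grab dinner then.",
    )
    print(prompt)
    ```

    The history stage is what distinguishes a chat-aware prompt from an ordinary sentence-level one: without it, the model has no way to resolve context-dependent words such as "then" in the example above.
    
    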
    Appears in Collections:[Graduate Institute of Information Management] Electronic Thesis & Dissertation

    Files in This Item:

    File: index.html — 0 Kb, HTML (View/Open)


    All items in NCUIR are protected by copyright, with all rights reserved.

