Graduate Thesis 110423068: Complete Metadata Record

DC Field  Value  Language
dc.contributor  資訊管理學系  zh_TW
dc.creator  林佑錡  zh_TW
dc.creator  Yu-Chi Lin  en_US
dc.date.accessioned  2023-07-20T07:39:07Z
dc.date.available  2023-07-20T07:39:07Z
dc.date.issued  2023
dc.identifier.uri  http://ir.lib.ncu.edu.tw:88/thesis/view_etd.asp?URN=110423068
dc.contributor.department  資訊管理學系  zh_TW
dc.description  國立中央大學  zh_TW
dc.description  National Central University  en_US
dc.description.abstract  神經網路聊天翻譯 (Neural Chat Translation, NCT) 是近年於機器翻譯領域中興起的任務,與神經機器翻譯 (Neural Machine Translation, NMT) 不同的是,神經網路聊天翻譯還涉及了多輪對話,因此是一項相當具挑戰性的二合一任務。雖然先前已有研究使用上下文感知模型,並加入不同的輔助任務來解決此任務,但往往需要很高的訓練成本。在微調預訓練語言模型的成本逐漸提升下,提示 (Prompt) 調整開始興起,該方法具備參數效率,且表現可與微調預訓練語言模型相比。近來此方法已被應用至機器翻譯領域,但仍僅考慮句子級別的翻譯,無法有效將神經網路聊天翻譯任務重視的聊天內容納入考量。因此在本研究中,我們為這項任務提出一個新的提示調整方法,稱為 History Aware Multi-stage Prompting (HAMSP),透過將聊天歷史內容資訊納入提示,以引導預訓練語言模型生成與對話情境一致的翻譯結果。在實驗結果中,我們展示了所提出的 HAMSP 相較於基準方法達到更好的表現,並且能夠與微調方法相互抗衡。透過進一步的內在評估,我們說明了我們的方法更加穩健,能夠有效提升翻譯結果的對話連貫性,並可提升訓練效率與降低硬體成本,具備廣泛應用至真實世界中不同聊天系統之潛力。  zh_TW
dc.description.abstract  Neural Chat Translation (NCT) is an emerging task in the field of machine translation. Unlike Neural Machine Translation (NMT), NCT involves multi-turn conversations, making it a challenging two-in-one task. Previous research has explored context-aware models and auxiliary tasks to address it, but often at a high training cost. As the cost of fine-tuning pre-trained language models continues to rise, prompt tuning has emerged as a promising alternative: it is parameter-efficient while achieving performance comparable to fine-tuning. Recently, prompt tuning has been applied to machine translation, but existing work considers only sentence-level translation and does not incorporate the conversational content that is crucial to NCT. In this study, we therefore present a new prompt tuning method called History Aware Multi-Stage Prompting (HAMSP). By incorporating information from the chat history into the prompts, we guide the pre-trained language model to generate translations that are consistent with the conversational context. Our experiments demonstrate that HAMSP outperforms the baseline methods and competes with fine-tuning. Further intrinsic evaluation shows that our method is more robust and enhances the dialogue coherence of translations. Additionally, it improves training efficiency and reduces hardware costs, making it suitable for a wide range of real-world chat systems.  en_US
dc.subject  神經網路聊天翻譯  zh_TW
dc.subject  機器翻譯  zh_TW
dc.subject  提示調整  zh_TW
dc.subject  深度學習  zh_TW
dc.subject  neural chat translation  en_US
dc.subject  machine translation  en_US
dc.subject  prompt tuning  en_US
dc.subject  deep learning  en_US
dc.title  History Aware Multi-Stage Prompting for Neural Chat Translation  en_US
dc.language.iso  en_US  en_US
dc.type  博碩士論文  zh_TW
dc.type  thesis  en_US
dc.publisher  National Central University  en_US
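
The abstract above describes guiding a frozen pre-trained language model with prompts that carry chat-history information. The Python sketch below illustrates one way such a mechanism could look. It is a minimal, assumption-laden sketch, not the thesis's HAMSP implementation: the HistoryAwarePrompt module, the toy vocabulary size, the mean-pooled history representation, and the linear conditioning layer are all assumptions made for illustration, and only a single prompt-injection point is shown.

# Minimal illustrative sketch of history-aware soft-prompt tuning, based only
# on the abstract above. NOT the thesis implementation: module names, sizes,
# and the conditioning scheme are assumptions for illustration.
import torch
import torch.nn as nn

class HistoryAwarePrompt(nn.Module):
    """Maps an encoding of the chat history to soft prompt vectors that are
    prepended to a frozen translation model's input embeddings."""
    def __init__(self, d_model: int, prompt_len: int):
        super().__init__()
        # Learnable base prompts, shifted by a pooled history representation.
        self.base_prompt = nn.Parameter(torch.randn(prompt_len, d_model) * 0.02)
        self.condition = nn.Linear(d_model, d_model)

    def forward(self, history_states: torch.Tensor) -> torch.Tensor:
        # history_states: (batch, hist_len, d_model) from any frozen encoder.
        pooled = history_states.mean(dim=1)           # (batch, d_model)
        shift = self.condition(pooled).unsqueeze(1)   # (batch, 1, d_model)
        return self.base_prompt.unsqueeze(0) + shift  # (batch, prompt_len, d_model)

d_model, prompt_len = 512, 16
embed = nn.Embedding(32000, d_model)   # stand-in for the frozen PLM's embeddings
prompter = HistoryAwarePrompt(d_model, prompt_len)

# Toy batch: token ids for the chat history and the current source utterance.
history_ids = torch.randint(0, 32000, (2, 40))
source_ids = torch.randint(0, 32000, (2, 12))

with torch.no_grad():                  # the pre-trained side stays frozen
    history_states = embed(history_ids)
    source_embeds = embed(source_ids)

prompts = prompter(history_states)     # only the prompter's parameters train
encoder_inputs = torch.cat([prompts, source_embeds], dim=1)
print(encoder_inputs.shape)            # torch.Size([2, 28, 512])

A full multi-stage variant would presumably inject such prompts at several points in the model (for example at the embedding, encoder, and decoder stages) rather than only before the encoder input, training them against the translation loss while the pre-trained model remains frozen.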
