Thesis/Dissertation 109423012: Full Metadata Record

DC Field                     Value    Language
dc.contributor               資訊管理學系 (Department of Information Management)    zh_TW
dc.creator                   黃智輝    zh_TW
dc.creator                   Chih-Hui Huang    en_US
dc.date.accessioned          2022-07-21T07:39:07Z
dc.date.available            2022-07-21T07:39:07Z
dc.date.issued               2022
dc.identifier.uri            http://ir.lib.ncu.edu.tw:88/thesis/view_etd.asp?URN=109423012
dc.contributor.department    資訊管理學系 (Department of Information Management)    zh_TW
dc.description               國立中央大學 (National Central University)    zh_TW
dc.description               National Central University    en_US
dc.description.abstract      Natural Language Processing (NLP) has made great progress: where it once translated small units such as individual words, it can now form an integrated understanding of an entire article and grasp the meaning within its sentences. The most common approach to dialogue generation today is the sequence-to-sequence (Seq2Seq) model, which takes a single user question as input and generates the best response to that sentence. Observing the complex person-to-person conversations of real life, however, one rarely finds dialogue conducted as independent single-sentence question-and-answer pairs. Most conversations take a multi-turn form: when responders formulate a reply, they attend not only to the questioner's current question but also to the questioner's earlier questions and to their own earlier replies. At the same time, dialogue generation should be more customized: depending on the topic of the current conversation and the characteristics of the questioner, the model should produce different dialogue to satisfy different users and different topics. This work addresses these shortcomings by proposing a hybrid hierarchical attention mechanism to better exploit the information in multi-turn dialogue. It also proposes a way to customize the generated sentences on top of self-attention. Experiments show that this approach effectively improves such tasks and contributes to dialogue generation.    zh_TW
dc.description.abstract      Natural Language Processing (NLP) has made great progress. Where it could once handle only word-level translation, it can now integrate an entire article and understand the meaning of its sentences. The most common dialogue generation technique is the sequence-to-sequence (Seq2Seq) model, which generates the best response to a single user input. In reality, however, most human dialogue consists of multi-turn questions and responses rather than a single question-and-answer pair. A person who wants to respond appropriately focuses not only on the last question but on the whole scenario, including previous turns of the conversation; dialogue generation can therefore be more complete when it incorporates the utterance history. Responses should also be more customized: given the current dialogue topic and the characteristics of the questioner, the model should produce different responses for different users and topics, even for the same question. To meet these goals, our research proposes a hybrid hierarchical attention mechanism to improve multi-turn dialogue generation. Furthermore, we propose a method to customize the generated responses based on the self-attention mechanism. Our experiments show that this approach effectively improves dialogue generation.    en_US
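The abstract describes the model only at a high level. As a rough illustration, the following is a minimal PyTorch sketch, not the thesis's actual implementation, of the two ideas it names: a hierarchical encoder that applies self-attention at the word level within each utterance and again at the turn level across utterance summaries, plus a hypothetical user embedding standing in for the customization signal. All module names, layer sizes, and the mean-pooling step are assumptions for illustration.

```python
# Minimal sketch (assumed architecture, not the thesis's implementation) of
# hierarchical multi-turn encoding with a customization embedding.
import torch
import torch.nn as nn


class HierarchicalDialogueEncoder(nn.Module):
    def __init__(self, vocab_size: int, num_users: int, d_model: int = 256, nhead: int = 4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Word-level self-attention: encodes the tokens inside each utterance.
        word_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.word_encoder = nn.TransformerEncoder(word_layer, num_layers=2)
        # Turn-level self-attention: attends across utterance summaries, so the
        # response can depend on earlier turns, not only the last question.
        turn_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.turn_encoder = nn.TransformerEncoder(turn_layer, num_layers=2)
        # Hypothetical customization signal: one learned embedding per user/topic.
        self.user_embed = nn.Embedding(num_users, d_model)

    def forward(self, turns: torch.Tensor, user_id: torch.Tensor) -> torch.Tensor:
        # turns: (batch, num_turns, seq_len) token ids; user_id: (batch,)
        b, t, s = turns.shape
        tokens = self.embed(turns.view(b * t, s))         # (b*t, s, d_model)
        words = self.word_encoder(tokens)                 # word-level pass
        summaries = words.mean(dim=1).view(b, t, -1)      # one vector per turn
        summaries = summaries + self.user_embed(user_id).unsqueeze(1)
        return self.turn_encoder(summaries)               # (b, t, d_model)


# The output would serve as the encoder memory of a standard Seq2Seq decoder
# (e.g., nn.TransformerDecoder) that generates the customized response.
enc = HierarchicalDialogueEncoder(vocab_size=10_000, num_users=100)
ctx = enc(torch.randint(0, 10_000, (2, 5, 12)), torch.tensor([3, 7]))
print(ctx.shape)  # torch.Size([2, 5, 256])
```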
dc.subject                   對話生成 (Dialogue Generation)    zh_TW
dc.subject                   自注意力機制 (Self-attention Mechanism)    zh_TW
dc.subject                   客製化系統 (Customized System)    zh_TW
dc.subject                   深度學習 (Deep Learning)    zh_TW
dc.subject                   Dialogue Generation    en_US
dc.subject                   Self-attention Mechanism    en_US
dc.subject                   Customized System    en_US
dc.subject                   Deep Learning    en_US
dc.title                     混合式階層多輪客製化對話生成模型 (Hybrid Hierarchical Multi-turn Customized Dialogue Generation Model)    zh_TW
dc.language.iso              zh-TW    zh-TW
dc.title                     Hybrid Hierarchical Transformer for Customized Multi-turn Dialogue Generation    en_US
dc.type                      博碩士論文 (Thesis/Dissertation)    zh_TW
dc.type                      thesis    en_US
dc.publisher                 National Central University    en_US
