    Please use this permanent URL to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/89829


    Title: Hybrid Hierarchical Transformer for Customized Multi-turn Dialogue Generation
    Authors: Huang, Chih-Hui (黃智輝)
    Contributors: Department of Information Management
    Keywords: Dialogue Generation; Self-attention Mechanism; Customized System; Deep Learning
    Date: 2022-07-21
    Uploaded: 2022-10-04 12:01:19 (UTC+8)
    Publisher: National Central University
    Abstract: Natural Language Processing (NLP) has made great progress: where early systems could translate only small lexical units, models can now integrate an entire article and understand the meaning of its sentences. The most common dialogue-generation approach is the sequence-to-sequence (Seq2Seq) model, which takes a single user question as input and generates the best response to that one sentence. However, real human conversation rarely consists of independent, single question-answer pairs. Most dialogue is multi-turn: when a speaker responds, they attend not only to the current question but also to the questioner's earlier turns and to their own previous answers. In other words, dialogue generation can be more complete if it incorporates information from prior utterances. In addition, responses should be more customized: depending on the topic of the current dialogue and the characteristics of the questioner, the model should produce different responses for different users and themes, even when the question is the same.
    To address these shortcomings, this research proposes a hybrid hierarchical attention mechanism to better exploit multi-turn dialogue information, together with a method for customizing the generated response on top of self-attention. Experiments show that this approach effectively improves dialogue generation.
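    The two-level idea the abstract describes — attend to tokens within each utterance, then attend across utterances of the dialogue — can be sketched as follows. This is a minimal illustration only, assuming single-head scaled dot-product attention with no learned projection matrices and mean-pooling as the per-utterance summary; the thesis's actual architecture and its customization features are not reproduced here.

    ```python
    import numpy as np

    def softmax(x, axis=-1):
        # Numerically stable softmax.
        e = np.exp(x - x.max(axis=axis, keepdims=True))
        return e / e.sum(axis=axis, keepdims=True)

    def self_attention(X):
        """Single-head scaled dot-product self-attention, with the input
        used directly as query, key, and value (no learned projections)."""
        d = X.shape[-1]
        scores = X @ X.T / np.sqrt(d)        # (T, T) token-token affinities
        return softmax(scores, axis=-1) @ X  # (T, d) contextualized tokens

    def hierarchical_encode(turns):
        """Two-level encoding of a multi-turn dialogue:
        1) token-level self-attention within each utterance,
        2) utterance-level self-attention across per-turn summaries."""
        # Level 1: mean-pool each self-attended utterance into one vector.
        summaries = np.stack([self_attention(t).mean(axis=0) for t in turns])
        # Level 2: let each turn attend to every other turn.
        return self_attention(summaries)     # (num_turns, d)

    # A toy 3-turn dialogue with 5, 3, and 7 token embeddings of width 8.
    rng = np.random.default_rng(0)
    dialogue = [rng.standard_normal((n, 8)) for n in (5, 3, 7)]
    context = hierarchical_encode(dialogue)
    print(context.shape)  # (3, 8): one context vector per turn
    ```

    In a full model, the resulting per-turn context vectors would condition a decoder that generates the response; the customization the thesis proposes would additionally inject user- or topic-specific features into the attention computation.
    
    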
    Appears in Collections: [Graduate Institute of Information Management] Master's and Doctoral Theses

    Files in This Item:

    File: index.html (0 KB, HTML)


    All items in NCUIR are protected by copyright.


    DSpace Software Copyright © 2002-2004 MIT & Hewlett-Packard / Enhanced by NTU Library IR team | Privacy Policy Statement