NCU Institutional Repository (中大機構典藏) - theses and dissertations, past exam papers, journal articles, and research projects: Item 987654321/89829
RC Version 7.0 © Powered By DSPACE, MIT. Enhanced by NTU Library IR team.


    Please use this identifier to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/89829


    Title: Hybrid Hierarchical Transformer for Customized Multi-turn Dialogue Generation (混合式階層多輪客製化對話生成模型)
    Authors: Huang, Chih-Hui (黃智輝)
    Contributors: Department of Information Management (資訊管理學系)
    Keywords: Dialogue Generation; Self-attention Mechanism; Customized System; Deep Learning
    Date: 2022-07-21
    Issue Date: 2022-10-04 12:01:19 (UTC+8)
    Publisher: National Central University (國立中央大學)
    Abstract: Natural Language Processing (NLP) has made great progress: where earlier systems could only translate small units such as individual words, current models can integrate an entire article and understand the meaning of its sentences. The most common dialogue-generation approach is the sequence-to-sequence (Seq2Seq) model, which takes a single user question as input and generates the best response to that sentence. In real human conversation, however, dialogue rarely consists of independent single-turn question-and-answer pairs. Most conversations are multi-turn: when a person responds, he or she attends not only to the current question but also to the questioner's earlier turns and to his or her own earlier replies, so dialogue generation should be more complete when it incorporates this utterance history. In addition, responses should be customized: given the same question, the model should respond differently depending on the topic of the conversation and the characteristics of the questioner, so as to serve different users and different dialogue themes.
    This research addresses these shortcomings by proposing a hybrid hierarchical attention mechanism to better exploit multi-turn dialogue information. It further proposes a method, built on self-attention, for customizing the generated responses. Experiments show that this approach effectively improves dialogue generation on such tasks.
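    The record does not include the thesis's actual architecture, but the two ideas in the abstract (hierarchical attention over dialogue turns, plus a user-feature vector for customization) can be sketched minimally in NumPy. Everything below is an illustrative assumption, not the author's implementation: the function names, the mean-pooling choices, and the decision to prepend the user vector as an extra "turn" are all hypothetical.

    ```python
    import numpy as np

    def self_attention(x):
        """Scaled dot-product self-attention over a sequence of vectors.
        x: (seq_len, d) array; returns (seq_len, d) where each position is a
        softmax-weighted mixture of all positions by dot-product similarity."""
        d = x.shape[-1]
        scores = x @ x.T / np.sqrt(d)                       # (seq_len, seq_len)
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)      # row-wise softmax
        return weights @ x

    def hierarchical_context(turns, user_vec):
        """Two-level (hierarchical) encoding of a multi-turn dialogue.
        turns: list of (n_tokens_i, d) arrays, one per turn (earlier questions
               and answers alike); user_vec: (d,) customization vector.
        Returns one (d,) context vector to condition response generation on."""
        # Level 1: token-level self-attention within each turn, mean-pooled
        # down to a single vector per turn.
        turn_vecs = [self_attention(t).mean(axis=0) for t in turns]
        # Customization: the user-feature vector joins the turn sequence, so
        # turn-level attention can weigh it against the dialogue history.
        seq = np.stack([user_vec] + turn_vecs)              # (n_turns + 1, d)
        # Level 2: turn-level self-attention across turns and the user vector.
        return self_attention(seq).mean(axis=0)

    rng = np.random.default_rng(0)
    turns = [rng.normal(size=(n, 16)) for n in (5, 3, 7)]   # 3 turns of tokens
    user = rng.normal(size=16)                              # user features
    ctx = hierarchical_context(turns, user)
    print(ctx.shape)                                        # (16,)
    ```

    In a full model, `ctx` would feed a decoder; here it only shows how the two attention levels compose so that the final response depends on the whole conversation and the user, not just the last question.
    
    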
    Appears in Collections:[Graduate Institute of Information Management] Electronic Thesis & Dissertation

    Files in This Item:

    File: index.html (0 Kb, HTML)


    All items in NCUIR are protected by copyright, with all rights reserved.

