NCU Institutional Repository: Item 987654321/95401


    Please use this identifier to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/95401


    Title: A Study of Cross-Domain Recommendation Based on Knowledge Distillation and Cross-Attention
    Authors: Lin, Fang-Yu (林方瑜)
    Contributors: Department of Information Management
    Keywords: Cross-Domain Recommendation System; Cross-Attention; Knowledge Distillation; Data Sparsity; Cold Start
    Date: 2024-03-14
    Issue Date: 2024-10-09 16:46:34 (UTC+8)
    Publisher: National Central University (國立中央大學)
    Abstract: With the growth of e-commerce, online shopping has become an indispensable part of modern life, and recommender systems play a crucial role in it. However, traditional recommender systems suffer from data sparsity and cold-start problems in domains with little data; cross-domain recommendation can effectively alleviate these problems. In this study, we propose two cross-domain recommendation models. Both use self-attention to dynamically capture user preferences in different domains, and they employ cross-attention and knowledge distillation, respectively, to transfer knowledge from the source domain to the target domain, addressing the problems traditional recommenders face in the target domain and improving recommendation quality there. Finally, we evaluate both models on real-world Amazon datasets. Compared with other state-of-the-art cross-domain recommenders, our two models achieve average improvements of 8.49% and 4.78% across three evaluation metrics and two dataset pairs.
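    The two transfer mechanisms named in the abstract can be illustrated in miniature. This is a minimal sketch, not the thesis's actual architecture: it assumes single-head scaled dot-product cross-attention in which target-domain queries attend over source-domain embeddings, and a standard temperature-scaled KL distillation loss between teacher and student score distributions; all weight matrices, dimensions, and function names here are illustrative.

    ```python
    import numpy as np

    def softmax(x, axis=-1):
        # Numerically stable softmax.
        e = np.exp(x - x.max(axis=axis, keepdims=True))
        return e / e.sum(axis=axis, keepdims=True)

    def cross_attention(target_seq, source_seq, d_k=16, seed=0):
        """Target-domain queries attend over source-domain item embeddings,
        pulling source-domain knowledge into the target representation."""
        rng = np.random.default_rng(seed)  # random projections, illustrative only
        W_q = rng.normal(size=(target_seq.shape[-1], d_k))
        W_k = rng.normal(size=(source_seq.shape[-1], d_k))
        W_v = rng.normal(size=(source_seq.shape[-1], d_k))
        Q = target_seq @ W_q                      # (n_target, d_k)
        K = source_seq @ W_k                      # (n_source, d_k)
        V = source_seq @ W_v                      # (n_source, d_k)
        attn = softmax(Q @ K.T / np.sqrt(d_k))    # (n_target, n_source)
        return attn @ V                           # (n_target, d_k)

    def distillation_loss(student_logits, teacher_logits, T=2.0):
        """KL divergence between temperature-softened teacher and student
        distributions, scaled by T^2 as is conventional in distillation."""
        p_t = softmax(teacher_logits / T)
        p_s = softmax(student_logits / T)
        return float(np.sum(p_t * (np.log(p_t) - np.log(p_s))) * T ** 2)
    ```

    In a cross-domain setting, the teacher would be trained on the data-rich source domain and the student on the sparse target domain, so the distillation term softly transfers the teacher's preference distribution rather than hard labels.
    
    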
    Appears in Collections:[Graduate Institute of Information Management] Electronic Thesis & Dissertation



    All items in NCUIR are protected by copyright, with all rights reserved.

