Thesis 111423018: Complete Metadata Record

DC Field  Value  Language
dc.contributor  資訊管理學系  zh_TW
dc.creator  朱泳霖  zh_TW
dc.creator  Yung-Lin Chu  en_US
dc.date.accessioned  2024-07-22T07:39:07Z
dc.date.available  2024-07-22T07:39:07Z
dc.date.issued  2024
dc.identifier.uri  http://ir.lib.ncu.edu.tw:444/thesis/view_etd.asp?URN=111423018
dc.contributor.department  資訊管理學系  zh_TW
dc.description  國立中央大學  zh_TW
dc.description  National Central University  en_US
dc.description.abstract  隨著機器學習的蓬勃發展,各組織正在收集大量數據以提高模型性能。然而,隨著對隱私保護的重視,各國政府紛紛立法以保護隱私資料,這無形中增加了組織的數據管理成本。聯邦式學習(FL)的設計初衷是將隱私資料保留在客戶端,減少集中管理敏感數據的風險和負擔。然而,以往的聯邦式學習研究在實踐中遇到了許多挑戰,例如資料異質性、特徵傳輸效率低下以及額外計算量需求等問題,這些都阻礙了聯邦式學習技術的廣泛應用和發展。在我們的新方法中,我們引入了一個包含信任機制和差異聚合策略的兩層聯邦式學習框架(ET-FL)。我們將這一方法應用於多個真實數據集,並驗證了效果。  zh_TW
dc.description.abstract  With the flourishing of machine learning, organizations are gathering vast amounts of data to improve model performance. However, with growing concerns about data privacy, governments have enacted laws to safeguard private data, raising data-management costs for organizations. Federated Learning (FL) is designed to keep private data on the client side, reducing the risk and burden of centrally managing sensitive data. In practice, however, previous FL research has encountered challenges such as data heterogeneity, inefficient feature transmission, and additional computation overhead, which have hindered the wider adoption of FL. In our new approach, Elastic-Trust Hybrid Federated Learning (ET-FL), we introduce a two-layer FL framework with a trust mechanism and a differential aggregation strategy. We apply this method to several real-world datasets and demonstrate promising experimental results.  en_US
dc.subject  聯邦式學習  zh_TW
dc.subject  分散式聯邦式學習  zh_TW
dc.subject  混合式聯邦式學習  zh_TW
dc.subject  Federated Learning  en_US
dc.subject  Decentralized Federated Learning  en_US
dc.subject  Hybrid Federated Learning  en_US
dc.title  Elastic-Trust Hybrid Federated Learning  en_US
dc.language.iso  en_US  en_US
dc.type  博碩士論文  zh_TW
dc.type  thesis  en_US
dc.publisher  National Central University  en_US
