Master's/Doctoral Thesis 111423018: Detailed Record




Name: Yung-Lin Chu (朱泳霖)   Department: Information Management
Thesis Title: Elastic-Trust Hybrid Federated Learning
Related Theses
★ Taiwan 50 Trend Analysis: Prediction Based on a Multiple Long Short-Term Memory Model Architecture
★ Gold Price Prediction Analysis Based on Multiple Recurrent Neural Network Models
★ Incremental Learning for Defect Detection in Industry 4.0
★ A Study on Recurrent Neural Networks for Predicting Sales Prices of Computer Components
★ A Study on Long Short-Term Memory Networks for Phishing Website Prediction
★ A Study on Deep Learning-Based Recognition of Frequency-Hopping Signals
★ Opinion Leader Discovery in Dynamic Social Networks
★ Deep Learning Models for Virtual Metrology of Machines in Industry 4.0
★ A Novel NMF-Based Movie Recommendation with Time Decay
★ Category-Based Sequence-to-Sequence Model for POI Travel Itinerary Recommendation
★ A DQN-Based Reinforcement Learning Model for Neural Network Architecture Search
★ Neural Network Architecture Optimization Based on Virtual Reward Reinforcement Learning
★ Generative Adversarial Network Architecture Search
★ Neural Architecture Search Optimization with a Progressive Genetic Algorithm
★ Enhanced Model Agnostic Meta Learning with Meta Gradient Memory
★ Stock Price Prediction Using Recurrent Neural Networks Combined with Leading Industrial Wastewater Indicators
Files: Endnote RIS and BibTeX export available. Full text viewable in the system after 2029-07-01.
Abstract (Chinese): With the vigorous development of machine learning, organizations are collecting large amounts of data to improve model performance. However, as privacy protection has gained importance, governments have enacted legislation to protect private data, which has quietly increased organizations' data-management costs. Federated Learning (FL) was designed to keep private data on the client side, reducing the risk and burden of centrally managing sensitive data. However, prior FL research has faced many practical challenges, such as data heterogeneity, inefficient feature transmission, and extra computation demands, which have hindered the wider adoption and development of FL. In our new approach, we introduce a two-tier federated learning framework (ET-FL) that incorporates a trust mechanism and a differential aggregation strategy. We apply this method to several real-world datasets and validate its effectiveness.
Abstract (English): With the rapid growth of machine learning, organizations are gathering vast amounts of data to improve model performance. However, with increasing concerns about data privacy, governments have enacted laws to safeguard private data, raising data-management costs for organizations. Federated Learning (FL) is designed to keep private data on the clients, reducing the burden of centrally managing sensitive data. Previous FL research has encountered challenges such as data heterogeneity, inefficient feature transmission, and extra computing-power consumption. In our new approach, Elastic-Trust Hybrid Federated Learning (ET-FL), we introduce a two-tier FL framework with a trust mechanism and a differential aggregation strategy. We apply this methodology to several real-world datasets and demonstrate promising experimental results.
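The abstracts describe ET-FL only at a high level: a two-tier FL framework in which a trust mechanism shapes how client updates are combined. Since the full text is embargoed until 2029-07-01, the sketch below is a hypothetical illustration of trust-weighted aggregation in a two-tier setting, not the authors' actual algorithm; the tier layout, the local_tier_update and trust_weighted_aggregate helpers, and the static trust scores are all assumptions introduced for illustration.

# Hypothetical sketch of trust-weighted, two-tier federated averaging.
# All names and the tier structure are assumptions; the actual ET-FL
# local/global tiers, trust updates, and differential aggregation are
# described only in the embargoed thesis.
import numpy as np

def local_tier_update(model, data, lr=0.1, iters=10):
    """One client's local training pass (plain SGD on a linear model)."""
    X, y = data
    w = model.copy()
    for _ in range(iters):
        grad = X.T @ (X @ w - y) / len(y)   # gradient of mean squared error
        w -= lr * grad
    return w

def trust_weighted_aggregate(updates, trust_scores):
    """Average models, weighting each by a normalized trust score."""
    weights = np.asarray(trust_scores, dtype=float)
    weights /= weights.sum()
    return sum(w * u for w, u in zip(weights, updates))

# Toy run: two client groups (local tier) feeding one global-tier round.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(4):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

global_model = np.zeros(2)
group_trust = [1.0, 1.0]                  # assumed static, uniform trust
for rnd in range(5):
    # Local tier: each group trains its clients and aggregates internally.
    group_models = []
    for group in (clients[:2], clients[2:]):
        locals_ = [local_tier_update(global_model, d) for d in group]
        group_models.append(trust_weighted_aggregate(locals_, [1.0] * len(group)))
    # Global tier: combine group models using the trust weights.
    global_model = trust_weighted_aggregate(group_models, group_trust)
print("recovered weights:", global_model)  # converges toward [2.0, -1.0]

In a real system the trust scores would presumably be updated from observed behavior (for example, each group model's validation performance) rather than held fixed, and the thesis's differential aggregation strategy is not modeled here.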
Keywords: ★ Federated Learning
★ Decentralized Federated Learning
★ Hybrid Federated Learning
Table of Contents
Abstract (Chinese)
Abstract (English)
Table of Contents
List of Figures
List of Tables
1. Introduction
2. Related Works
   2.1 Federated Learning
   2.2 Decentralized Federated Learning
3. Methodology
   3.1 Local Tier
   3.2 Global Tier
4. Experiments and Evaluation
   4.1 Baseline and Metrics
   4.2 Performance Comparison
   4.3 Trust Weight Influence Analysis
   4.4 Iteration and Round Ratio Analysis
   4.5 Ablation Study
5. Conclusion
References
Advisor: Yi-Cheng Chen (陳以錚)   Review Date: 2024-07-22