    Please use this permanent URL to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/92747


    Title: TFMNN: Trusted Neural Network Inference using TF-M on MCUs
    Author: Yeh, Ting-Kai (葉庭愷)
    Contributor: Department of Computer Science and Information Engineering
    Keywords: Edge AI; AI Security; Neural Network; Microcontroller; Trusted Execution Environment; TrustedFirmware-M
    Date: 2023-08-08
    Upload time: 2023-10-04 16:09:55 (UTC+8)
    Publisher: National Central University
    Abstract: In today's IoT, NNs (Neural Networks) on MCUs (Microcontrollers) are widely used, in applications ranging from smart home appliances to robotic arms and electric vehicles. However, neural networks on MCUs face important security challenges, especially the risks of tampering and privacy attacks. This paper proposes TFMNN, a trusted NN framework for MCUs. TFMNN uses Arm TF-M (TrustedFirmware-M), which provides a TEE (Trusted Execution Environment) on MCUs to isolate sensitive operations and critical software components. MCUs typically have restricted computing resources and limited memory capacity; consequently, running NNs on MCUs presents the challenges of insufficient computing power and memory constraints. In addition, implementing security measures often requires incorporating additional mechanisms, which can increase the computational and memory overhead of the MCU. TFMNN not only maintains inference security under acceptable overhead but also optimizes the secure-memory usage of neural network inference. For NNs deployed on MCUs, model updates are typically necessary, for example when incorporating new data for learning and performance optimization. Traditionally, updating a model on the device may require reflashing the firmware, which is time-consuming and causes interruptions. Therefore, TFMNN offers secure model storage that makes it easy for model providers to update models. In summary, TFMNN, as a trusted NN framework designed specifically for MCUs, effectively addresses the security challenges faced by NNs on MCUs. By analyzing and discussing the overhead in real-world MCU applications, we demonstrate the feasibility of TFMNN.
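
    The abstract describes two interactions without giving TFMNN's actual programming interfaces: application code invoking inference that runs isolated inside the TEE, and a model provider updating weights through secure storage instead of reflashing firmware. The sketch below shows how such a non-secure-world client could look on top of the standard PSA APIs that TF-M exposes. It is a minimal sketch under assumptions: the names TFMNN_INFERENCE_SID, TFMNN_MSG_RUN_INFERENCE, TFMNN_MODEL_UID, tfmnn_update_model, and tfmnn_run_inference are hypothetical and are not part of TF-M or of the thesis.

        /* Hypothetical non-secure client for a TFMNN-style secure inference
         * service. Only the PSA calls (psa_connect, psa_call, psa_close,
         * psa_ps_set) are real TF-M interfaces; the service ID, message type,
         * and storage UID below are illustrative assumptions.               */
        #include <stddef.h>
        #include <stdint.h>
        #include "psa/client.h"             /* PSA Firmware Framework client API */
        #include "psa/protected_storage.h"  /* PSA Protected Storage API         */

        #define TFMNN_INFERENCE_SID      0x00000105U  /* assumed service ID    */
        #define TFMNN_INFERENCE_VERSION  1U
        #define TFMNN_MSG_RUN_INFERENCE  1            /* assumed psa_call type */
        #define TFMNN_MODEL_UID          ((psa_storage_uid_t)0x54464D4EU)

        /* Model-provider path: write an updated model blob into protected
         * storage so the secure partition can load it on the next inference,
         * with no firmware reflash.                                          */
        psa_status_t tfmnn_update_model(const uint8_t *model, size_t len)
        {
            return psa_ps_set(TFMNN_MODEL_UID, len, model, PSA_STORAGE_FLAG_NONE);
        }

        /* Application path: pass input features to the secure partition and
         * receive output scores; weights and intermediate activations stay
         * in secure memory.                                                  */
        psa_status_t tfmnn_run_inference(const float *input, size_t in_count,
                                         float *scores, size_t out_count)
        {
            psa_handle_t h = psa_connect(TFMNN_INFERENCE_SID, TFMNN_INFERENCE_VERSION);
            if (h <= 0) {
                return PSA_ERROR_CONNECTION_REFUSED;
            }

            psa_invec  in_vec[1]  = { { input,  in_count  * sizeof(float) } };
            psa_outvec out_vec[1] = { { scores, out_count * sizeof(float) } };

            psa_status_t status = psa_call(h, TFMNN_MSG_RUN_INFERENCE,
                                           in_vec, 1, out_vec, 1);
            psa_close(h);
            return status;
        }

    In this pattern the model never has to be linked into the non-secure image: the secure partition reads it back (e.g. with psa_ps_get) inside the TEE, which matches the abstract's claim that secure model storage lets providers update models without time-consuming reflashing.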
    Appears in Collections: [Graduate Institute of Computer Science and Information Engineering] Theses and Dissertations

    Files in this item:

    File          Description    Size    Format    Views
    index.html                   0Kb     HTML      48        View/Open


    All items in NCUIR are protected by original copyright.
