NCU Institutional Repository (中大機構典藏): Item 987654321/93421


    Please use this identifier to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/93421


    Title: TFMNN: Trusted Neural Network Inference using TF-M on MCUs
    Authors: Yeh, Ting-Kai (葉庭愷)
    Contributors: Department of Computer Science and Information Engineering
    Keywords: Edge AI; AI Security; Neural Network; Microcontroller; Trusted Execution Environment; TrustedFirmware-M
    Date: 2023-08-08
    Issue Date: 2024-09-19 17:01:32 (UTC+8)
    Publisher: National Central University (國立中央大學)
    Abstract: In today's IoT, neural networks (NN) deployed on microcontrollers (MCUs) are widely used, in applications ranging from smart home appliances to robotic arms and electric vehicles. However, neural networks on MCUs face important security challenges, especially the risks of tampering and privacy attacks. This paper presents TFMNN, a trusted NN inference framework for MCUs. TFMNN uses Arm TrustedFirmware-M (TF-M), which provides a Trusted Execution Environment (TEE) for MCUs to isolate sensitive operations and critical software components. MCUs typically have restricted computing resources and limited memory capacity, so running NN inference on them means dealing with insufficient computing power and tight memory constraints. In addition, implementing security measures usually requires additional mechanisms, which can increase the computational and memory overhead on the MCU. TFMNN not only maintains inference security under acceptable overhead but also optimizes the secure-memory usage of NN inference. For NN deployed on MCUs, model updates are typically necessary, for example when incorporating new data for learning and performance optimization. Traditionally, updating a model on the device may require firmware reflashing, which is time-consuming and causes interruptions. TFMNN therefore offers secure model storage that lets model providers update models easily. In summary, TFMNN, as a trusted NN framework designed specifically for MCUs, effectively addresses the security challenges faced by NN on MCUs. By analyzing and discussing the overhead in real-world MCU applications, we demonstrate the feasibility of TFMNN.
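
    The abstract gives no implementation details, but the C sketch below illustrates the kind of interaction it describes: a non-secure application installing an updated model blob through the PSA Protected Storage API and invoking inference inside a TF-M secure partition through the PSA Client API. The service ID, message semantics, and storage UID are hypothetical placeholders for illustration only, not taken from the thesis.

    /*
     * Sketch only (assumed service layout, not TFMNN's actual interface):
     * (a) store/update a model blob via PSA Protected Storage so the secure
     *     partition can reload it without reflashing firmware, and
     * (b) call a hypothetical secure inference service via the PSA Client API.
     */
    #include <stdint.h>
    #include <stddef.h>
    #include "psa/client.h"             /* PSA Client API: psa_connect/psa_call */
    #include "psa/protected_storage.h"  /* PSA Protected Storage API */

    #define TFMNN_INFERENCE_SID      0x00000F00u /* hypothetical service ID      */
    #define TFMNN_INFERENCE_VERSION  1u          /* hypothetical service version */
    #define TFMNN_MODEL_UID          ((psa_storage_uid_t)0x42u) /* hypothetical  */

    /* (a) Store or update the model blob in protected storage. */
    static psa_status_t tfmnn_update_model(const uint8_t *model, size_t len)
    {
        return psa_ps_set(TFMNN_MODEL_UID, len, model, PSA_STORAGE_FLAG_NONE);
    }

    /* (b) Send input features to the secure inference service and read back
     *     the output scores through the PSA IPC interface. */
    static psa_status_t tfmnn_infer(const float *input, size_t in_bytes,
                                    float *output, size_t out_bytes)
    {
        psa_invec  in_vec[1]  = { { input,  in_bytes  } };
        psa_outvec out_vec[1] = { { output, out_bytes } };

        psa_handle_t h = psa_connect(TFMNN_INFERENCE_SID, TFMNN_INFERENCE_VERSION);
        if (h <= 0) {
            return PSA_ERROR_CONNECTION_REFUSED;
        }

        psa_status_t status = psa_call(h, PSA_IPC_CALL, in_vec, 1, out_vec, 1);
        psa_close(h);
        return status;
    }

    In this sketch the heavy lifting (loading the stored model, running the network, enforcing isolation) would happen inside the secure partition; the non-secure side only exchanges input and output buffers, which matches the isolation and secure-storage goals described in the abstract.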
    Appears in Collections:[Graduate Institute of Computer Science and Information Engineering] Electronic Thesis & Dissertation

    Files in This Item:

    File: index.html    Size: 0Kb    Format: HTML


    All items in NCUIR are protected by copyright, with all rights reserved.
