Master's/Doctoral Thesis 108521094 Complete Metadata Record

DC Field | Value | Language
dc.contributor | Department of Electrical Engineering | zh_TW
dc.creator | 管祥祐 | zh_TW
dc.creator | Hsiang-Yu Kuan | en_US
dc.date.accessioned | 2021-8-23T07:39:07Z
dc.date.available | 2021-8-23T07:39:07Z
dc.date.issued | 2021
dc.identifier.uri | http://ir.lib.ncu.edu.tw:444/thesis/view_etd.asp?URN=108521094
dc.contributor.department | Department of Electrical Engineering | zh_TW
dc.description | National Central University | zh_TW
dc.description | National Central University | en_US
dc.description.abstract | This study uses wearable sensors to acquire time-series data representing the dynamics of human activity recognition (HAR), analyzes and recognizes the motions, reproduces them through a robot, and thereby develops a system for controlling a battle robot in real time. Each subject wears five inertial sensors made in our laboratory; each posture sensor contains a nine-axis inertial measurement unit (IMU) that measures angular velocity, acceleration, and geomagnetic field strength, and the data are transmitted wirelessly over WiFi. The sensors are placed on the four limbs and the waist to capture the subject's whole-body motion state. Subjects were asked to perform 11 fighting motions plus a resting motion as the actions of this HAR dataset. Two methods were used to label the motion intervals in the data for deep-learning training: the IMU training data were labeled either from the intervals of the reference video or by locating the onset and offset of each motion. The labeled data were segmented with different numbers of sensor channels and different window sizes to extract features as training data, and the motions were recognized with three networks, CNN, LSTM, and CNN + LSTM; the best model of each network and their parameter counts were compared. Experiments verify that the system recognizes the subject's motions in real time and that the robot can be controlled smoothly and performs the motions correctly. | zh_TW
dc.description.abstract | This study aims at dynamic human activity recognition (HAR) of different postures using a set of whole-body motion sensors. The recognized activities were applied to control a battle robot in real time. We used our homemade motion sensors, each consisting of a nine-axis inertial measurement unit (IMU) that measures 3-axis angular velocity, 3-axis acceleration, and 3-axis geomagnetic field strength. Five motion sensors were used to acquire the subject's instantaneous motion information, which was wirelessly transmitted to a remote PC for data processing through WiFi connections. The five motion sensors were attached to the subject's four limbs and the front of the waist. Subjects were requested to complete twelve motions, including eleven fighting motions and a resting motion, by moving their bodies to follow the fighting actions shown in a video clip. The twelve motions were labeled either by finding the breaks between two consecutive motion actions or by finding the onset-offset points of each motion action. The labeled data were analyzed using deep learning networks, and CNN, LSTM, and CNN + LSTM models were compared. The network parameters, as well as the number of sensor channels and the window size, were tuned to find the best model structure and parameters. The proposed system has been demonstrated to recognize the subject's motions in real time, and the battle robot can be controlled smoothly to perform the motions correctly. | en_US
dc.subject | Inertial measurement unit | zh_TW
dc.subject | Battle robot | zh_TW
dc.subject | CNN | zh_TW
dc.subject | Inertial motion unit | en_US
dc.subject | Battle robot | en_US
dc.subject | CNN | en_US
dc.title | Posture sensing analysis using deep learning networks and battle robot control | zh_TW
dc.language.iso | zh-TW | zh-TW
dc.title | Battle robot control using deep learning network based posture detection | en_US
dc.type | Master's/doctoral thesis | zh_TW
dc.type | thesis | en_US
dc.publisher | National Central University | en_US
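
The abstracts describe segmenting the labeled multi-channel IMU streams into fixed-size windows before training. A minimal sketch of that windowing step is given below; the function name, the default window and step sizes, and the majority-vote labeling rule are illustrative assumptions, since the record does not fix them:

import numpy as np

def sliding_windows(signal, labels, window_size=100, step=50):
    """Cut a labeled multi-channel IMU recording into fixed-length windows.

    signal: (T, C) array of IMU samples, where C is the number of selected
            sensor channels (up to 5 sensors x 9 axes = 45).
    labels: (T,) integer activity label per sample (0 = rest, 1-11 = fighting motions).
    Returns windows of shape (N, window_size, C) and one label per window.
    """
    X, y = [], []
    for start in range(0, len(signal) - window_size + 1, step):
        X.append(signal[start:start + window_size])
        # Label each window by the most frequent per-sample label inside it.
        y.append(np.bincount(labels[start:start + window_size]).argmax())
    return np.stack(X), np.array(y)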

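The abstracts also compare CNN, LSTM, and CNN + LSTM classifiers over the windowed nine-axis data. A minimal Keras sketch of the hybrid CNN + LSTM variant is shown below; the layer sizes, window length, and channel count are assumptions for illustration, as the record only states that several configurations were compared:

import tensorflow as tf

WINDOW_SIZE = 100  # assumed samples per window
N_CHANNELS = 45    # 5 sensors x 9 axes (gyroscope, accelerometer, magnetometer)
N_CLASSES = 12     # 11 fighting motions + 1 resting motion

# 1-D convolutions extract local motion features; the LSTM models their temporal order.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW_SIZE, N_CHANNELS)),
    tf.keras.layers.Conv1D(64, kernel_size=5, activation="relu", padding="same"),
    tf.keras.layers.MaxPooling1D(pool_size=2),
    tf.keras.layers.Conv1D(128, kernel_size=5, activation="relu", padding="same"),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(N_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])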