Graduate Thesis 101522098: Detailed Record




Author  Chieh-ying Wu (吳倢瑩)   Department  Computer Science and Information Engineering
Thesis Title  A Kinect-based Fall Detection and Activity Monitoring System
(基於Kinect的跌倒偵測與行為監控系統)
Related Theses
★ A Q-Learning-Based Swarm Intelligence Algorithm and Its Applications ★ Development of a Rehabilitation System for Children with Developmental Delays
★ Comparing Teacher Assessment and Peer Assessment from the Perspective of Cognitive Styles: From English Writing to Game Making ★ A Prediction Model for Diabetic Nephropathy Based on Laboratory Test Values
★ Design of a Remote Sensing Image Classifier Based on Fuzzy Neural Networks ★ A Hybrid Clustering Algorithm
★ Development of Assistive Devices for People with Disabilities ★ A Study of Fingerprint Classifiers
★ A Study of Backlit Image Compensation and Color Quantization ★ Application of Neural Networks to Business Income Tax Audit Case Selection
★ A New Online Learning System and Its Application to Tax Audit Case Selection ★ An Eye-Tracking System and Its Applications to Human-Computer Interfaces
★ Data Visualization Combining Swarm Intelligence and Self-Organizing Maps ★ Development of a Pupil-Tracking System for Human-Computer Interface Applications for People with Disabilities
★ An Artificial Immune System-Based Online Learning Neuro-Fuzzy System and Its Applications ★ Application of Genetic Algorithms to Speech Descrambling
  1. The author has agreed to make this electronic thesis openly available immediately.
  2. For theses that have reached their open-access date, the electronic full text is licensed to users solely for personal, non-profit retrieval, reading, and printing for academic research purposes.
  3. Please comply with the Copyright Act of the Republic of China (Taiwan); do not reproduce, distribute, adapt, repost, or broadcast this work without authorization.

Abstract (Chinese)  Because older adults living alone and hospital patients are constantly at great risk of delayed assistance after a fall, research on fall detection systems has flourished in recent years. The goal of this thesis is to develop a Kinect-based fall detection and activity monitoring system that provides a safer living environment for older adults who live alone. The system discards the skeletal tracking information and works directly on the raw depth data from the Kinect. It first detects the original ground-plane information; a simple dynamic background subtraction algorithm then extracts foreground pixels from the differences between the currently detected ground information and the original ground information. When a foreground object with a certain minimum size and height appears, the system treats it as a candidate for the monitored person and tracks and analyzes this moving object. A decision tree classifies the person's daily activities into five major states: standing, walking, sitting, lying, and squatting. Whenever a fall event is detected, the system immediately issues an alert to the caregiver. The system also automatically generates a fall event record (e.g., when, where, and how the fall happened), providing more meaningful information to health care providers. Beyond fall detection, the proposed system can also log daily activity information (e.g., periods spent lying in bed, periods spent in the lavatory, periods spent walking) for later analysis.
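To make the five-state classification concrete, the following minimal Python sketch shows how a hand-built decision tree over simple depth-derived features might separate the five states and flag a fall. The feature names (height of the tracked person relative to a calibrated standing height, centroid speed, bounding-box aspect ratio) and every threshold are illustrative assumptions, not the rules used in the thesis.

# Hypothetical decision-tree sketch for the five activity states described
# above. All feature names and thresholds are illustrative assumptions.

def classify_state(height_ratio: float, speed_mps: float, aspect_ratio: float) -> str:
    """Classify one frame as stand / walk / sit / squat / lie."""
    if height_ratio > 0.85:                # body close to full standing height
        return "walk" if speed_mps > 0.3 else "stand"
    if height_ratio > 0.5:                 # torso clearly lowered
        return "sit" if aspect_ratio < 1.0 else "squat"
    return "lie"                           # body close to the ground plane

def is_fall(prev_state: str, state: str, drop_speed_mps: float) -> bool:
    """Flag a fall: an abrupt transition into 'lie' with a fast height drop."""
    return state == "lie" and prev_state in ("stand", "walk") and drop_speed_mps > 1.0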
The experiments cover three scenarios: scenario one tests whether false alarms occur when only normal activities are performed; scenario two tests whether fall events are correctly detected when the environment does not change; scenario three tests whether fall events are correctly detected when the environment does change. The fall detection experiments comprised 90 fall events in total, yielding a precision of about 94% and a recall of about 96%.
Abstract (English)  Older individuals living alone at home or staying in hospital wards are usually at great risk of delayed assistance following a fall; research on fall detection systems has therefore grown rapidly in recent years. This thesis aims to develop a Kinect-based fall detection and activity monitoring system to provide a safer living environment for older individuals living alone. The raw data from the Kinect depth images are processed directly, rather than the skeletal tracking information. The system starts from the detection of the ground plane; a simple dynamic background subtraction algorithm is then used to identify foreground pixels from the changes between the currently detected ground plane and the original background ground plane. A foreground object with at least a minimum size and height is considered a candidate for the older individual, and this moving object is then tracked and analyzed. Decision trees are adopted to divide the daily activities of the older individual into five major types: standing, walking, sitting, lying, and squatting. The system issues a warning signal to caregivers whenever a fall event is detected. In addition, the system automatically generates a fall event record (e.g., when, where, and how the fall happened) that provides valuable information for health care providers. Beyond the detection of falls, the proposed system can also provide information about daily activities (e.g., the time periods spent lying in bed, entering a lavatory, or walking) for later analysis.
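As a rough illustration of the foreground extraction step described above (not the author's implementation; the background model, helper names, and thresholds are all assumptions), a depth-based dynamic background subtraction could be sketched in Python as follows. The accepted blob would then be handed to the tracker and state classifier.

import numpy as np

# Hypothetical sketch: subtract a background depth model from the current
# depth frame, keep the pixels that changed, and accept the result as a
# person candidate only if the blob is large and tall enough. All names and
# thresholds are illustrative assumptions, not the thesis' parameters.

DEPTH_DIFF_MM = 150      # depth change treated as foreground (millimetres)
MIN_PIXELS    = 2000     # minimum blob size for a person candidate
MIN_HEIGHT_MM = 400      # minimum blob height above the ground plane

def extract_candidate(depth, background, height_map):
    """Return a boolean foreground mask, or None if no valid candidate."""
    valid = (depth > 0) & (background > 0)            # ignore missing depth
    diff = np.abs(depth.astype(np.int32) - background.astype(np.int32))
    moved = valid & (diff > DEPTH_DIFF_MM)
    if moved.sum() < MIN_PIXELS:                      # too small to be a person
        return None
    if height_map[moved].max() < MIN_HEIGHT_MM:       # too flat to be a person
        return None
    return moved

def update_background(background, depth, foreground, alpha=0.02):
    """Slowly blend non-foreground pixels into the background model."""
    keep = np.ones(depth.shape, dtype=bool) if foreground is None else ~foreground
    background[keep] = ((1.0 - alpha) * background[keep]
                        + alpha * depth[keep]).astype(background.dtype)
    return background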

The performance of the proposed system was verified through three experimental scenarios. The first scenario was designed to test whether false alarms would occur under normal daily movements. The second was designed to test whether falls could be correctly detected when the environment does not change, and the third whether falls could still be correctly detected after changes to the environment. Among 90 fall events, the precision and recall were approximately 94% and 96%, respectively.
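For reference, precision and recall follow the standard definitions, and with 90 true fall events the reported rates imply roughly the following counts (back-calculated approximations; the record does not state the actual TP/FP/FN values):

Precision = TP / (TP + FP),   Recall = TP / (TP + FN)
Recall ≈ 0.96 over 90 falls  =>  TP ≈ 0.96 × 90 ≈ 86, so FN ≈ 4 missed falls
Precision ≈ 0.94             =>  FP ≈ TP × (1 − 0.94) / 0.94 ≈ 5 to 6 false alarms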
Keywords ★ fall detection
★ Kinect
★ behavior recognition
★ ambient-assisted living tools
★ video surveillance
Table of Contents
ABSTRACT
Acknowledgements
Table of Contents
List of Figures
List of Tables
1. Introduction
1-1 Research Motivation
1-2 Research Objectives
1-3 Thesis Organization
2. Related Work
2-1 The Significance of Falls
2-1-1 Home Safety for Older Adults
2-1-2 Ward Safety
2-1-3 Current Status
2-1-4 Fall Patterns
2-2 Human Activity Analysis
2-2-1 System Architecture
2-2-2 Feature Representation
2-2-3 Activity Analysis
3. Fall Detection System
3-1 System Architecture
3-1-1 Hardware Overview
3-1-2 Software Architecture
3-2 Environment Module
3-2-1 Spatial Model
3-2-2 Marker Model
3-3 Activity Monitoring Module
3-3-1 Target Detection
3-3-2 Tracking and Localization
3-3-3 Feature Extraction
3-3-4 State Definition and Classification
3-3-5 Fall Determination
3-4 Environment Update
3-5 User Interface
3-5-1 Activity Logging
3-5-2 Voice Prompt Interface
4. Experimental Design and Results
4-1 Experimental Design
4-2 Activity Analysis Experiments
4-2-1 State Classification Results
4-2-2 Fall Detection and Recognition Results
4-3 Ambient Lighting Experiments
4-4 Environment Modeling Errors
5. Conclusions and Future Work
5-1 Conclusions
5-2 Future Work
References
Advisor  Mu-chun Su (蘇木春)   Date of Approval  2014-08-07