References
[1] "台北市政府衛生局."
[2] 鄭以晨, 曾雅梅, and 簡戊鑑, "台灣 2009 年 65 歲以上老人跌墜傷患住院醫療利用及影響因子之探討," 臺灣老人保健學刊, vol. 7, pp. 55-71, 2011.
[3] "台灣病人安全通報系統."
[4] 陳玉枝, 林麗華, and 簡淑芬, "住院病患傷害性跌倒的影響因素與其醫療資源耗用之相關性," 志為護理-慈濟護理雜誌, vol. 1, pp. 66-77, 2002.
[5] 莊蕙琿, 黃焜煌, 王素美, and 劉穗蘭, "住院病患跌倒事件分析-以某區域教學醫院為例," 澄清醫護管理雜誌, vol. 4, pp. 23-28, 2008.
[6] T. Degen, H. Jaeckel, M. Rufer, and S. Wyss, "SPEEDY: A Fall Detector in a Wrist Watch," in ISWC, 2003, pp. 184-189.
[7] A. Diaz, M. Prado, L. Roa, J. Reina-Tosina, and G. Sanchez, "Preliminary evaluation of a full-time falling monitor for the elderly," in Proceedings of the 26th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (IEMBS '04), 2004, pp. 2180-2183.
[8] G. Brown, "An accelerometer based fall detector: development, experimentation, and analysis," University of California, Berkeley, 2005.
[9] T. R. Hansen, J. M. Eklund, J. Sprinkle, R. Bajcsy, and S. Sastry, "Using smart sensors and a camera phone to detect and verify the fall of elderly persons," in European Medicine, Biology and Engineering Conference, 2005.
[10] C. Marzahl, P. Penndorf, I. Bruder, and M. Staemmler, "Unobtrusive fall detection using 3D images of a gaming console: Concept and first results," in Ambient Assisted Living, ed: Springer, 2012, pp. 135-146.
[11] (2011). Tunstall: Sturzdetektion. Available: http://www.hausnotruf-shop.de/Tunstall-Piper-FallDetector.
[12] "Sen Cit + monitors 2011."
[13] G. Wu and S. Xue, "Portable preimpact fall detector with inertial sensors," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 16, pp. 178-183, 2008.
[14] M. Gövercin, J. Spehr, S. Winkelbach, E. Steinhagen-Thiessen, and F. Wahl, "Visual fall detection system in home environments," Gerontechnology, vol. 7, p. 114, 2008.
[15] "signaKom: Sturzmatte," 2011.
[16] "BMBF Projekt SensFloor."
[17] "Future Shape: SensFloor Fußboden."
[18] H. Nait-Charif and S. J. McKenna, "Activity summarisation and fall detection in a supportive home environment," in Proceedings of the 17th International Conference on Pattern Recognition (ICPR 2004), 2004, pp. 323-326.
[19] C. Rougier, J. Meunier, A. St-Arnaud, and J. Rousseau, "Fall detection from human shape and motion history using video surveillance," in Proceedings of the 21st International Conference on Advanced Information Networking and Applications Workshops (AINAW '07), 2007, pp. 875-880.
[20] B. U. Töreyin, Y. Dedeoğlu, and A. E. Çetin, "HMM based falling person detection using both audio and video," in Computer Vision in Human-Computer Interaction, ed: Springer, 2005, pp. 211-220.
[21] M.-L. Wang, C.-C. Huang, and H.-Y. Lin, "An intelligent surveillance system based on an omnidirectional vision sensor," in Proceedings of the 2006 IEEE Conference on Cybernetics and Intelligent Systems, 2006, pp. 1-6.
[22] C.-L. Chen, "智慧型影像監控老人跌倒偵測," 臺北大學資通科技產業碩士專班學位論文, 2013.
[23] . "Microsoft Kinect.."
[24] B. Jansen, F. Temmermans, and R. Deklerck, "3D human pose recognition for home monitoring of elderly," in Engineering in Medicine and Biology Society, 2007. EMBS 2007. 29th Annual International Conference of the IEEE, 2007, pp. 4049-4051.
[25] G. Diraco, A. Leone, and P. Siciliano, "An active vision system for fall detection and posture recognition in elderly healthcare," in Design, Automation & Test in Europe Conference & Exhibition (DATE), 2010, 2010, pp. 1536-1541.
[26] C. Rougier, E. Auvinet, J. Rousseau, M. Mignotte, and J. Meunier, "Fall detection from depth map video sequences," in Toward Useful Services for Elderly and People with Disabilities, ed: Springer, 2011, pp. 121-128.
[27] G. Mastorakis and D. Makris, "Fall detection system using Kinect’s infrared sensor," Journal of Real-Time Image Processing, pp. 1-12, 2012.
[28] E. Stone and M. Skubic, "Fall detection in homes of older adults using the Microsoft Kinect," 2014.
[29] H.-h. Liu, "Using Kinect Do the Detection of Indoor Falls," National Central University, Taiwan, 2012.
[30] R. Planinc and M. Kampel, "Introducing the use of depth data for fall detection," Personal and Ubiquitous Computing, vol. 17, pp. 1063-1072, 2013.
[31] N. Noury, P. Rumeau, A. Bourke, G. OLaighin, and J. Lundy, "A proposal for the classification and evaluation of fall detectors," IRBM, vol. 29, pp. 340-349, 2008.
[32] J. W. Davis and A. F. Bobick, "The representation and recognition of human movement using temporal templates," in Proceedings of the 1997 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 1997, pp. 928-934.
[33] D. Weinland, R. Ronfard, and E. Boyer, "A survey of vision-based methods for action representation, segmentation and recognition," Computer Vision and Image Understanding, vol. 115, pp. 224-241, 2011.
[34] M. Cristani, R. Raghavendra, A. Del Bue, and V. Murino, "Human behavior analysis in video surveillance: A social signal processing perspective," Neurocomputing, vol. 100, pp. 86-97, 2013.
[35] J. Zhang, X.-b. Mao, and T.-j. Chen, "Survey of moving object tracking algorithm," Application Research of Computers, vol. 12, p. 001, 2009.
[36] S. Brahnam and L. Nanni, "High Performance Set of Features for Human Action Classification," in IPCV, 2009, pp. 980-984.
[37] K. Onishi, T. Takiguchi, and Y. Ariki, "3D human posture estimation using the HOG features from monocular image," in Proceedings of the 19th International Conference on Pattern Recognition (ICPR 2008), 2008, pp. 1-4.
[38] R. Polana and R. Nelson, "Low level recognition of human motion (or how to get your man without finding his body parts)," in Proceedings of the 1994 IEEE Workshop on Motion of Non-Rigid and Articulated Objects, 1994, pp. 77-82.
[39] H. Zhang and L. E. Parker, "4-dimensional local spatio-temporal features for human activity recognition," in Proceedings of the 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2011, pp. 2044-2049.
[40] S. Gumhold, X. Wang, and R. MacLeod, "Feature extraction from point clouds," in Proceedings of the 10th International Meshing Roundtable, 2001.
[41] K. Tran, I. A. Kakadiaris, and S. K. Shah, "Fusion of human posture features for continuous action recognition," in Trends and Topics in Computer Vision, ed: Springer, 2012, pp. 244-257.
[42] L. Rabiner, "A tutorial on hidden Markov models and selected applications in speech recognition," Proceedings of the IEEE, vol. 77, pp. 257-286, 1989.
[43] S. Carlsson and J. Sullivan, "Action recognition by shape matching to key frames," in Workshop on Models versus Exemplars in Computer Vision, 2001, p. 18.
[44] K. Murphy, "An introduction to graphical models," A Brief Introduction to Graphical Models and Bayesian Networks, vol. 10, 2001.
[45] J. Aggarwal and M. S. Ryoo, "Human activity analysis: A review," ACM Computing Surveys (CSUR), vol. 43, p. 16, 2011.
[46] L. Bao and S. S. Intille, "Activity recognition from user-annotated acceleration data," in Pervasive computing, ed: Springer, 2004, pp. 1-17.
[47] E. M. Tapia, S. S. Intille, and K. Larson, Activity recognition in the home using simple and ubiquitous sensors: Springer, 2004.
[48] B. Logan, J. Healey, M. Philipose, E. M. Tapia, and S. Intille, A long-term evaluation of sensing modalities for activity recognition: Springer, 2007.
[49] T. Huỳnh, U. Blanke, and B. Schiele, "Scalable recognition of daily activities with wearable sensors," in Location-and context-awareness, ed: Springer, 2007, pp. 50-67.
[50] L. Zhao, X. Wang, G. Sukthankar, and R. Sukthankar, "Motif discovery and feature selection for CRF-based activity recognition," in Proceedings of the 20th International Conference on Pattern Recognition (ICPR), 2010, pp. 3826-3829.
[51] T. van Kasteren and B. Krose, "Bayesian activity recognition in residence for elders," 2007.
[52] P. Rashidi and D. J. Cook, "Mining sensor streams for discovering human activity patterns over time," in Proceedings of the 10th IEEE International Conference on Data Mining (ICDM), 2010, pp. 431-440.
[53] D. Lymberopoulos, A. Bamis, and A. Savvides, "Extracting spatiotemporal human activity patterns in assisted living using a home sensor network," Universal Access in the Information Society, vol. 10, pp. 125-138, 2011.
[54] T. Gu, Z. Wu, X. Tao, H. K. Pung, and J. Lu, "epSICAR: An emerging patterns based approach to sequential, interleaved and concurrent activity recognition," in Proceedings of the IEEE International Conference on Pervasive Computing and Communications (PerCom 2009), 2009, pp. 1-9.
[55] P. Rashidi and A. Mihailidis, "A survey on ambient-assisted living tools for older adults," IEEE Journal of Biomedical and Health Informatics, vol. 17, pp. 579-590, 2013.
[56] W.-c. Lin, "An Imaged-based Navigation System for the Blind," 2012.
[57] Vitruvian Man. Available: http://englishclass.jp/reading/topic/Vitruvian_Man