References
[1] W. Freeman and M. Roth, “Orientation histograms for hand gesture recognition,” in Proc. of the International Workshop on Automatic Face and Gesture Recognition, vol. 12, pp. 296–301, 1995.
[2] VRLogic Co. [Online]. Available: http://www.vrlogic.com/html/5dt/5dt_dataglove_5.html. [Accessed: 06-Jun-2015].
[3] Measurand Inc. [Online]. Available: http://www.shapehand.com/shapehand.html. [Accessed: 03-Jul-2015].
[4] J. M. Rehg and T. Kanade, “DigitEyes: vision-based hand tracking for human-computer interaction,” in Proc. of the Workshop on Motion of Non-Rigid and Articulated Bodies, pp. 16–22, 1994.
[5] E. Ueda, Y. Matsumoto, M. Imai, and T. Ogasawara, “Hand pose estimation for vision-based human interface,” in Proc. of the IEEE International Workshop on Robot and Human Interactive Communication, pp. 473–478, 2001.
[6] A. Causo, M. Matsuo, E. Ueda, K. Takemura, Y. Matsumoto, J. Takamatsu, and T. Ogasawara, “Hand pose estimation using voxel-based individualized hand model,” in Proc. of the IEEE/ASME International Conference on Advanced Intelligent Mechatronics, pp. 451–456, 2009.
[7] R. Yang and S. Sarkar, “Gesture recognition using hidden Markov models from fragmented observations,” in Proc. of the IEEE Conference on Computer Vision and Pattern Recognition, 2006.
[8] M. Elmezain, A. Al-Hamadi, G. Krell, and S. El-Etriby, “Gesture recognition for alphabets from hand motion trajectory using hidden Markov models,” in Proc. of the IEEE International Symposium on Signal Processing and Information Technology, pp. 1192–1197, 2007.
[9] M. A. Amin and H. Yan, “Sign language finger alphabet recognition from Gabor-PCA representation of hand gestures,” in Proc. of the Sixth International Conference on Machine Learning and Cybernetics, vol. 4, pp. 2218–2223, 2007.
[10] Y. Fang, J. Cheng, K. Wang, and H. Lu, “Hand gesture recognition using fast multi-scale analysis,” in Proc. of the IEEE International Conference on Image and Graphics, 2007.
[11] M. Vafadar and A. Behrad, “Human hand gesture recognition using motion orientation histogram for interaction of handicapped persons with computer,” in Lecture Notes in Computer Science, vol. 5099, pp. 378–385, 2008.
[12] A. F. Bobick and A. D. Wilson, “A state-based approach to the representation and recognition of gesture,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 19, no. 12, pp. 1325–1337, 1997.
[13] J. F. Lichtenauer, E. A. Hendriks, and M. J. T. Reinders, “Sign language recognition by combining statistical DTW and independent classification,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 30, no. 11, pp. 2040–2046, 2008.
[14] M. V. Lamar, M. S. Bhuiyan, and A. Iwata, “T-CombNET - a neural network dedicated to hand gesture recognition,” in Lecture Notes in Computer Science, vol. 1811, pp. 613–622, 2000.
[15] E. Stergiopoulou, N. Papamarkos, and A. Atsalakis, “Hand gesture recognition via a new self-organized neural network,” in Lecture Notes in Computer Science, vol. 3773, pp. 891–904, 2005.
[16] P. Hong, M. Turk, and T. Huang, “Gesture modeling and recognition using finite state machines,” in Proc. of the Fourth IEEE International Conference on Automatic Face and Gesture Recognition, pp. 410–415, 2000.
[17] M. A. Amin and H. Yan, “Sign language finger alphabet recognition from Gabor-PCA representation of hand gestures,” in Proc. of the Sixth International Conference on Machine Learning and Cybernetics, pp. 2218–2223, 2007.
[18] 劉東樺, “A real-time hand gesture recognition system based on adaptive skin color detection and motion history image,” Master's thesis, Department of Computer Science and Engineering, Tatung University, Taiwan, 2009.
[19] P. Premaratne and Q. Nguyen, “Consumer electronics control system based on hand gesture moment invariants,” IET Computer Vision, vol. 1, no. 1, pp. 35–41, Mar. 2007.
[20] Myo. [Online]. Available: https://www.thalmic.com/myo/. [Accessed: 10-Jun-2015].
[21] Engadget Co. [Online]. Available: http://chinese.engadget.com/2008/06/14/toshiba-qosmio-g55-features-spursengine-visual-gesture-controls/. [Accessed: 16-Jun-2015].
[22] Sotouch Co. [Online]. Available: http://www.so-touch.com/?id=software&content=air-presenter#/software/air-presenter. [Accessed: 20-Jun-2015].
[23] Xbox Co. [Online]. Available: http://www.xbox.com/en-US/kinect. [Accessed: 16-Jun-2015].
[24] Tvvoluse AG Inc. [Online]. Available: http://www.win-ni.com/. [Accessed: 18-Jun-2015].
[25] Microsoft Xbox. [Online]. Available: http://www.microsoft.com/en-us/kinectforwindows/. [Accessed: 29-May-2015].
[26] Microsoft Xbox. [Online]. Available: http://www.waybeta.com/news/58230/microsoft-kinect-somatosensory-gamedevice-full-disassembly-report-_microsoft-xbox. [Accessed: 29-May-2015].
[27] Microsoft Developer Network. [Online]. Available: https://msdn.microsoft.com/en-us/library/hh438998.aspx. [Accessed: 02-May-2015].
[28] Kinect sensor. [Online]. Available: http://msdn.microsoft.com/zh-tw/hh367958.aspx. [Accessed: 15-May-2015].
[29] Kinect depth-point density distribution chart. [Online]. Available: http://kheresy.files.wordpress.com/2011/12/depthhistogram.png?w=630. [Accessed: 02-May-2015].
[30] G. Fanelli, J. Gall, and L. Van Gool, “Real time head pose estimation with random regression forests,” in Proc. of the IEEE Conference on Computer Vision and Pattern Recognition, 2011.
[31] NASA World Wind. [Online]. Available: http://worldwind.arc.nasa.gov/features.html. [Accessed: 02-May-2015].
[32] 鄧己正, “Vision-based face recognition theory,” Master's thesis, Department of Computer Science and Information Engineering, National Central University, Taiwan, 2001.
[33] M. Soriano and B. Martinkauppi, “Using the skin locus to cope with changing illumination conditions in color-based face tracking,” in Proc. of the IEEE Nordic Signal Processing Symposium, pp. 383–386, 2000.
[34] 林文章, “Skin color detection and face localization in different scenes,” Master's thesis, Department of Electrical Engineering, National Central University, Taiwan, 2009.
[35] M. Hu, S. Worrall, A. H. Sadka, and A. A. Kondoz, “Face feature detection and model design for 2D scalable model-based video coding,” in Proc. of the International Conference on Visual Information Engineering (VIE 2003), pp. 125–128, 2003.
[36] 蘇芳生, “A facial expression recognition system,” Master's thesis, Department of Communications Engineering, National Chung Cheng University, Taiwan, 2004.
[37] 曾郁展, “A DSP-based real-time face recognition system,” Master's thesis, Department of Electrical Engineering, National Sun Yat-sen University, Taiwan, 2005.
[38] 蔡嵩陽, “A real-time hand shape recognition system and its application to home appliance control,” Master's thesis, Department of Computer Science and Information Engineering, National Central University, Taiwan, 2011.
[39] J. Sauro, “A Practical Guide to the System Usability Scale: Background, Benchmarks & Best Practices,” 2014.