References
[1] Microsoft Kinect. [Online]. Available: http://www.microsoft.com/en-us/kinectforwindows/discover/features.aspx. Accessed: Jun. 26, 2012.
[2] G. Fanelli, J. Gall, and L. Van Gool, “Real Time Head Pose Estimation with Random Regression Forests,” in IEEE Conference on Computer Vision and Pattern Recognition, 2011.
[3] A. D. Wilson, “Using a Depth Camera As a Touch Sensor,” in Proc. of ACM International Conference on Interactive Tabletops and Surfaces, pp. 69-72, 2010.
[4] National Palace Museum (國立故宮博物院). [Online]. Available: http://www.npm.gov.tw/exh99/npm_digital/ch3.html. Accessed: Jun. 26, 2012.
[5] Scratch interactive music table, KT Creativity Award, 4th Digital Art Festival Taipei (第四屆台北數位藝術節KT科藝獎). [Online]. Available: http://www.youtube.com/watch?v=BSutayY9ik0. Accessed: Jun. 26, 2012.
[6] J. M. Rehg and T. Kanade, “DigitEyes: Vision-based hand tracking for human-computer interaction,” in Proc. of the Workshop on Motion of Non-Rigid and Articulated Bodies, pp. 16-22, 1994.
[7] E. Ueda, Y. Matsumoto, M. Imai, and T. Ogasawara, “Hand Pose Estimation for Vision-based Human Interface,” IEEE Trans. on Industrial Electronics, vol. 50, no. 4, pp. 676-684, 2003.
[8] A. Causo, M. Matsuo, E. Ueda, K. Takemura, Y. Matsumoto, J. Takamatsu, and T. Ogasawara, “Hand pose estimation using voxel-based individualized hand model,” in IEEE/ASME International Conference on Advanced Intelligent Mechatronics, pp. 451-456, Jul. 2009.
[9] R. Yang and S. Sarkar, “Gesture Recognition Using Hidden Markov Models from Fragmented Observations,” in Proc. IEEE Conference on Computer Vision and Pattern Recognition, 2006.
[10] M. Elmezain and A. Al-Hamadi, “Gesture Recognition for Alphabets from Hand Motion Trajectory Using Hidden Markov Models,” in IEEE International Symposium on Signal Processing and Information Technology, pp. 1192-1197, Dec. 2007.
[11] M. A. Amin and H. Yan, “Sign Language Finger Alphabet Recognition From Gabor-PCA Representation of Hand Gestures,” in Proc. of the Sixth International Conference on Machine Learning and Cybernetics, pp. 2218-2223, 2007.
[12] Y. Fang, J. Cheng, K. Wang, and H. Lu, “Hand Gesture Recognition Using Fast Multi-scale Analysis,” in Proc. of IEEE International Conference on Image and Graphics, 2007.
[13] M. Vafadar and A. Behrad, “Human Hand Gesture Recognition Using Motion Orientation Histogram for Interaction of Handicapped Persons with Computer,” Lecture Notes In Computer Science, vol. 5099, pp. 378-385, 2008.
[14] A. Bobick and A. Wilson, “A state based approach to the representation and recognition of gesture,” IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 19, no. 12, pp. 1325-1337, Dec. 1997.
[15] J. F. Lichtenauer, E. A. Hendriks, and M. J. T. Reinders, “Sign Language Recognition by Combining Statistical DTW and Independent Classification,” IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 30, no. 11, pp. 2040-2046, 2008.
[16] M. V. Lamar, M. S. Bhuiyan, and A. Iwata, “T-CombNET - A Neural Network Dedicated to Hand Gesture Recognition,” Lecture Notes In Computer Science, vol. 1811, pp. 613-622, 2000.
[17] E. Stergiopoulou, N. Papamarkos, and A. Atsalakis, “Hand Gesture Recognition Via a New Self-organized Neural Network,” Lecture Notes In Computer Science, vol. 3773, pp. 891-904, 2005.
[18] P. Hong, M. Turk, and T. Huang, “Gesture modeling and recognition using finite state machines,” in Proc. Fourth IEEE International Conference on Automatic Face and Gesture Recognition, pp. 410-415, 2000.
[19] M. A. Amin and H. Yan, “Sign Language Finger Alphabet Recognition From Gabor-PCA Representation of Hand Gestures,” in Proc. of the Sixth International Conference on Machine Learning and Cybernetics, pp. 2218-2223, 2007.
[20] Piano toy (鋼琴玩具). [Online]. Available: http://udn.gohappy.com.tw/shopping/Browse.do?op=vp&sid=9&cid=60974&pid=1225174. Accessed: Jun. 26, 2012.
[21] Piano gloves (鋼琴手套). [Online]. Available: http://www.diytrade.com/china/3/products/6813317/%E9%8B%BC%E7%90%B4%E6%89%8B%E5%A5%97.html. Accessed: Jun. 26, 2012.
[22] Saxophone Toy. [Online]. Available: http://www.amazon.com/The-Little-Toy-Co-120580/dp/B000OP31KC/ref=pd_sim_t_1. Accessed: Jun. 26, 2012.
[23] Trumpet Toy. [Online]. Available: http://www.amazon.com/The-Little-Toy-Co-120571/dp/B000P4XR56/ref=cm_cmu_pg__header. Accessed: Jun. 26, 2012.
[24] Guitar Hero (吉他英雄). [Online]. Available: http://hub.guitarhero.com/. Accessed: Jun. 26, 2012.
[25] Air Guitar Pro. [Online]. Available: http://www.geekalerts.com/air-guitar-pro-infrared-ray-instead-of-strings/. Accessed: Jun. 26, 2012.
[26] Toy xylophone (敲擊木琴). [Online]. Available: http://twins.shop.rakuten.tw/200000000898422/. Accessed: Jun. 26, 2012.
[27] Violin music toy (小提琴音樂玩具). [Online]. Available: http://buy.yahoo.com.tw/gdsale/gdsale.asp?gdid=1727278&co_servername=d2b13b620e66407ad4c0547de27c5982#. Accessed: Jun. 26, 2012.
[28] S. Jorda, G. Geiger, M. Alonso, and M. Kaltenbrunner, “The reacTable: exploring the synergy between live music performance and tabletop tangible interfaces,” in 1st International Conference on Tangible and Embedded Interaction, pp. 139-146, 2007.
[29] G. Weinberg, “The beatbug – evolution of a musical controller,” Digital Creativity, vol. 19, no. 1, pp. 13-18, 2008.
[30] J. Nielsen, N. K. Barendsen, and C. Jessen, “RoboMusicKids – Music education with robotic building blocks,” in 2nd IEEE International Conference on Digital Game and Intelligent Toy Enhanced Learning, pp. 149-156, 2008.
[31] J. Zigelbaum, A. Millner, B. Desai, and H. Ishii, “BodyBeats: whole-body, musical interfaces for children,” in Extended Abstracts of Conference on Human Factors in Computing Systems, ACM Press, pp. 1595-1600, 2006.
[32] H. K. Min, “SORISU: Sound with Numbers,” in 9th International Conference on New Interfaces for Musical Expression, 2009.
[33] S. Chun, A. Hawryshkewich, K. Jung, and P. Pasquier, “Freepad: A Custom Paper-based MIDI Interface,” in 10th International Conference on New Interfaces for Musical Expression, 2010.
[34] G. Geiger, “Using the Touch Screen as a Controller for Portable Computer Music Instruments,” in 6th International Conference on New Interfaces for Musical Expression, 2006.
[35] G. Schiemer and M. Havryliv, “Pocket Gamelan: interactive mobile music performance,” in Proc. of Mobility Conference: 4th International Conference on Mobile Technology, Applications and Systems, pp. 716-719, 2007.
[36] M. Rohs, G. Essl, and M. Roth, “CaMus: Live music performance using camera phones and visual grid tracking,” in Proc. of the 6th International Conference on New Interfaces for Musical Expression, pp. 31-36, Jun. 2006.
[37] M. Rohs and G. Essl, “CaMus2: optical flow and collaboration in camera phone music performance,” in 7th International Conference on New Interfaces for Musical Expression, pp. 160-163, 2007.
[38] Apple, Inc. [Online]. Available: http://www.apple.com/iphone/. Accessed: Jun. 26, 2012.
[39] G. Weinberg, A. Beck, and M. Godfrey, “ZooZBeat: a Gesture-based Mobile Music Studio,” in 9th International Conference on New Interfaces for Musical Expression, pp. 312-315, Pittsburgh, PA, USA, 2009.
[40] String Trio. [Online]. Available: http://itunes.apple.com/app/string-trio/id342414859?mt=8. Accessed: Jun. 26, 2012.
[41] G. Wang, “Designing Smule's iPhone Ocarina,” in Proc. of the 9th International Conference on New Interfaces for Musical Expression, Pittsburgh, 2009.
[42] Leaf Trombone. [Online]. Available: http://leaftrombone.smule.com/. Accessed: Jun. 26, 2012.
[43] 林志杰, New MIDI Player's Handbook (新版MIDI玩家手冊), Third Wave Publishing, 1994 (in Chinese).
[44] MIDI events (MIDI 事件). [Online]. Available: http://www.sonicspot.com/guide/midifiles.html. Accessed: Jun. 26, 2012.
[45] GM instrument list (GM音色表). [Online]. Available: http://www.gtxs.com.tw/tech/tech_music/gm_list.htm. Accessed: Jun. 26, 2012.
[46] K. S. Arun, T. S. Huang, and S. D. Blostein, “Least-Squares Fitting of Two 3-D Point Sets,” IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 9, no. 5, pp. 698-700, 1987.
[47] 蔡嵩陽, “Real-time Hand Posture Recognition System and Its Application to Home Appliance Control” (即時手型辨識系統及其於家電控制之應用), Master's thesis, Institute of Computer Science and Information Engineering, National Central University, Taiwan, 2011 (in Chinese).
[48] 吳成柯, 戴善榮, 程湘君, and 雲立實, Digital Image Processing (數位影像處理), 儒林圖書有限公司, Taipei, Oct. 2001 (in Chinese).
[49] B. F. Wu, S. P. Lin, and C. C. Chiu, “Extracting characters from real vehicle licence plates out-of-doors,” IET Computer Vision, vol. 1, no. 1, pp. 2-10, Mar. 2007.
[50] M. K. Hu, “Visual pattern recognition by moment invariants,” IRE Transactions on Information Theory, vol. 8, no. 2, pp. 179-187, Feb. 1962.
[51] R. B. Rusu, “Semantic 3D Object Maps for Everyday Manipulation in Human Living Environments,” Ph.D. thesis, Computer Science Department, Technische Universität München, Germany, 2009.