Abstract (English)
The goal of this study is to design an intelligent humanoid robot that can not only walk forward and backward, turn left and right, walk sideways, squat down, stand up, and bow smoothly, but also imitate human motions. The robot's structure consists of 17 AI motors connected by self-designed acrylic boards. Like a human, it has two hands, two feet, and a head. The head is a web camera that serves as the robot's eye. The eye can recognize color marks pasted on the human body against any complex background. From the relative positions of those marks, the robot recognizes the human's motion and imitates it in real time. The imitated motions include various hand motions and lower-body motions such as "stand up," "squat down," and "stand on one foot." Furthermore, the robot can also imitate the human's forward, backward, and sideways walking. While the robot is moving, the marks pasted on the human may fall outside the robot's field of view; therefore, the eye can also search for the marks automatically. AI-1001 motors serve as the actuators of the robot, and the rotation angle of each motor is commanded over the RS-232 protocol. The control strategies include torque control, linear interpolation, trajectory generation, a finite state machine, and center-of-gravity compensation. It should be emphasized that maintaining the stability and balance of the robot while it walks, moves, and imitates motions is the main achievement of this work. Moreover, using color marks to guide the robot to imitate human motions (including two- and three-dimensional motions) is the greatest originality of our work. This thesis is a good demonstration of the study of human-robot interaction.
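As a rough illustration of the "linear interpolation" control strategy mentioned above, the following C++ sketch interpolates the 17 joint angles between two keyframe poses and sends each intermediate set-point to the motors. The actual AI-1001 RS-232 packet format is not described in this abstract, so sendJointCommand() is a hypothetical placeholder for that serial write, and the joint indices and angles are illustrative only.

```cpp
// Minimal sketch of keyframe-based joint control via linear interpolation.
// sendJointCommand() is a hypothetical stand-in for the RS-232 command that
// sets one AI-1001 motor's angle; the real packet format is not shown here.
#include <algorithm>
#include <array>
#include <chrono>
#include <cstdio>
#include <thread>

constexpr int kNumJoints = 17;               // 17 AI motors in the robot
using Pose = std::array<double, kNumJoints>; // joint angles in degrees

// Placeholder for the serial-port write that commands one motor.
void sendJointCommand(int jointId, double angleDeg) {
    std::printf("motor %2d -> %7.2f deg\n", jointId, angleDeg);
}

// Move smoothly from `from` to `to` over `durationMs`, sending linearly
// interpolated set-points every `stepMs` milliseconds.
void moveLinear(const Pose& from, const Pose& to, int durationMs, int stepMs = 20) {
    const int steps = std::max(1, durationMs / stepMs);
    for (int i = 1; i <= steps; ++i) {
        const double t = static_cast<double>(i) / steps;  // 0 -> 1
        for (int j = 0; j < kNumJoints; ++j) {
            const double angle = from[j] + t * (to[j] - from[j]);
            sendJointCommand(j, angle);
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(stepMs));
    }
}

int main() {
    Pose stand{};          // all joints at 0 degrees (placeholder posture)
    Pose squat{};
    squat[3] = 45.0;       // hypothetical hip/knee angles for a squat keyframe
    squat[4] = -45.0;
    moveLinear(stand, squat, 1000);  // one-second transition to the squat pose
}
```

Sending many small interpolated set-points rather than one large step is what keeps the transitions between keyframe postures smooth; the same loop structure could be driven by a finite state machine that selects the next keyframe.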