References
[1] Y. Ando and S. Yuta, “Following a Wall by an Autonomous Mobile Robot with a Sonar-Ring,” Proc. IEEE International Conference on Robotics and Automation, vol. 4, pp. 2599-2606, 1995.
[2] N. Ayache and F. Lustman, “Trinocular Stereo Vision for Robotics,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 13, no. 1, pp. 73-85, 1991.
[3] P. Bahl and V. N. Padmanabhan, “RADAR: An In-Building RF-Based User Location and Tracking System,” Proc. IEEE INFOCOM 2000, Nineteenth Annual Joint Conference of the IEEE Computer and Communications Societies, vol. 2, pp. 775-784, March 2000.
[4] M. Bertozzi and A. Broggi, “GOLD: A Parallel Real-Time Stereo Vision System for Generic Obstacle and Lane Detection,” IEEE Transactions on Image Processing, vol. 7, no. 1, pp. 62-81, 1998.
[5] J. Borenstein and Y. Koren, “Real-Time Obstacle Avoidance for Fast Mobile Robots,” IEEE Transactions on Systems, Man, and Cybernetics, vol. 19, no. 5, pp. 1179-1187, 1989.
[6] R. Cassinis, D. Grana, and A. Rizzi, “Using Colour Information in an Omnidirectional Perception System for Autonomous Robot Localization,” Proceedings of the First Euromicro Workshop on Advanced Mobile Robots, pp. 172-176, Oct. 1996.
[7] C. T. Chang, “Design of Obstacle Avoidance and Navigation Strategies for an Automatic Guided Vehicle Using Camera Vision and Infrared Sensing,” Master's Thesis, Electrical Engineering, N.T.U.T., 2001.
[8] A. Clerentin, L. Delahoche, and E. Brassart, “Cooperation between two omnidirectional perception systems for mobile robot localization,” IEEE/RSJ International Conference on Intelligent Robots and Systems, vol. 2, pp. 1499-1504, Nov. 2000.
[9] S. Ernst, C. Stiller, J. Goldbeck, and C. Roessig, “Camera calibration for lane and obstacle detection,” Proc. IEEE/IEEJ/JSAI International Conference on Intelligent Transportation Systems, pp. 356-361, Oct. 1999.
[10] E. Frontoni and P. Zingaretti, “A vision based algorithm for active robot localization,” Proc. IEEE International Symposium on Computational Intelligence in Robotics and Automation, pp. 347-352, June 2005.
[11] R. C. Gonzalez and R. E. Woods, Digital Image Processing, 2nd ed., Prentice Hall, 2002.
[12] H. Haddad, M. Khatib, S. Lacroix, and R. Chatila, “Reactive navigation in outdoor environments using potential fields,” Proc. IEEE International Conference on Robotics and Automation, vol. 2, pp. 1232-1237, May 1998.
[13] Y. Han and H. Hahn, “Localization and Classification of Target Surfaces Using Two Pairs of Ultrasonic Sensors,” Robotics and Autonomous Systems (Elsevier), vol. 1, pp. 31-41, 2000.
[14] H. Ishiguro and S. Tsuji, “Active Vision by Multiple Visual Agents,” Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems, vol. 3, pp. 2195-2202, 1992.
[15] G. Jang, S. Kim, J. Kim, and I. Kweon, “Metric localization using a single artificial landmark for indoor mobile robots,” IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2857-2862, Aug. 2005.
[16] M. R. Kabuka and A. E. Arenas, “Position Verification of a Mobile Robot Using Standard Pattern,” IEEE Journal of Robotics and Automation, vol. 3, no. 6, pp. 505-516, Dec. 1987.
[17] E. Kruse and F. M. Wahl, “Camera-Based Observation of Obstacle Motions to Derive Statistical Data for Mobile Robot Motion Planning,” Proc. IEEE International Conference on Robotics and Automation, vol. 1, pp. 662-667, May 1998.
[18] K. Lawton and E. Shrecengost, “The Sony AIBO: Using IR for Maze Navigation,” Tekkotsu Homepage, Available: http://www.cs.cmu.edu/~tekkotsu/index.html.
[19] Y. W. Lin, “Research on Robot Map Building with an Ultrasonic Sensor,” Master's Thesis, Department of Engineering Science, N.C.K.U., 2004.
[20] L. M. Lorigo, R. A. Brooks, and W. E. L. Grimson, “Visually-Guided Obstacle Avoidance in Unstructured Environments,” Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems, vol. 1, pp. 373-379, Sep. 1997.
[21] Q. T. Luong, J. Weber, D. Koller, and J. Malik, “An Integrated Stereo-Based Approach to Automatic Vehicle Guidance,” Proc. Fifth International Conference on Computer Vision, pp. 52-57, June 1995.
[22] Y. Matsumoto, M. Inaba, and H. Inoue, “Visual navigation using view-sequenced route representation,” Proc. IEEE International Conference on Robotics and Automation, vol. 1, pp. 83-88, Apr. 1996.
[23] L. Montano and J. R. Asensio, “Real-time robot navigation in unstructured environments using a 3D laser rangefinder,” Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems, vol. 2, pp. 526-532, 1997.
[24] A. Ohya, A. Kosaka, and A. Kak, “Vision-Based Navigation by a Mobile Robot with Obstacle Avoidance Using Single-Camera Vision and Ultrasonic Sensing,” IEEE Transactions on Robotics and Automation, vol. 14, no. 6, pp. 969-978, Dec. 1998.
[25] E. M. Petriu, “Automated Guided Vehicle with Absolute Encoded Guide-path,” IEEE Transactions on Robotics and Automation, vol. 7, no. 4, pp. 562-565, Aug. 1991.
[26] C. C. Sun, “A Low-Cost Travel-Aid for the Blind,” Master's Thesis, Department of Computer Science and Information Engineering, N.C.U., 2005.
[27] C. Thorpe, M. H. Hebert, T. Kanade, and S. A. Shafer, “Vision and Navigation for the Carnegie-Mellon Navlab,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 10, no. 3, pp. 362-373, May 1988.
[28] P. Veelaert and W. Bogaerts, “Ultrasonic Potential Field Sensor for Obstacle Avoidance,” IEEE Transactions on Robotics and Automation, vol. 15, no. 4, Aug. 1999.
[29] M. A. Youssef, A. Agrawala, and A. U. Shankar, “WLAN Location Determination via Clustering and Probability Distributions,” Proceedings of the First IEEE International Conference on Pervasive Computing and Communications, pp. 143-150, March 2003.
[30] iRobot Corporation, Available: http://www.irobot.com/
[31] SECOM IS Lab. - Service Robot Group, Available: http://www.secom.co.jp/isl/e/org/CTD/srg/index.html
[32] Sony Global - AIBO Global Link, Available: http://www.sony.net/Products/aibo/
[33] 蘇木春 and 張孝德, Machine Learning: Neural Networks, Fuzzy Systems, and Genetic Algorithms (in Chinese), 全華, 1999.
[34] 蔡明志, Data Structures Using C++ (in Chinese), 碁峰資訊, 1999.