References
[1] SOUND Foresight Ltd, [Online]. Available: http://www.soundforesight.co.uk/index.html (accessed Jun. 9, 2012).
[2] Currently Available Electronic Travel Aids for the Blind, [Online]. Available: http://www.noogenesis.com/eta/current.html (accessed Jun. 9, 2012).
[3] K. T. Song and H. T. Chen, “Cooperative Map Building of Multiple Mobile Robots,” in 6th International Conference on Mechatronics Technology, pp. 535-540, Kitakyushu, Japan, Sep. 29-Oct. 3, 2002.
[4] K. T. Song and C. Y. Lin, “Mobile Robot Control Using Stereo Vision,” in Proc. of 2001 ROC Automatic Control Conference, pp. 384-389, 2001.
[5] S. T. Tseng and K. T. Song, “Real-time Image Tracking for Traffic Monitoring,” in Proc. of the IEEE 5th International Conference on Intelligent Transportation Systems, pp. 1-6, Singapore, Sep. 3-6, 2002.
[6] S. Tachi and K. Komoriya, “Guide dog robot,” in 2nd Int. Congress on Robotics Research, pp. 333-340, Kyoto, Japan, 1984.
[7] S. Shoval, J. Borenstein, and Y. Koren, “Mobile Robot Obstacle Avoidance in a Computerized Travel Aid for the Blind,” in Proc. of the 1994 IEEE International Conference on Robotics and Automation, pp. 2023-2029, San Diego, CA, May 8-13, 1994.
[8] S. Shoval, J. Borenstein, and Y. Koren, “Mobile Robot Obstacle Avoidance in a Computerized Travel Aid for the Blind,” in Proc. of the 1994 IEEE International Conference on Robotics and Automation, pp. 2023-2029, San Diego, CA, May 8-13, 1994.
[9] J. Borenstein, “The NavBelt–A Computerized Multi-Sensor Travel Aid for Active Guidance of the Blind,” in Proc. of the Fifth Annual CSUN Conference on Technology and Persons With Disabilities, pp. 107-116, Los Angeles, California, March 21-24, 1990.
[10] S. Shoval, J. Borenstein, and Y. Koren, “The NavBelt - A Computerized Travel Aid for the Blind,” in Proc. of the RESNA ’93 Conference, pp. 240-242, Las Vegas, Nevada, June 13-18, 1993.
[11] S. Shoval and J. Borenstein, “The NavBelt - A Computerized Travel Aid for the Blind Based on Mobile Robotics Technology,” IEEE Transactions on Biomedical Engineering, vol. 45, no. 11, pp. 1376-1386, Nov. 1998.
[12] J. Borenstein and I. Ulrich, “The GuideCane - A Computerized Travel Aid for the Active Guidance of Blind Pedestrians,” in Proc. of the IEEE International Conference on Robotics and Automation, pp. 1283-1288, Albuquerque, NM, April 21-27, 1997.
[13] I. Ulrich and J. Borenstein, “The GuideCane - Applying Mobile Robot Technologies to Assist the Visually Impaired,” IEEE Trans. on Systems, Man, and Cybernetics-Part A: Systems and Humans, vol. 31, no. 2, pp. 131-136, Mar. 2001.
[14] 全球無障礙資訊網 (Global Accessible Information Network), [Online]. Available: http://www.batol.net/batol-help/article-summary.asp (accessed Jun. 9, 2012).
[15] J. Hancock, M. Hebert, and C. Thorpe, “Laser intensity-based obstacle detection,” in IEEE/RSJ International Conference on Intelligent Robots and Systems, vol. 3, pp. 1541-1546, 1998.
[16] C. Harris and M. Stephens, “A combined corner and edge detector,” in Proc. of the 4th Alvey Vision Conference, pp. 147-151, 1988.
[17] R. Hartley and P. Sturm, “Triangulation,” Computer Vision and Image Understanding, vol. 68, no. 2, pp. 146-157, 1997.
[18] B. Heisele and W. Ritter, “Obstacle detection based on color blob flow,” in Proc. Intelligent Vehicles Symposium 1995, pp. 282-286, Detroit, 1995.
[19] W. Kruger, W. Enkelmann, and S. Rossle, “Real-time estimation and tracking of optical flow vectors for obstacle detection,” in Proc. of the Intelligent Vehicles Symposium, pp. 304-309, Detroit, 1995.
[20] M. Bertozzi and A. Broggi, “GOLD: A Parallel Real-Time Stereo Vision System for Generic Obstacle and Lane Detection,” IEEE Trans. on Image Processing, vol. 7, no. 1, pp. 62-81, 1998.
[21] Q.-T. Luong, J. Weber, D. Koller, and J. Malik, “An integrated stereo-based approach to automatic vehicle guidance,” in 5th International Conference on Computer Vision, pp. 52-57, June 1995.
[22] N. Ayache and F. Lustman, “Trinocular Stereo Vision for Robotics,” IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 13, no. 1, pp. 73-85, 1991.
[23] H. Ishiguro and S. Tsuji, “Active Vision By Multiple Visual Agents,” in Proc. of the 1992 IEEE/RSJ International Conference on Intelligent Robots and Systems, vol. 3, pp. 2195-2202, 1992.
[24] M. K. Leung, Y. Liu, and T. S. Huang, “Estimating 3d vehicle motion in an outdoor scene from monocular and stereo image sequences,” in Proc. of the IEEE Workshop on Visual Motion, pp. 62-68, 1991.
[25] L. M. Lorigo, R. A. Brooks, and W. E. L. Grimson, “Visually-Guided Obstacle Avoidance in Unstructured Environments,” in IEEE Conference on Intelligent Robots and Systems, pp. 373-379, Sep. 1997.
[26] J. Hightower and G. Borriello, “Location systems for ubiquitous computing,” IEEE Computer, vol. 34, no. 8, pp. 57-66, Aug. 2001.
[27] M. Mauve, A. Widmer, and H. Hartenstein, “A survey on position-based routing in mobile ad hoc networks,” IEEE Network, vol. 15, issue 6, pp. 30-39, Nov.-Dec. 2001.
[28] G. Sun, J. Chen, W. Guo, and K. J. R. Liu, “Signal processing techniques in network-aided positioning: a survey of state-of-the-art positioning designs,” IEEE Signal Processing Magazine, vol. 22, issue 4, pp. 12-23, July 2005.
[29] M. S. Uddin and T. Shioyama, “Detection of Pedestrian Crossing Using Bipolarity Feature - An Image-Based Technique,” IEEE Trans. on Intelligent Transportation Systems, vol. 6, no. 4, pp. 439-445, Dec. 2005.
[30] T. Shioyama, H. Wu, N. Nakamura, and S. Kitawaki, “Measurement of the length of pedestrian crossings and detection of traffic lights from image data,” Measurement Science and Technology, vol. 13, no. 9, pp. 1450-1457, Sep. 2002.
[31] S. Se, “Zebra-crossing detection for the partially sighted,” in Proc. Computer Vision and Pattern Recognition, pp. 211–217, 2000.
[32] V. Ivanchenko, J. Coughlan, and H. Shen, “Crosswatch: a camera phone system for orienting visually impaired pedestrians at traffic intersections,” in 11th International Conference on Computers Helping People with Special Needs, Linz, Austria, July 2008.
[33] V. Ivanchenko, J. Coughlan, and H. Shen, “Detecting and Locating Crosswalks using a Camera Phone,” in IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, 2008.
[34] Kinect for Windows, [Online]. Available: http://www.microsoft.com/en-us/kinectforwindows/ (accessed Jun. 9, 2012).
[35] Kinect for Windows SDK, [Online]. Available: http://msdn.microsoft.com/en-us/library/hh855347 (accessed Jun. 9, 2012).
[36] Microsoft Kinect somatosensory game device full disassembly report (Microsoft XBOX), [Online]. Available: http://www.waybeta.com/news/58230/microsoft-kinect-somatosensory-gamedevice-full-disassembly-report-_microsoft-xbox (accessed Jun. 9, 2012).
[37] L. Gallo, A. P. Placitelli, and M. Ciampi, “Controller-free exploration of medical image data: Experiencing the Kinect,” in Proc. of the 24th IEEE International Symposium on Computer-Based Medical Systems, Los Alamitos, CA, June 27–30, 2011.
[38] I. Oikonomidis, N. Kyriazis, and A. Argyros, “Efficient model-based 3D tracking of hand articulations using Kinect,” in Proc. of the British Machine Vision Conference, vol. 2, Aug. 2011.
[39] Kinect Enabled Autonomous Mini Robot Car Navigation, [Online]. Available: http://www.ubergizmo.com/2010/12/kinect-enabled-autonomous-mini-robot-car-navigation (accessed Jun. 9, 2012).
[40] 孫中麒, “A Low-Cost Guidance System for the Blind” (低價位之導盲系統), Master’s thesis, Institute of Computer Science and Information Engineering, National Central University, Taiwan, 2005.
[41] H. Bay, A. Ess, T. Tuytelaars, and L. Van Gool, “Speeded-Up Robust Features (SURF),” Computer Vision and Image Understanding, vol. 110, no. 3, pp. 346-359, June 2008.
[42] C. Harris and M. Stephens, “A combined corner and edge detector,” in Proc. of the Alvey Vision Conference, pp. 147-151, 1988.
[43] D. G. Lowe, “Distinctive Image Features from Scale-Invariant Keypoints,” International Journal of Computer Vision, vol. 60, no. 2, pp. 91-110, 2004.
[44] K. S. Arun, T. S. Huang, and S. D. Blostein, “Least-Squares Fitting of Two 3-D Point Sets,” IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 9, no. 5, pp. 698-700, 1987.
[45] Determining yaw, pitch, and roll from a rotation matrix, [Online]. Available: http://planning.cs.uiuc.edu/node103.html (accessed Jun. 15, 2012).
[46] 吳成柯, 戴善榮, 程湘君, and 雲立實, Digital Image Processing (數位影像處理), 儒林出版社, Taipei, 1993.
[47] N. Otsu, “A threshold selection method from gray-level histograms,” IEEE Trans. Syst., Man, Cybern., vol. 9, no. 1, pp. 62-66, 1979.
[48] M. C. Su, Y. Z. Hsieh, and Y. X. Zhao, “A Simple Approach to Stereo Matching and Its Application in Developing a Travel Aid for the Blind,” in The 11th International Conf. on Fuzzy Theory and Technology, pp. 1228-1231, Kaohsiung, Taiwan, Oct. 8-11, 2006.
[49] M. C. Su, Y. Z. Hsieh, D. Y. Huang, Y. X. Zhao, and C. C. Sun, “A Vision-Based Travel Aid for the Blind,” in Pattern Recognition Theory and Application, E. A. Zoeller, Ed., pp. 73-89, Nova Science Publishers, New York, 2008.
[50] 導盲機械犬 (Robotic guide dog), [Online]. Available: http://www.robonable.jp/news/2011/10/nsk-1024.html (accessed Jul. 6, 2012).
[51] S. Thrun, W. Burgard, and D. Fox, Probabilistic Robotics, The MIT Press, 2005.