Master's/Doctoral Thesis 92522029: Detailed Record




Author: Chung-Chi Sun (孫中麒)    Department: Computer Science and Information Engineering
Thesis Title: A Low-Cost Travel-Aid for the Blind (低價位之導盲系統)
Related theses
★ A Q-learning-based swarm intelligence algorithm and its applications
★ Development of a rehabilitation system for children with developmental delays
★ Comparing teacher assessment and peer assessment from the perspective of cognitive style: from English writing to game design
★ A prediction model for diabetic nephropathy based on laboratory test values
★ Design of a remote-sensing image classifier based on fuzzy neural networks
★ A hybrid clustering algorithm
★ Development of assistive devices for people with disabilities
★ A study of fingerprint classifiers
★ A study of backlit image compensation and color quantization
★ Application of neural networks to the selection of business income tax audit cases
★ A new online learning system and its application to tax audit case selection
★ An eye-tracking system and its applications to human-machine interfaces
★ A study of data visualization combining swarm intelligence and self-organizing maps
★ Development of a pupil-tracking system for human-machine interface applications for the physically disabled
★ An artificial-immune-system-based online-learning neuro-fuzzy system and its applications
★ Application of genetic algorithms to speech descrambling
  1. This electronic thesis has been approved for immediate open access.
  2. The open-access electronic full text is licensed to users only for personal, non-commercial searching, reading, and printing for the purpose of academic research.
  3. Please observe the relevant provisions of the Copyright Act of the Republic of China; do not reproduce, distribute, adapt, repost, or broadcast the work without authorization.

Abstract (Chinese) The most common travel aid used by visually impaired people today is the white cane, with which the user taps the ground or nearby objects to confirm whether the path ahead is walkable and to locate obstacles; whatever lies beyond the cane's reach remains unknown. Another option is the guide dog, but compared with Taiwan's roughly fifty thousand blind people, guide dogs remain extremely scarce, because selecting and training them is difficult and costly and their working lives are limited, so their use has never become widespread. What is needed now is an electronic guidance system that can assist visually impaired people and improve their daily lives effectively and in real time; developing a convenient, low-cost travel aid is therefore both important and urgent. Most travel aids in use today rely on ultrasonic sensing to detect objects, but they generally require the user to actively probe for obstacles rather than having the aid itself sense the surroundings and report information relevant to walking. As a result, the user is guided around an obstacle only upon encountering it and has no ability to choose a walking path independently.
This thesis proposes an image-processing approach that segments the floor region extending from the user's feet and then, through a transformation between image coordinates and real-world coordinates, obtains the width and length of the walkable path ahead. This path information is what visually impaired users need most urgently, because it answers in real time their question: "How far does the walkable path extend, and how wide is it?" With this preliminary work, current travel aids can be turned from passive into active devices, giving visually impaired people greater freedom of movement.
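To make the segmentation step concrete, the following is a minimal illustrative sketch (not the implementation described in the thesis) of region growing from seed pixels along the bottom of the frame, where the floor just in front of the user's feet is assumed to appear; the HSV conversion, the seed placement, and the fixed distance threshold are assumptions chosen for illustration.

    # Illustrative sketch only: floor segmentation by region growing from
    # bottom-row seed pixels. OpenCV/NumPy usage and the threshold value
    # are assumptions, not the thesis implementation.
    from collections import deque
    import numpy as np
    import cv2

    def grow_floor_region(bgr_image, threshold=12.0):
        """Return a boolean mask of the floor region grown from bottom-row seeds."""
        hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV).astype(np.float32)
        height, width = hsv.shape[:2]
        mask = np.zeros((height, width), dtype=bool)

        # Seed pixels: samples along the bottom row, assumed to show the floor
        # directly in front of the user's feet.
        seeds = [(height - 1, col) for col in range(0, width, 4)]
        seed_mean = np.mean([hsv[r, c] for r, c in seeds], axis=0)

        queue = deque(seeds)
        for r, c in seeds:
            mask[r, c] = True
        while queue:
            r, c = queue.popleft()
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < height and 0 <= nc < width and not mask[nr, nc]:
                    # Grow into a neighbour whose colour is close to the seed mean.
                    if np.linalg.norm(hsv[nr, nc] - seed_mean) < threshold:
                        mask[nr, nc] = True
                        queue.append((nr, nc))
        return mask

The thesis itself sets the growth threshold automatically (Section 3.2.5 in the table of contents below); the fixed threshold here is purely for illustration.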
Abstract (English) The most widely used travel aid for the blind is the white cane, which only allows them to confirm that the path ahead is clear and to locate possible obstacles by touching the ground and objects with it; the world beyond its reach remains unknown. The other option for a blind person to get around is a guide dog. However, the blind population in Taiwan far outnumbers the available guide dogs: guide-dog training programs are highly selective as to the dogs to be trained, training costs a great deal of time and money, and a dog's working lifespan is limited, so guide dogs are not widely used. It is therefore crucial to develop an easy-to-use, low-cost electronic guidance system to improve the lives of the visually disabled. Most existing guidance aids use ultrasonic sensors to detect objects, but instead of the aid actively sensing the surroundings, the user must probe for obstacles and then avoid them; such tools provide only passive guidance, so users are not able to choose a path of their own. In this thesis we propose a travel aid that requires only a low-cost web camera and an accelerometer circuit. First, a segmentation algorithm extracts the floor region from the image captured by the web camera. Then, using the tilt angle measured by the accelerometer and a coordinate-transformation formula, the system obtains the width and length of the walkable floor region. These two pieces of information are the most critical for a blind person while walking, as they answer the most urgent question: "How wide is the road, and how far can I go?" This research explores ways to help the blind navigate more actively and move about as if they could see. Experiments were conducted to evaluate the performance of the proposed travel aid.
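As a rough illustration of the coordinate transformation mentioned above, the sketch below maps an image pixel to a point on a flat ground plane given the camera height and the downward tilt angle reported by the accelerometer. It is only a flat-ground projection under assumed parameters (camera height, focal length, principal point, and the example tilt angle); it is not the formula derived in the thesis.

    # Illustrative flat-ground projection: the camera height, focal length,
    # principal point, and tilt angle below are assumed example values.
    import math

    def pixel_to_ground(u, v, tilt_deg, cam_height_m=1.3,
                        focal_px=500.0, cx=160.0, cy=120.0):
        """Map pixel (u, v) to (lateral, forward) ground coordinates in metres.

        tilt_deg is the camera's downward pitch, e.g. as measured by an
        accelerometer; the ground is modelled as a flat horizontal plane.
        """
        theta = math.radians(tilt_deg)
        x_c = (u - cx) / focal_px      # normalised horizontal image coordinate
        y_c = (v - cy) / focal_px      # normalised vertical coordinate (down is +)
        denom = math.sin(theta) + y_c * math.cos(theta)
        if denom <= 0:
            return None                # the pixel's ray never meets the ground
        forward = cam_height_m * (math.cos(theta) - y_c * math.sin(theta)) / denom
        lateral = cam_height_m * x_c / denom
        return lateral, forward

    # Example: with a 45-degree tilt, a pixel near the bottom centre of a
    # 320x240 frame maps to a floor point a bit less than a metre ahead.
    print(pixel_to_ground(160, 230, tilt_deg=45))

Applying such a mapping to the boundary pixels of the segmented floor region would yield width and length figures of the kind the abstract describes reporting to the user.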
Keywords (Chinese) ★ 立體視覺 (stereo vision)
★ 區域增長 (region growing)
★ 色彩空間 (color space)
★ 座標轉換 (coordinate transformation)
Keywords (English) ★ color space
★ region growing
★ stereo vision
★ coordinate transformation
Table of Contents    Contents VI
List of Tables VIII
Chapter 1  Introduction 1
1.1 Motivation 1
1.2 Research Objectives 2
1.3 Thesis Organization 2
Chapter 2  Travel Aids for the Blind and Related Work 3
2.1 Electronic Travel Aids 3
2.2 Guide Robots 4
2.3 Wearable Aids 5
2.4 Guide Canes 6
2.5 Electronic Chip and Artificial Retina Implants 7
2.6 Discussion of Travel Aids 7
Chapter 3  Methods and Procedures 10
3.1 Color Space Conversion 11
3.1.1 The RGB Color Space 11
3.1.2 The HSV Color Space 12
3.1.3 Discussion of the RGB and HSV Color Spaces 13
3.2 Floor Image Segmentation 14
3.2.1 Image Quantization 14
3.2.2 Using the Mean and Variance 15
3.2.3 Template Matching 17
3.2.4 Floor Region Growing 19
3.2.5 Automatic Threshold Selection for Region Growing 21
3.3 Walking Distance Estimation 23
3.3.1 Transformation between World and Image Coordinates 24
3.3.2 Floor Boundary Estimation 29
3.4 System Integration 32
3.4.1 Hardware Environment 32
3.4.2 System Operation Flow 34
Chapter 4  Experimental Results and Discussion 37
4.1 Floor Region Growing on Real Images 37
4.2 Distance Estimation on Real Images 41
4.3 Experiments and Discussion 43
Chapter 5  Conclusions and Future Work 48
5.1 Conclusions 48
5.2 Future Research Directions 49
References 51
References [1] A. Albert, M. Suppa, and W. Gerth, “Detection of Stair Dimensions for the Path Planning of a Bipedal Robot,” IEEE/ASME International Conference on Advanced Intelligent Mechatronics Proceedings, pp. 8-12. Italy, July 2001.
[2] N. Ayache and F. Lustman, “Trinocular Stereo Vision for Robotics,” IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 13, No. 1, pp. 73-85. 1991.
[3] J. Borenstein, “The NavBelt - A Computerized Multi-Sensor Travel Aid for Active Guidance of the Blind,” Proceedings of the Fifth Annual CSUN Conference on Technology and Persons With Disabilities, pp. 107-116. Los Angeles, California, March 21-24, 1990.
[4] J. Borenstein and I. Ulrich, “The GuideCane - A Computerized Travel Aid for the Active Guidance of Blind Pedestrians,” Proceedings of the IEEE International Conference on Robotics and Automation, pp. 1283-1288. Albuquerque, NM, April 21-27, 1997.
[5] M. Bertozzi and A. Broggi, “GOLD: A Parallel Real-Time Stereo Vision System for Generic Obstacle and Lane Detection,” IEEE Transactions on Image Processing, Vol. 7, No. 1, pp. 62-81. 1998.
[6] J. Brabyn, “New developments in mobility and orientation aids for the blind,” IEEE Transactions on Biomedical Engineering, Vol. 29, pp. 285-289. April 1982.
[7] J. Brabyn, “Orientation and navigation systems for the blind: Overview of different approaches,” Hatfield Conference on Orientation and Navigation Systems for Blind Persons, Hatfield, England. 1995.
[8] Currently Available Electronic Travel Aids for the Blind. (2005). http://www.noogenesis.com/eta/current.html
[9] R. G. Golledge, J. R. Marston, and C. M. Costanzo, “Attitudes of Visually Impaired Persons Toward the Use of Public Transportation.” Journal of Visual Impairment & Blindness, pp. 446-459. September -October 1997.
[10] J. Hancock, M. Hebert, and C. Thorpe, “Laser intensity-based obstacle detection,” Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Vol. 3, pp. 1541-1546. 1998.
[11] C. Harris and M. Stephens, “A combined corner and edge detector,” Proceedings of the 4th Alvey Vision Conference, pp. 147-151. 1988.
[12] R. Hartley and P. Sturm, “Triangulation,” Computer Vision and Image Understanding, Vol.68 , No 2, pp. 146-157. 1997.
[13] B. Heisele and W. Ritter, “Obstacle detection based on color blob flow,” Proceedings Intelligent Vehicles Symposium 1995, pp. 282-286. Detroit, 1995.
[14] H. Ishiguro and S. Tsuji, “Active Vision By Multiple Visual Agents,” Proceedings of the 1992 IEEE/RSJ International Conference on Intelligent Robots and Systems, Vol. 3, pp. 2195-2202. 1992.
[15] W. Kruger, W. Enkelmann, and S. Rossle, “Real-time estimation and tracking of optical flow vectors for obstacle detection,” Proceedings of the Intelligent Vehicles Symposium, pp. 304-309. Detroit, 1995.
[16] L. M. Lorigo, R. A. Brooks, and W. E. L. Grimson, “Visually-Guided Obstacle Avoidance in Unstructured Environments,” IEEE Conference on Intelligent Robots and Systems, pp. 373-379. Sep. 1997.
[17] Q.-T. Luong, J. Weber, D. Koller, and J. Malik, “An integrated stereo-based approach to automatic vehicle guidance,” 5th International Conference on Computer Vision, pp. 52-57. June 1995.
[18] M. K. Leung, Y. Liu, and T. S. Huang, “Estimating 3d vehicle motion in an outdoor scene from monocular and stereo image sequences,” Proceedings of the IEEE Workshop on Visual Motion, pp. 62-68. 1991.
[19] J. M. Loomis, R. G. Golledge, and R. L. Klatzky, “Personal Guidance system for the visually impaired using GPS, GIS, and VR technologies,” Conference on Virtual Reality and Persons with Disabilities, pp. 71-74. San Francisco, June 17-18, 1993.
[20] J. M. Loomis, R. G. Golledge, and R. L. Klatzky, “Personal guidance system for blind persons,” Hatfield Conference on Orientation and Navigation Systems for Blind Persons, Hatfield, England. February 1-2, 1995.
[21] SOUND Foresight Ltd, (2005). http://www.soundforesight.co.uk/index.html
[22] C. Sun, and W. G. Wee, “Neighboring Gray Level Dependence Matrix for Texture Classification,” Computer Vision, Graphics and Image Processing, Vol. 23, pp. 341-352. 1982.
[23] J. Sklansky, “Image Segmentation and Feature Extraction,” IEEE Transactions on Systems, Man, and Cybernetics, Vol. 8, pp. 238-247. 1978.
[24] K. T. Song and H. T. Chen, “Cooperative Map Building of Multiple Mobile Robots,” 6th International Conference on Mechatronics Technology, pp.535-540. Kitakyushu, Japan, Sep. 29-Oct. 3, 2002.
[25] K. T. Song and C. M. Lee, “Development of an Image Processing Card and Its Application to Mobile Manipulation,” Proceedings of 2002 ROC Automatic Control Conference, pp. 819-824. Tainan, Mar. 15-16, 2002.
[26] K. T. Song and Y. H. Chen, “Robot Control in Dynamic Environments Using a Fuzzy Clustering Network,” Proceedings of First IEEE-RAS International Conference on Humanoid Robots, MIT, Cambridge, Sep. 7-8, 2000.
[27] K. T. Song and C. Y. Lin, “Mobile Robot Control Using Stereo Vision,” Proceedings of 2001 ROC Automatic Control Conference, pp. 384-389. 2001.
[28] S. Shoval, J. Borenstein, and Y. Koren, “Mobile Robot Obstacle Avoidance in a Computerized Travel Aid for the Blind,” Proceedings of the 1994 IEEE International Conference on Robotics and Automation, pp. 2023-2029. San Diego, CA, May 8-13, 1994.
[29] S. Shoval, J. Borenstein, and Y. Koren, “The Navbelt - A Computerized Travel Aid for the Blind,” Proceedings of the RESNA '93 conference, pp. 240-242. Las Vegas, Nevada, June 13-18, 1993.
[30] S. Shoval, J. Borenstein, and Y. Koren, “Auditory Guidance With the NavBelt - A Computerized Travel Aid for the Blind,” IEEE Transactions on Systems, Man, and Cybernetics, Vol. 28, No. 3, pp. 459-467. August, 1998.
[31] S. Shoval, and J. Borenstein, “The NavBelt – A Computerized Travel Aid for the Blind on Mobile Robotics Technology,” IEEE Transactions on Biomedical Engineering, Vol. 45, No. 11, pp. 107-116. Nov. 1998.
[32] S. T. Tseng and K. T. Song, “Real-time Image Tracking for Traffic Monitoring,” Proceedings of the IEEE 5th International Conference on Intelligent Transportation Systems, pp. 1-6. Singapore, Sep. 3-6, 2002.
[33] S. Tachi and K. Komoriya, “Guide dog robot,” 2nd Int. Congress on Robotics Research, pp. 333-340. Kyoto, Japan, 1984.
[34] I. Ulrich and J. Borenstein, “The GuideCane - Applying Mobile Robot Technologies to Assist the Visually Impaired,” IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, Vol. 31, No. 2, pp. 131-136. Mar. 2001.
[35] J. S. Weszka, C. R. Dyer, and A. Rosenfeld, “A Comparative Study of Texture Measures for Terrain Classification,” IEEE Transactions on Systems, Man, and Cybernetics, Vol. SMC-6, No. 4, pp. 269-285. 1976.
[36] L. Wang, and D. C. He, “A New Statistical Approach for Texture Analysis,” Photogrammetric Engineering & Remote Sensing, Vol. 56, No. 1, pp. 61-66. 1990.
[37] Y. Xu, E. Saber, and A. M. Tekalp, “Object Segmentation and Labeling by Learning from Examples,” IEEE Transactions on Image Processing, pp. 627-638. 2003.
[38] 史天元 and 曾義星, “三維雷射掃描儀” (Three-Dimensional Laser Scanners), 科學發展 (Science Development), No. 365, pp. 16-21, May 2003.
[39] 台中市世界聯合保護動物協會, (2005) [Online], http://www.lovedog.org.tw/
[40] 台灣導盲犬協會 (Taiwan Guide Dog Association), (2005) [Online], http://www.guidedog.org.tw
[41] 全球無障礙資訊網, (2005) [Online], http://www.batol.net/batol-help/article-summary.asp
[42] 國際導盲犬聯盟 (International Guide Dog Federation), (2005) [Online], http://www.ifgdsb.org.uk/default.asp
Advisor: Mu-Chun Su (蘇木春)    Date of Approval: 2005-07-15
