Thesis 111322085: Detailed Record




Name: Syuan-Tsi Tsao (曹軒慈)    Department: Civil Engineering
Thesis title: Application of a Tiny Machine Learning-Based Intelligent Obstacle Avoidance System in an Autonomous Mobile Robot for Exterior Wall Inspection
Related theses
★ Structural Assessment of Water-Contact Structures Using Smart Tags and Data-Driven Methods
★ Development of a Smart Cable Monitoring System Based on a Low-Power Embedded System and High-Precision MEMS Sensors
★ Sensor Code-based Smart Tag Embedded in Concrete for Seepage Sensing Caused by Cracks
★ A Study of a Smart Home Robot for Automatic Post-Earthquake Inspection and Emergency Response
★ Development of an IoT Architecture Integrating UAV, LoRa, and Magnetic-Induction Wake-Up Technology
★ Development and Validation of a Steel-Structure Climbing Robot Based on Magnetic Adhesion and Omnidirectional Wheels
★ Development of a ROS-Based Remote Automatic Multi-Bolt Inspection Robot System
★ A Study of an Intelligent BIM Information Search and Question-Answering System Based on the BERT Semantic Analysis Model
★ Development of a Smart Structural Inspection System Based on BIM and Wireless Wake-Up IoT Devices
★ Real-Time Detection of Hollow Defects in Indoor Floor Tiles Using Tiny Machine Learning and Microcontrollers
★ Rapid Determination of Concrete Crack Locations by Combining Smart Sensing Tags and Support Vector Machines
★ Design Analysis and Implementation Validation of a Highly Maneuverable Inchworm Climbing Robot for Steel-Structure Inspection
★ Development of an Automatic Concrete Defect Repair Robot
★ Development of a Wireless Vibration Measurement Device with Edge Computing for Detecting Characteristic Frequencies of Bridge Cables
★ A Study on Identifying the Likelihood of Rebar Corrosion Inside Concrete by Combining Smart Sensing Tags and Machine Learning
★ A Study of a Lightweight Real-Time Multi-Object Bolt-Defect Image Detection System for Climbing Inspection Robots
Files: browse the full thesis in the system (available after 2026-6-30)
Abstract (Chinese): This study proposes GLEWBOT-VISION, a robot capable of AI-based autonomous mobile obstacle avoidance built on tiny machine learning. The robot addresses the problem of suction failure when larger tile defects are encountered. The GLEWBOT-VISION system applies the FOMO visual recognition model to a miniature camera, so that as soon as a tile defect is detected the robot executes the corresponding avoidance maneuver. Thanks to the lightweight FOMO model, the system detects tile defects of different positions and shapes in real time.
To verify the system's effectiveness, multiple sets of experiments were conducted, including tests of the visual recognition module and tests under actual application conditions. The results show an overall model accuracy of 95%, and the accuracy when detecting in different directions on a real wall surface also reaches 95%. Under different camera coverage of the defect, the accuracy is 80% when only 25% of the defect area is covered, 90% at half coverage, and 95% at full coverage. Subsequent experiments demonstrate the system's practical performance on wall surfaces.
The novelty of this study lies in combining tiny machine learning, FOMO visual recognition, and audio analysis to realize autonomous mobile obstacle avoidance for GLEWBOT-VISION. This not only reduces the likelihood of suction failure when GLEWBOT-VISION adheres over a tile defect, but also reduces reliance on professional technicians and lowers labor costs. The system design also considers convenience and flexibility for field applications and can detect targets of different positions and shapes. In the future, the system can be extended to other types of intelligent mobile devices, offering broad development potential and application value.
Abstract (English): This study presents GLEWBOT-VISION, an AI-driven autonomous obstacle-avoidance robot system that integrates the GLEWBOT platform with the FOMO visual recognition model and tiny machine learning (TinyML). It addresses GLEWBOT's suction failure on defective tiles by using the FOMO model with a miniature camera for real-time defect detection and obstacle avoidance. The lightweight FOMO model effectively detects tile defects of various positions and shapes.
Multiple experiments validated the system's effectiveness, showing 95% overall accuracy that held up in real-world application. Accuracy varied with camera coverage of the defect: 80% at 25% coverage, 90% at half coverage, and 95% at full coverage. Subsequent tests confirmed the system's real-world performance on wall surfaces.
This study combines tiny machine learning, FOMO visual recognition, and audio analysis to enable autonomous obstacle avoidance for GLEWBOT-VISION. It reduces suction failures, lowers reliance on skilled technicians, and cuts labor costs, while the design provides convenience and flexibility for field applications. The system also has potential for further development and application in other intelligent mobile devices.
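To make the detect-then-avoid behaviour described in the abstracts concrete, the following is a minimal sketch of how centroid-style FOMO detections could be mapped to an avoidance command. It is not code from the thesis: the Detection structure, the confidence threshold, and the motion commands (MOVE_FORWARD, SIDESTEP_LEFT, SIDESTEP_RIGHT) are hypothetical placeholders, and the only property borrowed from FOMO itself is that it reports per-object centroids rather than full bounding boxes.

```python
# Minimal sketch (not from the thesis): mapping FOMO-style centroid detections to an
# avoidance command. The detection format, threshold, and motion commands are
# hypothetical placeholders standing in for GLEWBOT-VISION's real firmware.

from dataclasses import dataclass
from typing import List


@dataclass
class Detection:
    """One FOMO-style detection: a normalized centroid position and a confidence score."""
    label: str
    x: float          # horizontal centroid, 0 = left edge of frame, 1 = right edge
    y: float          # vertical centroid, 0 = top of frame (assumed travel direction)
    confidence: float


def choose_avoidance_action(detections: List[Detection],
                            conf_threshold: float = 0.8) -> str:
    """Return a motion command based on where tile defects appear in the camera frame.

    Assumed policy (illustrative only): if a confident tile-defect centroid lies in
    the upper half of the frame, treat it as lying in the travel path and sidestep
    away from it; otherwise keep climbing straight.
    """
    ahead = [d for d in detections
             if d.label == "tile_defect"
             and d.confidence >= conf_threshold
             and d.y < 0.5]                          # upper half = in the travel path
    if not ahead:
        return "MOVE_FORWARD"
    target = max(ahead, key=lambda d: d.confidence)  # act on the most confident defect
    return "SIDESTEP_RIGHT" if target.x < 0.5 else "SIDESTEP_LEFT"


if __name__ == "__main__":
    frame = [Detection("tile_defect", x=0.3, y=0.2, confidence=0.93)]
    print(choose_avoidance_action(frame))            # -> SIDESTEP_RIGHT
```

In the thesis itself, the corresponding logic is covered by the motion-control and image-detection algorithms of Sections 3-4-1 and 3-4-2 of the table of contents below.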
Keywords (Chinese) ★ autonomous mobile bionic climbing robot
★ exterior wall tile inspection
★ vacuum pump suction cups
★ FOMO image recognition technology
★ autonomous obstacle avoidance
★ real-time image processing
★ 3D printing technology
★ safety inspection
★ building safety
Keywords (English) ★ autonomous bionic climbing robot
★ exterior wall tile inspection
★ vacuum pump suction cups
★ FOMO image recognition technology
★ autonomous obstacle avoidance
★ real-time image processing
★ 3D printing technology
★ safety inspection
★ building safety
Table of Contents
Abstract (Chinese)
Abstract (English)
Acknowledgements
List of Figures
List of Tables
Chapter 1 Introduction
1-1 Research Background and Motivation
1-2 Research Objectives
1-3 Thesis Organization
Chapter 2 Literature Review
2-1 Literature on Adhesion Methods for Climbing Robots
2-2 Literature on Exterior Wall Tile Climbing Robots
2-3 Literature on Image Recognition
Chapter 3 Research Methods
3-1 Architecture of the GLEWBOT-VISION Exterior Wall Inspection System with Intelligent Obstacle Avoidance
3-2 Design and Implementation of GLEWBOT-VISION
3-2-1 GLEWBOT-VISION Mechanism Design
3-2-2 Electromechanical Integration and Control of the Overall GLEWBOT-VISION Mechanism
3-2-3 GLEWBOT-VISION Locomotion and Obstacle Avoidance Modes
3-3 AI Visual Defect Recognition Based on Tiny Machine Learning
3-3-1 Network Architecture of the FOMO Algorithm
3-3-2 Data Collection and Training of the Visual Recognition Model
3-3-3 Discussion of FOMO Model Training
3-4 GLEWBOT-VISION Algorithms and Program Architecture
3-4-1 Algorithm of the Motion Control System in GLEWBOT-VISION
3-4-2 Algorithm of the Image Detection System in GLEWBOT-VISION
Chapter 4 Experimental Design and Planning
4-1 Effect of Vacuum Pump Voltage on Load-Bearing Capacity
4-2 Boundary and Defect Layout of the Test Wall
4-3 Validation Experiments for the AI Visual Recognition Module
4-4 GLEWBOT-VISION Obstacle Avoidance Tests
4-4-1 GLEWBOT-VISION Single-Defect Avoidance Experiment
4-4-2 GLEWBOT-VISION Double-Defect Avoidance Experiment
4-4-3 GLEWBOT-VISION Continuous-Defect Avoidance Experiment
Chapter 5 Experimental Results and Discussion
5-1 Suction Analysis of the Overall Mechanism
5-2 Finite Element Analysis of Failure-Prone Components
5-3 Discussion of FOMO Model Training
5-4 Analysis of GLEWBOT-VISION Obstacle Avoidance Capability
5-4-1 GLEWBOT-VISION Single-Defect Avoidance Results
5-4-2 GLEWBOT-VISION Double-Defect Avoidance Results
5-4-3 GLEWBOT-VISION Continuous-Defect Avoidance Results
Chapter 6 Conclusions and Suggestions
6-1 Conclusions
6-2 Suggestions
References
Advisor: Tzu-Hsuan Lin (林子軒)    Review date: 2024-7-30
