Master's/Doctoral Thesis 112322034: Detailed Record




Author: Li Chen (陳立)    Department: Department of Civil Engineering
Thesis Title: 基於ROS的遠端自動多螺栓檢測機器人系統開發
(The Development of a Remote Automated Multi-Bolt Inspection Robot System Based on ROS)
Files: Full text available for viewing in the system after 2026-07-27
Abstract (Chinese): This study proposes RAMBIRobot, a remote automated multi-bolt inspection robot system built on the ROS framework that combines the YOLO visual recognition model with tiny machine learning. The system is designed to address the shortcomings of current bolt inspection methods for steel structures, namely heavy reliance on manual labor, low efficiency, and insufficient safety. RAMBIRobot uses a robotic arm together with YOLO visual recognition to automatically detect and locate bolts, then performs audio recognition with a miniature tapping device to judge how tightly each bolt is fastened. The system adopts the efficient YOLOv5s model, which accurately identifies bolt positions under a variety of lighting conditions, and combines it with a depth camera to obtain the three-dimensional spatial coordinates of each bolt. For audio recognition, Mel-Frequency Cepstral Coefficients (MFCC) are used to extract audio features, and a two-dimensional convolutional neural network (2D CNN) analyzes the audio signals, enabling accurate determination of whether a bolt is tight or loose.
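The MFCC-to-2D-CNN audio pipeline described above can be illustrated with a minimal sketch. The use of librosa for MFCC extraction and Keras for the classifier, as well as the input shape, layer sizes, and two-class labels (tight/loose), are assumptions for illustration and not the thesis's actual configuration.

```python
# Minimal sketch of an MFCC -> 2D CNN audio pipeline, assuming librosa and Keras.
# Shapes, hyperparameters, and class labels are illustrative only.
import librosa
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def extract_mfcc(wav_path, sr=22050, n_mfcc=40, max_frames=128):
    """Load a tap-sound clip and return a fixed-size MFCC map (n_mfcc x max_frames x 1)."""
    y, sr = librosa.load(wav_path, sr=sr)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    # Pad or truncate along the time axis so every clip yields the same shape.
    if mfcc.shape[1] < max_frames:
        mfcc = np.pad(mfcc, ((0, 0), (0, max_frames - mfcc.shape[1])))
    else:
        mfcc = mfcc[:, :max_frames]
    return mfcc[..., np.newaxis]  # add a channel dimension for the 2D CNN

def build_2d_cnn(input_shape=(40, 128, 1), num_classes=2):
    """Small 2D CNN that classifies an MFCC map as tight (0) or loose (1)."""
    model = keras.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```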
To verify the system's effectiveness, this study conducted several sets of experiments, including unit tests of the audio recognition module, tests of the visual recognition module, and an integration test of the complete system. The results show that, at different inspection angles, the overall accuracy of the audio recognition unit reached 0.793 (90°), 0.711 (67.5°), and 0.711 (45°), with a precision of 0.902 for identifying loosened bolts at the 90° inspection angle. The visual recognition unit achieved detection accuracy above 90% under all tested lighting conditions, demonstrating stable operation across a range of environments. The integration test showed that RAMBIRobot can efficiently locate and inspect bolts under remote operation, with an overall detection accuracy of 75%.
The innovation of this study lies in combining the ROS framework, YOLO visual recognition, and audio analysis to achieve efficient, automated bolt inspection. This not only improves detection accuracy but also reduces dependence on specialized technicians and lowers labor costs. The system is also designed with on-site convenience and flexibility in mind, so it can adapt to different inspection scenarios and environments. In the future, the system could be extended to other types of structural health monitoring, giving it broad development potential and application value.
Abstract (English): This study presents RAMBIRobot, a remote automated multi-bolt inspection system for steel structures, integrating the YOLO visual recognition model and tiny machine learning under the ROS framework. RAMBIRobot addresses current challenges in bolt inspection, such as high manual labor dependency, low efficiency, and safety concerns. The system uses a robotic arm with YOLO visual recognition to detect and locate bolts, followed by an audio recognition module with a miniature striking device to determine bolt tightness. The efficient YOLOv5s model accurately identifies bolt positions under various lighting conditions, obtaining 3D spatial coordinates with a depth camera. The audio recognition component employs Mel-Frequency Cepstral Coefficients (MFCC) and a 2D Convolutional Neural Network (2D CNN) for precise bolt tightness determination.
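Obtaining 3D coordinates from a 2D detection plus a depth reading (see also Section 3-4-4 in the table of contents) is commonly done by back-projecting through the pinhole camera model. The sketch below assumes that model; the intrinsic values (fx, fy, cx, cy) are placeholders rather than the thesis's calibration, which in practice would come from the depth camera.

```python
# Sketch of back-projecting a detected bolt's pixel (u, v) and its depth reading
# into camera-frame coordinates with the pinhole model. Intrinsics are placeholders,
# not the thesis's calibration.
import numpy as np

def pixel_to_camera(u, v, depth_m, fx=615.0, fy=615.0, cx=320.0, cy=240.0):
    """Return (X, Y, Z) in metres, in the camera frame, for pixel (u, v) at depth depth_m."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

# Example: centre pixel of a bolt bounding box observed at 0.5 m depth.
print(pixel_to_camera(352, 261, 0.5))
```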
Experiments validated the system's effectiveness, showing overall audio recognition accuracy of 0.793 (90 degrees), 0.711 (67.5 degrees), and 0.711 (45 degrees), with a loosened bolt detection precision of 0.902 at 90 degrees. The visual recognition module maintained detection accuracy above 90% under various lighting conditions. Integrated system testing indicated RAMBIRobot's efficiency in remote bolt location and detection, with an overall detection accuracy of 75%.
This study's innovation lies in integrating the ROS framework, YOLO visual recognition, and audio analysis technologies for efficient, automated bolt detection, reducing reliance on specialized personnel and lowering labor costs. The system's design ensures ease of application and flexibility in various inspection scenarios. Future expansions could include other types of structural health monitoring, showcasing significant development potential.
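The integration described above runs under ROS. As a purely illustrative sketch of how a vision node might hand a bolt's camera-frame coordinate to an arm-control node, the ROS 1 (rospy) code below uses assumed node and topic names (for example, /bolt_position and bolt_detection_node) that are not taken from the thesis's actual node graph.

```python
#!/usr/bin/env python
# Illustrative ROS 1 (rospy) sketch: a detection node publishes a bolt's
# camera-frame coordinate for an arm-control node to subscribe to.
# Node and topic names are assumptions, not the thesis's actual design.
import rospy
from geometry_msgs.msg import PointStamped

def publish_bolt_position(x, y, z):
    rospy.init_node("bolt_detection_node")
    pub = rospy.Publisher("/bolt_position", PointStamped, queue_size=10)
    rate = rospy.Rate(1)  # 1 Hz, for illustration only
    while not rospy.is_shutdown():
        msg = PointStamped()
        msg.header.stamp = rospy.Time.now()
        msg.header.frame_id = "camera_link"
        msg.point.x, msg.point.y, msg.point.z = x, y, z
        pub.publish(msg)
        rate.sleep()

if __name__ == "__main__":
    try:
        publish_bolt_position(0.05, -0.02, 0.50)
    except rospy.ROSInterruptException:
        pass
```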
Keywords (Chinese) ★ Bolt defect detection
★ Object detection
★ Tiny machine learning
★ ROS framework
★ Robotic arm
★ Structural health monitoring
Keywords (English) ★ Bolt defect detection
★ YOLO
★ tiny machine learning
★ ROS
★ robotic arm
★ structural health monitoring
Table of Contents
Abstract (Chinese) i
Abstract (English) ii
Acknowledgements iii
Table of Contents iv
List of Figures vii
List of Tables ix
1. Introduction 1
1-1 Research Motivation 1
1-2 Research Objectives 1
1-3 Thesis Organization 2
2. Literature Review 3
2-1 Research and Techniques for Bolt Looseness Detection 3
2-2 Inspection Applications and Research on Robotic Arms and Robots 6
2-3 Applications of the YOLO Visual Recognition Model 7
2-4 Summary of Past Literature 9
3. Research Methods 11
3-1 RAMBIRobot System Architecture 11
3-1-1 System Architecture and Operating Workflow 11
3-1-2 System Hardware Architecture and Control 12
3-2 ROS Robot Operating System 22
3-2-1 ROS Core Technologies 22
3-2-2 ROS Architecture Components 22
3-2-3 System Software Architecture 23
3-3 Audio-Based Bolt Tightness Detection Method 32
3-3-1 Audio Signal Processing 32
3-3-2 Two-Dimensional Convolutional Neural Network (2D CNN) 36
3-3-3 Data Collection and Training for the Audio Recognition Model 38
3-4 Visual Recognition Method 41
3-4-1 YOLOv5s Visual Recognition Model 42
3-4-2 Confusion Matrix 43
3-4-3 Data Collection and Training for the Visual Recognition Model 45
3-4-4 Converting Pixel Coordinates to Camera Coordinates 50
3-5 Robotic Arm Kinematics and Hand-Eye Calibration 53
3-5-1 Forward Kinematics 53
3-5-2 Inverse Kinematics 54
3-5-3 Robotic Arm Hand-Eye Calibration 55
4. Experimental Design and Planning 58
4-1 Audio Recognition Unit Test Experiment 59
4-2 Visual Recognition Unit Test Experiment 62
4-2-1 YOLOv5s Visual Recognition Model Test Experiment 62
4-2-2 Bolt Spatial Coordinate Output Verification Experiment 63
4-3 Comparison Experiment of Automatic and Manual Control 65
4-4 RAMBIRobot System Integration Test Experiment 66
5. System Verification and Results Discussion 70
5-1 Audio Recognition Unit Test Results 70
Recognition Results at a 90° Angle to the Steel Plate 70
Recognition Results at a 67.5° Angle to the Steel Plate 71
Recognition Results at a 45° Angle to the Steel Plate 73
Discussion of Results 74
5-2 Visual Recognition Unit Test Results 74
5-2-1 YOLOv5s Visual Recognition Model Test Results 74
5-2-2 Bolt Spatial Coordinate Output Verification Results 77
5-3 Results of the Automatic vs. Manual Control Inspection Comparison Experiment 79
5-4 RAMBIRobot System Integration Test Results 80
5-5 Discussion and Comparison of Results 83
6. Conclusions and Future Work 86
6-1 Conclusions 86
6-2 Future Work 86
References 88
References
[1] Y. Zhang, X. Sun, K. J. Loh, W. Su, Z. Xue, and X. Zhao, "Autonomous bolt loosening detection using deep learning," Structural Health Monitoring, vol. 19, no. 1, pp. 105-122, 2019, doi: 10.1177/1475921719837509.
[2] X. Kong and J. Li, "Image Registration-Based Bolt Loosening Detection of Steel Joints," Sensors (Basel), vol. 18, no. 4, Mar 28 2018, doi: 10.3390/s18041000.
[3] T.-C. Huynh, J.-H. Park, H.-J. Jung, and J.-T. Kim, "Quasi-autonomous bolt-loosening detection method using vision-based deep learning and image processing," Automation in Construction, vol. 105, 2019, doi: 10.1016/j.autcon.2019.102844.
[4] H. C. Pham, Q. B. Ta, J. T. Kim, D. D. Ho, X. L. Tran, and T. C. Huynh, "Bolt-Loosening Monitoring Framework Using an Image-Based Deep Learning and Graphical Model," Sensors (Basel), vol. 20, no. 12, Jun 15 2020, doi: 10.3390/s20123382.
[5] Y. Zhang, X. Zhao, X. Sun, W. Su, and Z. Xue, "Bolt loosening detection based on audio classification," Advances in Structural Engineering, vol. 22, no. 13, pp. 2882-2891, 2019, doi: 10.1177/1369433219852565.
[6] F. Wang, S. C. M. Ho, and G. Song, "Modeling and analysis of an impact-acoustic method for bolt looseness identification," Mechanical Systems and Signal Processing, vol. 133, 2019, doi: 10.1016/j.ymssp.2019.106249.
[7] F. Wang and G. Song, "Bolt-looseness detection by a new percussion-based method using multifractal analysis and gradient boosting decision tree," Structural Health Monitoring, vol. 19, no. 6, pp. 2023-2032, 2020, doi: 10.1177/1475921720912780.
[8] O. Eraliev, K. H. Lee, and C. H. Lee, "Vibration-Based Loosening Detection of a Multi-Bolt Structure Using Machine Learning Algorithms," Sensors (Basel), vol. 22, no. 3, Feb 5 2022, doi: 10.3390/s22031210.
[9] F. Wang and G. Song, "Bolt early looseness monitoring using modified vibro-acoustic modulation by time-reversal," Mechanical Systems and Signal Processing, vol. 130, pp. 349-360, 2019, doi: 10.1016/j.ymssp.2019.04.036.
[10] F. Wang, S. C. M. Ho, L. Huo, and G. Song, "A Novel Fractal Contact-Electromechanical Impedance Model for Quantitative Monitoring of Bolted Joint Looseness," IEEE Access, vol. 6, pp. 40212-40220, 2018, doi: 10.1109/access.2018.2855693.
[11] F. Wang, A. Mobiny, H. Van Nguyen, and G. Song, "If structure can exclaim: a novel robotic-assisted percussion method for spatial bolt-ball joint looseness detection," Structural Health Monitoring, vol. 20, no. 4, pp. 1597-1608, 2021.
[12] E. Hoxha, J. Feng, D. Sanakov, and J. Xiao, "Robotic Inspection and Subsurface Defect Mapping Using Impact-echo and Ground Penetrating Radar," IEEE Robotics and Automation Letters, 2023.
[13] H. Luo, C. Meng, and C. Li, "Design of a remote-controllable pipeline inspection robot with a robotic arm," in Seventh International Conference on Electromechanical Control Technology and Transportation (ICECTT 2022), 2022, vol. 12302: SPIE, pp. 535-540.
[14] G. Alnowaini, A. Alttal, and A. Alhaj, "Design and simulation robotic arm with computer vision for inspection process," in 2021 International Conference of Technology, Science and Administration (ICTSA), 2021: IEEE, pp. 1-6.
[15] C. Mineo, D. Herbert, M. Morozov, S. Pierce, P. Nicholson, and I. Cooper, "Robotic non-destructive inspection," in 51st Annual Conference of the British Institute of Non-Destructive Testing, 2012, pp. 345-352.
[16] S. Discepolo et al., "A robot-based inspecting system for 3D measurement," in 2023 IEEE International Workshop on Metrology for Industry 4.0 & IoT (MetroInd4.0&IoT), 2023: IEEE, pp. 136-141.
[17] H. Cho, S. Choi, and C. J. Lissenden, "Nondestructive Inspection Results From Mockups of Spent Nuclear Fuel Storage Canisters Using Shear-Horizontal Waves Generated by an Electromagnetic Acoustic Transducer," Journal of Nondestructive Evaluation, Diagnostics and Prognostics of Engineering Systems, vol. 3, no. 2, p. 021001, 2020.
[18] P. Magalhaes and N. Ferreira, "Inspection application in an industrial environment with collaborative robots," Automation, vol. 3, no. 2, pp. 258-268, 2022.
[19] H. Rowshandel, G. Nicholson, C. Davis, and C. Roberts, "A robotic system for non-destructive evaluation of RCF cracks in rails using an ACFM sensor," in 5th IET Conference on Railway Condition Monitoring and Non-Destructive Testing (RCM 2011), 2011: IET, pp. 1-6.
[20] P. Zhang, J. Wang, F. Zhang, P. Xu, L. Li, and B. Li, "Design and analysis of welding inspection robot," Scientific Reports, vol. 12, no. 1, p. 22651, 2022.
[21] B. Yan, P. Fan, X. Lei, Z. Liu, and F. Yang, "A real-time apple targets detection method for picking robot based on improved YOLOv5," Remote Sensing, vol. 13, no. 9, p. 1619, 2021.
[22] M. Lippi, N. Bonucci, R. F. Carpio, M. Contarini, S. Speranza, and A. Gasparri, "A yolo-based pest detection system for precision agriculture," in 2021 29th Mediterranean Conference on Control and Automation (MED), 2021: IEEE, pp. 342-347.
[23] X. Li, Q. Liu, T. Liu, and J. Wang, "Research on YOLO Model and Its Application in Fault Status Recognition of Freight Trains," in Advances in Artificial Intelligence and Security: 7th International Conference, ICAIS 2021, Dublin, Ireland, July 19-23, 2021, Proceedings, Part I 7, 2021: Springer, pp. 144-156.
[24] M. A. B. Zuraimi and F. H. K. Zaman, "Vehicle detection and tracking using YOLO and DeepSORT," in 2021 IEEE 11th IEEE Symposium on Computer Applications & Industrial Electronics (ISCAIE), 2021: IEEE, pp. 23-29.
[25] C. L. KR, B. Praveena, G. Sahaana, T. Gnanasekaran, and M. Hashim, "Yolo for Detecting Plant Diseases," in 2023 Third International Conference on Artificial Intelligence and Smart Energy (ICAIS), 2023: IEEE, pp. 1029-1034.
[26] A. K. Lailesh, J. A. Richi, and N. Preethi, "A Pre-trained YOLO-v5 model and an Image Subtraction Approach for Printed Circuit Board Defect Detection," in 2023 International Conference on Intelligent and Innovative Technologies in Computing, Electrical and Electronics (IITCEE), 2023: IEEE, pp. 140-145.
[27] R. Manuel, B. Ioana, and A. Gavrilas, "Application for threat surveillance using Python and Yolo-V7," in 2023 22nd International Symposium INFOTEH-JAHORINA (INFOTEH), 2023: IEEE, pp. 1-5.
[28] Y. Zheng, S. Wu, D. Liu, R. Wei, S. Li, and Z. Tu, "Sleeper defect detection based on improved YOLO V3 algorithm," in 2020 15th IEEE Conference on Industrial Electronics and Applications (ICIEA), 2020: IEEE, pp. 955-960.
[29] S. Kuan-Ying, C. Ming-Fei, T. Po-Cheng, and T. Cheng-Han, "Establish a Dynamic Detection System for Metal Bicycle Frame Defects Based on YOLO Object Detection," in 2022 IET International Conference on Engineering Technologies and Applications (IET-ICETA), 2022: IEEE, pp. 1-2.
[30] A. Sonavane and R. Kohar, "Dental cavity detection using yolo," in Proceedings of Data Analytics and Management: ICDAM 2021, Volume 2, 2022: Springer, pp. 141-152.
[31] Y. Sun, M. Li, R. Dong, W. Chen, and D. Jiang, "Vision-based detection of bolt loosening using YOLOv5," Sensors, vol. 22, no. 14, p. 5184, 2022.
[32] D. Wang, M. Zhang, D. Sheng, and W. Chen, "Bolt positioning detection based on improved YOLOv5 for bridge structural health monitoring," Sensors, vol. 23, no. 1, p. 396, 2022.
[33] K. Zhao, Y. Wang, Y. Zuo, and C. Zhang, "Palletizing robot positioning bolt detection based on improved YOLO-V3," Journal of Intelligent & Robotic Systems, vol. 104, no. 3, p. 41, 2022.
[34] Q. Lu, Y. Jing, and X. Zhao, "Bolt loosening detection using key-point detection enhanced by synthetic datasets," Applied Sciences, vol. 13, no. 3, p. 2020, 2023.
[35] Z. Liu and H. Lv, "YOLO_Bolt: a lightweight network model for bolt detection," Scientific Reports, vol. 14, no. 1, p. 656, 2024.
[36] M. Quigley et al., "ROS: an open-source Robot Operating System," in ICRA workshop on open source software, 2009, vol. 3, no. 3.2: Kobe, Japan, p. 5.
[37] J. Terven, D.-M. Córdova-Esparza, and J.-A. Romero-González, "A comprehensive review of yolo architectures in computer vision: From yolov1 to yolov8 and yolo-nas," Machine Learning and Knowledge Extraction, vol. 5, no. 4, pp. 1680-1716, 2023.
[38] J. Wang et al., "Research on improved yolov5 for low-light environment object detection," Electronics, vol. 12, no. 14, p. 3089, 2023.
[39] M. Jani, J. Fayyad, Y. Al-Younes, and H. Najjaran, "Model compression methods for YOLOv5: A review," arXiv preprint arXiv:2307.11904, 2023.
[40] Z. Zhang, "A flexible new technique for camera calibration," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330-1334, 2000.
[41] J. Heikkila and O. Silvén, "A four-step camera calibration procedure with implicit image correction," in Proceedings of IEEE computer society conference on computer vision and pattern recognition, 1997: IEEE, pp. 1106-1112.
[42] J. J. Craig, "Introduction to Robotics," 2005.
[43] J. Denavit and R. S. Hartenberg, "A kinematic notation for lower-pair mechanisms based on matrices," 1955.
[44] K. H. Strobl and G. Hirzinger, "Optimal hand-eye calibration," in 2006 IEEE/RSJ international conference on intelligent robots and systems, 2006: IEEE, pp. 4647-4653.
Advisor: Tzu-Hsuan Lin (林子軒)    Review Date: 2024-07-29
