    Please use this permanent URL to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/90851


    Title: An Intelligent Robot for Automatic Shuttlecock Collection and Inspection (羽球自動收集與檢測之智慧機器人)
    Author: Chen, Kuan-Lin (陳冠霖)
    Contributor: Department of Electrical Engineering
    Keywords: shuttlecock; shuttlecock detection; feather integrity analysis; 6-DOF robotic arm; coordinate transformation; kinematics; robot operating system (ROS); automated guided vehicle (AGV)
    Date: 2022-11-21
    Date of upload: 2023-05-09 18:10:30 (UTC+8)
    Publisher: National Central University
    Abstract: This thesis designs an automatic shuttlecock collection and inspection system. An automated guided vehicle (AGV) uses image recognition to pick up the shuttlecocks on a badminton court and carries them back to a base near a six-axis robotic arm; the shuttlecocks are then placed by hand on the platform beneath the arm, and the arm grips each one and presents it to a camera that judges its integrity, so that good and bad shuttlecocks can be sorted.
    The research items of this thesis are as follows. For shuttlecock collection by the AGV, a webcam mounted on the vehicle is used to complete three tasks: (1) detect and identify shuttlecocks with deep learning; (2) compute the relative position between the target and the camera with a pinhole camera model, drive the AGV to the target, and collect the shuttlecock onto the vehicle by controlling the motors through ROS; (3) guide the AGV back to the base with AprilTag markers. For the shuttlecock imaging part, the images from a depth camera mounted on the end of the robotic arm and from the inspection webcam are used to complete the following tasks: (1) detect the position and angle of the shuttlecock head and centre with a deep-learning network; (2) compute the relative position between the target and the camera; (3) analyse the shuttlecock's integrity with a deep-learning network and image processing. For motion control of the robotic arm, the following procedures are completed: (1) build a virtual environment; (2) compute the transformation matrices of the arm's kinematic model; (3) obtain the coordinates of the target point at the shuttlecock head and drive the arm to that point with inverse kinematics. Together, these steps let the AGV collect the shuttlecocks and let the robotic arm grip and classify them.
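    A minimal sketch of the pinhole-model ranging step described above, assuming a calibrated monocular webcam and a known physical shuttlecock width; the function name, the intrinsic parameters, and the 6.6 cm width are illustrative placeholders, not values taken from the thesis.

        # Estimate the 3-D position of a detected shuttlecock in the camera frame
        # from its bounding box, using similar triangles (pinhole camera model).
        def relative_position(u, v, pixel_width, fx, fy, cx, cy, real_width=0.066):
            """(u, v): bounding-box centre in pixels; pixel_width: box width in pixels;
            fx, fy: focal lengths in pixels; (cx, cy): principal point;
            real_width: assumed physical width of the shuttlecock in metres."""
            Z = fx * real_width / pixel_width   # depth from apparent size
            X = (u - cx) * Z / fx               # back-project the pixel to metres
            Y = (v - cy) * Z / fy
            return X, Y, Z

        # Example: a 60-pixel-wide detection centred at (700, 420) on a 1280x720 image.
        print(relative_position(700, 420, 60, fx=900.0, fy=900.0, cx=640.0, cy=360.0))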
    This study develops the software system with the Robot Operating System (ROS) under Linux. Through ROS's distributed architecture and peer-to-peer network, all information is gathered, passed between nodes, and integrated, realising a co-design of software and hardware. In experiments on an actual badminton court, the AGV collected the shuttlecocks on the court, the robotic arm's gripping accuracy was 92.1%, and the shuttlecock classification accuracy was 82%, showing that this thesis successfully establishes a system that picks up and classifies shuttlecocks.
    The thesis aims to design an automatic shuttlecock collection and inspection system. First, the shuttlecocks on the court are detected and picked up by the AGV (Automated Guided Vehicle) and brought back to the base. After all shuttlecocks have been collected, the six-degree-of-freedom (6-DOF) robotic arm picks up each one and checks its integrity.
    The research topics of this thesis are described as follows. For shuttlecock collection by the AGV, using monocular vision from the webcam mounted on the vehicle, we complete (1) detecting and identifying the shuttlecocks, (2) calculating the relative position between the target object and the camera, and (3) guiding the AGV back to the base based on AprilTag recognition. For the shuttlecock imaging part, based on the images from a depth camera installed at the end of the robotic arm and from the inspection webcam, we complete (1) using a deep-learning network to estimate the position and angle of the shuttlecock's head and body centre, (2) calculating the relative position between the shuttlecock and the camera, and (3) measuring the integrity of the shuttlecock. In addition, the following procedures are completed for motion control of the robotic arm: (1) build a virtual environment, (2) calculate the transformation matrices of the robot-arm model, and (3) obtain the coordinates of the shuttlecock's head and drive the arm to that target point with inverse kinematics. As a result, the AGV can complete shuttlecock collection, and the robotic arm can complete shuttlecock pick-up and integrity checking.
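    The coordinate-transformation step above can be illustrated with a short sketch; it assumes the hand-eye transform between the wrist-mounted depth camera and the arm base is already known, and the matrix and point values below are made up for illustration rather than taken from the thesis.

        # Express a target point seen by the wrist camera in the robot-arm base
        # frame with a 4x4 homogeneous transformation, as a step toward solving
        # inverse kinematics for that target.
        import numpy as np

        def to_base_frame(p_cam, T_base_cam):
            """p_cam: (x, y, z) of the shuttlecock head in the camera frame [m];
            T_base_cam: 4x4 homogeneous transform from camera frame to base frame."""
            p_h = np.append(np.asarray(p_cam, dtype=float), 1.0)  # homogeneous point
            return (T_base_cam @ p_h)[:3]

        # Illustrative transform: camera looking back along the tool axis,
        # offset 10 cm from the base frame along z.
        T_base_cam = np.array([[1.0,  0.0,  0.0, 0.00],
                               [0.0, -1.0,  0.0, 0.00],
                               [0.0,  0.0, -1.0, 0.10],
                               [0.0,  0.0,  0.0, 1.00]])
        print(to_base_frame((0.02, -0.01, 0.35), T_base_cam))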
    This thesis uses the Robot Operating System (ROS) to develop the software system in a Linux environment. Through the distributed architecture of ROS and its peer-to-peer network, all information is collected, transmitted, and integrated to achieve a co-design of software and hardware. In experiments on an actual badminton court, the AGV collects all the shuttlecocks on the court, and the accuracy rates of robot-arm gripping and shuttlecock classification are 92.5% and 83%, respectively. It is concluded that the thesis establishes a system that can pick up the shuttlecocks on the court and identify each shuttlecock's integrity.
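    A minimal sketch of the ROS publish/subscribe pattern the system relies on; the node and topic names are hypothetical, since the abstract does not specify the actual message types or node graph.

        #!/usr/bin/env python
        # Minimal ROS node illustrating the peer-to-peer message passing described
        # above: it subscribes to a (hypothetical) topic carrying the target point
        # of a detected shuttlecock and logs it; a motor-control node would turn
        # such messages into wheel or joint commands.
        import rospy
        from geometry_msgs.msg import Point

        def on_target(msg):
            rospy.loginfo("target in camera frame: (%.2f, %.2f, %.2f)", msg.x, msg.y, msg.z)

        if __name__ == "__main__":
            rospy.init_node("shuttlecock_target_listener")
            rospy.Subscriber("/shuttlecock/target_point", Point, on_target)
            rospy.spin()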
    Appears in Collections: [Graduate Institute of Electrical Engineering] Theses and Dissertations

    Files in This Item:

    File        Description    Size    Format    Views
    index.html                 0 KB    HTML      125


    All items in NCUIR are protected by original copyright.
