NCU Institutional Repository (中大機構典藏): Item 987654321/10348


Please use this permanent URL to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/10348


Title: The design and realization of interactive biped robots (III): execution of interactive algorithm for two robots
Authors: Chien-Liang Yu (余建良)
Contributors: Graduate Institute of Electrical Engineering
Keywords: robot vision; interaction; biped robot
Date: 2008-06-18
Upload time: 2009-09-22 12:13:21 (UTC+8)
Publisher: National Central University Library (國立中央大學圖書館)
Abstract: The study "The design and realization of interactive biped robots" was completed by three students and is divided into three parts: (A) gesture recognition, (B) control of the biped robots' basic motions, and (C) execution of the interactive algorithm for two robots. This thesis addresses part (C). The goal is to design and control a pair of biped robots so that they can cooperate with each other in a performance, with hand-gesture recognition used to tell the robots which task to execute. The two robots are named the Master robot and the Slave robot. The Master robot carries a wireless camera on its head and therefore has vision capability; the Slave robot has no camera but is equipped with an infrared sensor that measures the distance between itself and an object, and the two robots communicate with each other over a wireless link. The Master robot uses its camera to recognize the positions of the object to be carried and of the desired destination, and with the proposed independent carrying algorithm it can transport the object to the destination on its own. Colour marks pasted on the Slave robot let the Master robot recognize the Slave's position and, through different algorithms, direct it to complete the following interactive motions: 1. the two robots walk toward each other and shake hands; 2. the Slave robot hands the object over to the Master robot, which then carries it to the destination; 3. the two robots carry the object to the destination together and set it down. The relative positions between the two robots, and between a robot and the object, are determined not only by the Master robot's vision but also by the Slave robot's infrared sensor, which makes the positioning of the two robots more accurate. The main achievement of this thesis is the integration of robot vision, infrared sensing and wireless communication during the interaction, giving the robots the ability to obey commands, correct themselves and carry objects cooperatively.
Appears in Collections: [Graduate Institute of Electrical Engineering] Theses & Dissertations
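
The abstract describes the Master robot locating a colour mark on the Slave robot with its wireless camera and then issuing movement commands over the wireless link. The Python sketch below illustrates only that step, under stated assumptions: the OpenCV colour thresholds, camera index, TCP address standing in for the wireless link, and command strings are all illustrative placeholders, not the implementation used in the thesis.

# Minimal sketch of one step from the abstract: the Master robot finds a colour
# mark on the Slave robot in a camera frame and sends a coarse steering command
# over the wireless link. Thresholds, address, and command names are assumptions.
import socket

import cv2
import numpy as np


def locate_colour_mark(frame_bgr, hsv_low, hsv_high):
    """Return the (x, y) centroid of the largest blob in the HSV range, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, hsv_low, hsv_high)
    # OpenCV 4.x signature: (contours, hierarchy)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    blob = max(contours, key=cv2.contourArea)
    m = cv2.moments(blob)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])


def steering_command(cx, frame_width, tolerance=30):
    """Map the mark's horizontal offset from image centre to a coarse command."""
    offset = cx - frame_width // 2
    if offset < -tolerance:
        return b"TURN_LEFT\n"
    if offset > tolerance:
        return b"TURN_RIGHT\n"
    return b"WALK_FORWARD\n"


if __name__ == "__main__":
    # Hypothetical red colour mark and a hypothetical TCP endpoint standing in
    # for the robots' wireless link.
    RED_LOW = np.array([0, 120, 70], dtype=np.uint8)
    RED_HIGH = np.array([10, 255, 255], dtype=np.uint8)
    SLAVE_ADDR = ("192.168.0.20", 9000)

    cap = cv2.VideoCapture(0)            # wireless camera feed on the Master robot
    link = socket.create_connection(SLAVE_ADDR)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            mark = locate_colour_mark(frame, RED_LOW, RED_HIGH)
            if mark is not None:
                link.sendall(steering_command(mark[0], frame.shape[1]))
    finally:
        cap.release()
        link.close()

The thesis additionally fuses this vision estimate with the Slave robot's infrared distance reading to refine positioning; that fusion is omitted from the sketch.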


All items in NCUIR are protected by the original copyright.
