
    Please use this identifier to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/93320

    Title: 以物聯網強化遙現機器人系統 與遠端環境互動之能力;An IoT-Enhanced Telepresence System to Improve Interactivity with Remote Environments
    Authors: 彭丞麒;Peng, Cheng-Qi
    Contributors: Department of Civil Engineering (土木工程學系)
    Keywords: 遙現機器人;虛擬實境;物聯網;同時定位與地圖建構;Telepresence Robot;Virtual Reality;IoT;SLAM
    Date: 2024-01-17
    Issue Date: 2024-03-05 16:22:10 (UTC+8)
    Publisher: National Central University (國立中央大學)
    Abstract: The concept of telepresence is to provide users with the feeling of being present at a remote place, together with the ability to interact with the remote environment. In recent years, telepresence has often been achieved using robots as agents for humans, and integration with virtual reality (VR) technology can offer users an immersive audiovisual experience. Integrating VR with a telepresence robot projects the user into the robot's first-person perspective and enables remote control of its actions. However, the design and mechanisms of the robot usually limit the user's possible interactions with the remote environment. On the other hand, with the development of Internet of Things (IoT) technology, embedded devices provide various sensing and tasking resources through the digital realm. However, users usually access IoT resources through non-intuitive application interfaces on computers and smartphones. To address these issues, this study integrates IoT and telepresence technologies, improving the interactivity with remote environments in an intuitive manner.
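    The SLAM-to-BIM registration described in the abstract amounts to a similarity transform between the two coordinate frames. The minimal Python sketch below is not code from the thesis; the rotation angle, translation, and scale are illustrative placeholders, and only the plan-view (2D) form of the mapping is shown:

    ```python
    import math

    # Hypothetical rigid plan-view alignment between a SLAM model frame and a
    # BIM frame. In practice the rotation, translation, and scale would be
    # estimated from matched control points; the values below are made up.

    def slam_to_bim(x, y, theta, tx, ty, s=1.0):
        """Map a SLAM-frame point (x, y) into the BIM frame:
        p_bim = s * R(theta) * p_slam + t."""
        xb = s * (math.cos(theta) * x - math.sin(theta) * y) + tx
        yb = s * (math.sin(theta) * x + math.cos(theta) * y) + ty
        return xb, yb

    # Example: a 90-degree rotation plus a (10 m, 5 m) offset maps a sensor
    # located at (1, 0) in the SLAM frame into the BIM frame.
    print(slam_to_bim(1.0, 0.0, math.radians(90.0), 10.0, 5.0))
    ```

    The same idea extends to 3D with a rotation matrix; once the parameters are fixed, every IoT device annotated in the BIM model can be mapped into the robot's (SLAM) frame and from there into the VR display.
    
    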
    Specifically, this research designs and implements a telepresence robot system that employs VR as the human-machine interface, where IoT sensing and tasking resources are georeferenced and shown at corresponding positions on the VR display. The methodology of this study encompasses: (1) integrating a telepresence robot, a 360° panoramic camera, and a VR device in terms of video transmission and robot control; (2) utilizing a Simultaneous Localization and Mapping (SLAM) algorithm with panoramic imagery for robot localization; (3) aligning the extracted 3D SLAM model with a Building Information Model (BIM) annotated with IoT device locations in order to register the coordinate systems of the robot, the VR display, and the IoT resources; (4) leveraging the Open Geospatial Consortium (OGC) SensorThings API, an international open standard, for interoperable connections to IoT sensing and tasking resources, which are presented within the VR environment for intuitive interactions. The system has a video delay of around 800 ms when running at a resolution of 1920x960 over a wide-area network (WAN). When projecting IoT resources onto the VR display, the positioning accuracy is affected by lens distortion errors. The distortions are proportional to the distance from the principal point; the largest distortions occur near the edges of the image and remain within 10 cm, which does not cause a significant issue when viewing in VR. Overall, the proposed solution improves the telepresence experience in terms of interactivity with the remote environment in an immersive and intuitive manner.
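    Step (4) relies on the OGC SensorThings API, whose REST entity model (Things, Locations, Datastreams, Observations) is standardized. The sketch below shows how Things and their Locations might be fetched and reduced to positions for placement in a VR scene; the endpoint URL and the sample response fragment are hypothetical, not taken from the thesis, though they follow the standard's JSON shape:

    ```python
    import json
    from urllib.parse import quote

    BASE = "https://sensorthings.example.org/v1.1"  # hypothetical endpoint

    def things_with_locations_url(base):
        # $expand pulls each Thing's Locations in a single request.
        return f"{base}/Things?$expand={quote('Locations')}"

    # A hand-written response fragment in the SensorThings JSON shape.
    sample_response = json.loads("""
    {
      "value": [
        {
          "@iot.id": 1,
          "name": "Ceiling light",
          "Locations": [
            {"location": {"type": "Point", "coordinates": [121.19, 24.97]}}
          ]
        }
      ]
    }
    """)

    def extract_positions(response):
        """Return (name, coordinates) pairs for placement in the VR scene."""
        out = []
        for thing in response["value"]:
            for loc in thing.get("Locations", []):
                out.append((thing["name"], loc["location"]["coordinates"]))
        return out

    print(things_with_locations_url(BASE))
    print(extract_positions(sample_response))
    ```

    Tasking (actuation) follows the same pattern through the SensorThings Tasking extension, so sensing and control of IoT devices can share one interoperable access layer, as the abstract describes.
    
    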
    Appears in Collections: [Graduate Institute of Civil Engineering] Master's and Doctoral Theses

    All items in NCUIR are protected by copyright, with all rights reserved.

