Master's and Doctoral Theses: Detailed Record for Thesis 111322079




Name: Cheng-Qi Peng (彭丞麒)    Department: Department of Civil Engineering
Thesis title: An IoT-Enhanced Telepresence System to Improve Interactivity with Remote Environments (以物聯網強化遙現機器人系統與遠端環境互動之能力)
Related theses
★ An Interoperability Solution for IoT Tasking Capabilities
★ GeoWeb Crawler: An Extensible and Scalable Crawling Framework for GeoWeb Resources
★ Improving a TDR Monitoring Information Platform and Establishing a Sensor Observation Service
★ Producing Nearshore Underwater Topography from High-Resolution Satellite Stereo Pairs
★ Integrating the oneM2M and OGC SensorThings API Standards to Build an Open IoT Architecture
★ A Multi-Attribute Indexing Framework for Massive IoT Data
★ An Efficient System for Identifying Representations of Heterogeneous Time-Series Data
★ A TOA-reflectance-based Spatial-temporal Image Fusion Method for Aerosol Optical Depth Retrieval
★ An Automatic Embedded Device Registration Procedure for the OGC SensorThings API
★ A Personalized GeoWeb Search Engine Based on Ontology and User Interests
★ Using Ontology to Integrate City Models and Open IoT Standards for Smart-City Applications
★ Concrete Bridge Crack Detection Using UAVs and Image Registration
★ GeoRank: A Geospatial Web Ranking Algorithm for a GeoWeb Search Engine
★ Monitoring Sea-Water Coverage by Fusing High Spatiotemporal-Resolution Remote-Sensing Images
★ LoRaWAN Positioning based on Time Difference of Arrival and Differential Correction
★ Reverse-Engineering Neural Networks to Interpret Remote-Sensing Information: Landsat 8 Vegetation Classification as an Example
Files: the full text may be browsed in the system after January 1, 2029.
Abstract (Chinese) The concept of telepresence is to deliver perceptual information from a remote environment and give users the ability to interact with that environment, so that they feel present on site; in recent years, telepresence has often been realized with robots acting as human proxies. A common approach uses virtual reality (VR) to project the user into the robot's first-person view and to control its actions remotely. However, the user's ability to interact with the remote environment may be limited by the robot's design and mechanisms. With the development of IoT technology, embedded devices offer many sensing and actuating functions through the digital world, yet IoT resources are usually operated through unintuitive window-style interfaces on computers and smartphones. This study integrates the IoT with telepresence technology to extend a telepresence robot's ability to interact with remote environments. Specifically, on a telepresence robot system that uses VR as the human-machine interface, this study embeds access to IoT sensing and actuating resources. The main methods include: (1) linking the telepresence robot and VR through a 360° panoramic camera and a VR device; (2) using SLAM (Simultaneous Localization and Mapping) on panoramic imagery to localize and orient the robot, and aligning the extracted SLAM model with a BIM (Building Information Model) annotated with IoT resources to establish coordinate transformations among the robot, VR, and the BIM model; (3) unifying access to IoT sensing and actuating resources through the Open Geospatial Consortium (OGC) SensorThings API web-service standard, so that IoT resources are presented at their corresponding positions in VR and can be invoked more intuitively. The proposed system has a latency of about 800 milliseconds at a camera resolution of 1920x960 over a wide-area network (WAN). The positioning of IoT resources projected into VR is affected by the interior orientation parameters: the deviation is positively correlated with the distance from the principal point, and the maximum deviation at the image edges is within about 10 centimeters, which was judged to have little effect on visual perception in VR. Overall, the proposed solution effectively improves the telepresence system's interactivity with remote environments and allows users to access IoT resources in a more intuitive manner.
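The coordinate registration described above (aligning the SLAM model with the IoT-annotated BIM to relate the robot, VR, and BIM frames) can, for the planar case, be sketched as a closed-form 2-D rigid fit over matched point pairs. This is a minimal illustration under assumed correspondences, not the thesis implementation; the point lists and frame names are hypothetical:

```python
import math

def register_2d(slam_pts, bim_pts):
    """Estimate a 2-D rigid transform (rotation theta + translation t) that maps
    SLAM-frame coordinates onto BIM-frame coordinates from matched point pairs,
    via the closed-form least-squares solution."""
    n = len(slam_pts)
    # Centroids of both point sets.
    cax = sum(p[0] for p in slam_pts) / n
    cay = sum(p[1] for p in slam_pts) / n
    cbx = sum(p[0] for p in bim_pts) / n
    cby = sum(p[1] for p in bim_pts) / n
    # Accumulate dot and cross terms of the centered pairs.
    s_dot = s_cross = 0.0
    for (ax, ay), (bx, by) in zip(slam_pts, bim_pts):
        ax, ay = ax - cax, ay - cay
        bx, by = bx - cbx, by - cby
        s_dot += ax * bx + ay * by
        s_cross += ax * by - ay * bx
    theta = math.atan2(s_cross, s_dot)  # optimal rotation angle
    c, s = math.cos(theta), math.sin(theta)
    # Translation that maps the rotated SLAM centroid onto the BIM centroid.
    tx = cbx - (c * cax - s * cay)
    ty = cby - (s * cax + c * cay)
    return theta, (tx, ty)

def apply_2d(theta, t, p):
    """Map a SLAM-frame point into the BIM frame with the estimated transform."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * p[0] - s * p[1] + t[0], s * p[0] + c * p[1] + t[1])

# Hypothetical correspondences: points in the SLAM frame and the same points in the BIM frame.
slam_pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (2.0, 3.0)]
bim_pts = [(2.0, 1.0), (2.0, 2.0), (1.0, 1.0), (-1.0, 3.0)]
theta, t = register_2d(slam_pts, bim_pts)  # recovers a 90° rotation and translation (2, 1)
```

The same least-squares idea extends to full 3-D (e.g., the Kabsch/Umeyama algorithm) when the extracted SLAM model is aligned with the BIM in three dimensions.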
Abstract (English) The concept of telepresence is to give users the feeling of being present at remote places and the ability to interact with the remote environment. In recent years, telepresence has often been achieved using robots as agents for humans, and integration with virtual reality (VR) technology can offer users an immersive audiovisual experience. The integration of VR and telepresence robots projects users into the robot's first-person perspective and enables remote control of its actions. However, the design and mechanisms of the robot usually limit users' possible interactions with the remote environment. On the other hand, with the development of Internet of Things (IoT) technology, embedded devices provide various sensing and tasking resources through the digital realm. However, users usually access IoT resources through non-intuitive application interfaces on computers and smartphones. To address these issues in telepresence systems and the IoT, this study integrates the IoT and telepresence technologies, improving interactivity with remote environments in an intuitive manner.
Specifically, this research designs and implements a telepresence robot system that employs VR as the human-machine interface, where IoT sensing and tasking resources are georeferenced and shown at corresponding positions on VR displays. The methodology of this study encompasses: (1) integrating a telepresence robot, a 360° panoramic camera, and a VR device in terms of video transmission and robot controls; (2) utilizing a Simultaneous Localization and Mapping (SLAM) algorithm with panoramic imagery for robot localization; (3) aligning the extracted 3D SLAM model with a Building Information Model (BIM) annotated with IoT device locations in order to register the coordinate systems of the robot, the VR displays, and the IoT resources; (4) leveraging the Open Geospatial Consortium (OGC) SensorThings API international open standard for interoperable connections to IoT sensing and tasking resources, which are presented within the VR environment for intuitive interactions. The system has a video delay of around 800 ms when running at a resolution of 1920x960 across a wide-area network (WAN). When projecting IoT resources onto VR displays, the positioning accuracy is affected by lens distortion errors. The distortions are proportional to the distance from the principal point; the largest distortions occurred near the edges of the image and were within 10 cm, which did not cause significant issues when viewing in VR. Overall, the proposed solution improves the telepresence experience in terms of interactivity with the remote environment in an immersive and intuitive manner.
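The open-standard access in item (4) can be sketched with the SensorThings API's resource-path and query conventions: reading the latest Observation of a Datastream (Part 1: Sensing) and building a Task body for actuation (Part 2: Tasking Core). The service root, entity ids, and tasking parameters below are placeholders for illustration, not the system's actual deployment:

```python
import json

# Hypothetical SensorThings service root; a real deployment substitutes its own URL.
BASE = "http://example.org/v1.0"

def latest_observation_url(datastream_id):
    """URL of the most recent Observation of a Datastream (Part 1: Sensing).
    $orderby and $top are standard SensorThings query options."""
    return (f"{BASE}/Datastreams({datastream_id})/Observations"
            "?$orderby=phenomenonTime desc&$top=1")

def make_task_body(capability_id, params):
    """JSON body to POST to {service root}/Tasks (Part 2: Tasking Core):
    the Task links a TaskingCapability by id and carries the device-specific
    taskingParameters."""
    return json.dumps({
        "TaskingCapability": {"@iot.id": capability_id},
        "taskingParameters": params,
    })

# Example with placeholder ids: read a temperature Datastream, switch a lamp on.
url = latest_observation_url(42)
body = make_task_body(7, {"state": "on"})
```

A client would GET the Observations URL to show live sensor readings at the device's position in VR, and POST the task body to the service's /Tasks collection to actuate the device.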
Keywords (Chinese) ★ Telepresence robot
★ Virtual reality
★ Internet of Things
★ Simultaneous localization and mapping
Keywords (English) ★ Telepresence Robot
★ Virtual Reality
★ IoT
★ SLAM
Table of contents
Abstract (Chinese) i
Abstract ii
Table of contents iii
List of Figures and Illustrations v
List of Tables vii
1.1 Background 1
1.2 Challenges 1
1.3 Relationship between Robotic Telepresence, VR, and IoT 3
1.4 Objectives 5
2. Related work 6
2.1 Telepresence 6
2.1.1 Telepresence via VR 6
2.2 Positioning of telepresence robots 9
2.2.1 Network-based positioning approaches 9
2.2.2 Sensor-based positioning approach 10
2.2.3 Visual-based positioning approach 10
2.3 IoT 11
2.3.1 IoT standards 12
2.3.2 Integration of IoT device and Human Computer Interface technologies 12
2.3.3 Summary 15
3. Methodology 16
3.1 The overall workflow 16
3.2 Hardware and software components of the telepresence system 18
3.2.1 The remote-controlled robot 18
3.2.2 The VR device 18
3.2.3 The 360° camera 19
3.2.4 Unity 21
3.3 Registration of coordinate systems 21
3.3.1 Relationship between coordinate systems 23
3.4 Accessing IoT resources 27
3.4.1 IoT standard 27
3.4.2 Interaction with IoT devices 29
4. Results and Evaluations 31
4.1 The constructed telepresence robot and operations 31
4.1.1 The telepresence robot 31
4.1.2 User’s view 32
4.1.3 Positioning of the robot 33
4.1.4 Positioning of IoT resources 34
4.1.5 Interaction with IoT devices 35
4.2.1 Positioning accuracy of IoT representations 40
4.2.2 Evaluation of video delay in the telepresence system 47
5. Conclusions and Future Work 51
References 52
References
1. Alatise, M. B., & Hancke, G. P. (2017). Pose estimation of a mobile robot based on fusion of IMU data and vision data using an extended Kalman filter. Sensors, 17(10), 2164.
2. Al-Ahmari, A. M., Abidi, M. H., Ahmad, A., & Darmoul, S. (2016). Development of a virtual manufacturing assembly simulation system. Advances in Mechanical Engineering, 8(3), 1687814016639824.
3. Blanco-Novoa, Ó., Fraga-Lamas, P., A Vilar-Montesinos, M., & Fernández-Caramés, T. M. (2020). Creating the Internet of Augmented Things: An Open-Source Framework to Make IoT Devices and Augmented and Mixed Reality Systems Talk to Each Other. Sensors, 20(11), 3328.
4. Campos, C., Elvira, R., Rodríguez, J. J. G., Montiel, J. M., & Tardós, J. D. (2021). ORB-SLAM3: An accurate open-source library for visual, visual-inertial, and multimap SLAM. IEEE Transactions on Robotics, 37(6), 1874-1890.
5. Cho, B. S., Moon, W. S., Seo, W. J., & Baek, K. R. (2011). A dead reckoning localization system for mobile robots using inertial sensors and wheel revolution encoding. Journal of mechanical science and technology, 25, 2907-2917.
6. Davison, A. J., Reid, I. D., Molton, N. D., & Stasse, O. (2007). MonoSLAM: Real-time single camera SLAM. IEEE transactions on pattern analysis and machine intelligence, 29(6), 1052-1067.
7. Du, J., Do, H. M., & Sheng, W. (2021). Human–robot collaborative control in a virtual-reality-based telepresence system. International Journal of Social Robotics, 13, 1295-1306.
8. Durrant-Whyte, H., & Bailey, T. (2006). Simultaneous localization and mapping: part I. IEEE robotics & automation magazine, 13(2), 99-110.
9. Gaemperle, L., Seyid, K., Popovic, V., & Leblebici, Y. (2014). An immersive telepresence system using a real-time omnidirectional camera and a virtual reality head-mounted display. In 2014 IEEE International Symposium on Multimedia (pp. 175-178). IEEE.
10. Goronzy, G., Pelka, M., & Hellbrück, H. (2016, October). QRPos: Indoor positioning system for self-balancing robots based on QR codes. In 2016 International Conference on Indoor Positioning and Indoor Navigation (IPIN) (pp. 1-8). IEEE.
11. Hill, R., Madden, C., Van Den Hengel, A., Detmold, H., & Dick, A. (2009, December). Measuring latency for video surveillance systems. In 2009 digital image computing: techniques and applications (pp. 89-95). IEEE.
12. Huang, C. Y., Chiang, Y. H., & Tsai, F. (2022). An ontology integrating the open standards of city models and Internet of things for smart-city applications. IEEE Internet of Things Journal, 9(20), 20444-20457.
13. Izumihara, A., Uriu, D., Hiyama, A., & Inami, M. (2019). ExLeap: Minimal and highly available telepresence system creating leaping experience. In 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (pp. 1321-1322). IEEE.
14. Isabet, B., Pino, M., Lewis, M., Benveniste, S., & Rigaud, A. S. (2021). Social telepresence robots: a narrative review of experiments involving older adults before and during the COVID-19 pandemic. International Journal of Environmental Research and Public Health, 18(7), 3597.
15. Jo, D., & Kim, G. J. (2016). ARIoT: scalable augmented reality framework for interacting with Internet of Things appliances everywhere. IEEE Transactions on Consumer Electronics, 62(3), 334-340.
16. Ke, S., Xiang, F., Zhang, Z., & Zuo, Y. (2019). An enhanced interaction framework based on VR, AR and MR in digital twin. Procedia CIRP, 83, 753-758.
17. Klein, G., & Murray, D. (2007, November). Parallel tracking and mapping for small AR workspaces. In 2007 6th IEEE and ACM international symposium on mixed and augmented reality (pp. 225-234). IEEE.
18. Krishnan, S., Sharma, P., Guoping, Z., & Woon, O. H. (2007). A UWB based localization system for indoor robot navigation. In 2007 IEEE International Conference on Ultra-Wideband (pp. 77-82). IEEE.
19. Lee, M. K., & Takayama, L. (2011). "Now, I have a body": Uses and social norms for mobile remote presence in the workplace. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 33-42).
20. Lei, M., Clemente, I. M., Liu, H., & Bell, J. (2022). The acceptance of telepresence robots in higher education. International Journal of Social Robotics, 14(4), 1025-1042.
21. Li, M. G., Zhu, H., You, S. Z., & Tang, C. Q. (2020). UWB-based localization system aided with inertial sensor for underground coal mine applications. IEEE Sensors Journal, 20(12), 6652-6669.
22. Liang, S., Huang, C. Y., & Khalafbeigi, T. (2016). OGC SensorThings API Part 1: Sensing, Version 1.0.
23. Liang, S., & Khalafbeigi, T. (2019). OGC SensorThings API Part 2–Tasking Core, Version 1.0.
24. Lipton, J. I., Fay, A. J., & Rus, D. (2017). Baxter's homunculus: Virtual reality spaces for teleoperation in manufacturing. IEEE Robotics and Automation Letters, 3(1), 179-186.
25. Martinez-Hernandez, U., Boorman, L. W., & Prescott, T. J. (2017). Multisensory wearable interface for immersion and telepresence in robotics. IEEE Sensors Journal, 17(8), 2534-2541.
26. Mur-Artal, R., Montiel, J. M. M., & Tardos, J. D. (2015). ORB-SLAM: A versatile and accurate monocular SLAM system. IEEE Transactions on Robotics, 31(5), 1147-1163.
27. Mur-Artal, R., & Tardós, J. D. (2017). ORB-SLAM2: An open-source SLAM system for monocular, stereo, and RGB-D cameras. IEEE Transactions on Robotics, 33(5), 1255-1262.
28. Oh, Y., Parasuraman, R., McGraw, T., & Min, B. C. (2018, March). 360 VR based robot teleoperation interface for virtual tour. In Proceedings of the 1st International Workshop on Virtual, Augmented, and Mixed Reality for HRI (VAM-HRI).
29. Păvăloiu, I. B., Vasilățeanu, A., Popa, R., Scurtu, D., Hang, A., & Goga, N. (2021). Healthcare robotic telepresence. In 2021 13th International Conference on Electronics, Computers and Artificial Intelligence (ECAI) (pp. 1-6). IEEE.
30. Park, J., Cho, Y., & Martinez, D. (2016). A BIM and UWB integrated mobile robot navigation system for indoor position tracking applications. Journal of Construction Engineering and Project Management, 10.6106/JCEPM.2016.6.2.030.
31. Park, Y., Yun, S., & Kim, K. H. (2…
32. Schulzrinne, H., Rao, A., & Lanphier, R. (1998). Real time streaming protocol (RTSP) (No. rfc2326).
33. Sector, I. T. S. (2012). Overview of the Internet of Things. ITU-T: Geneva, Switzerland.
34. Simiscuka, A. A., Markande, T. M., & Muntean, G. M. (2019). Real-virtual world device synchronization in a cloud-enabled social virtual reality IoT network. IEEE Access, 7, 106588-106599.
35. Soldatos, J., Kefalakis, N., Hauswirth, M., Serrano, M., Calbimonte, J. P., Riahi, M., ... & Herzog, R. (2015). OpenIoT: Open source Internet-of-Things in the cloud. In Interoperability and Open-Source Solutions for the Internet of Things: International Workshop, FP7 OpenIoT Project, Held in Conjunction with SoftCOM 2014, Split, Croatia, September 18, 2014, Invited Papers (pp. 13-25). Springer International Publishing.
36. Steuer, J., Biocca, F., & Levy, M. R. (1995). Defining virtual reality: Dimensions determining telepresence. Communication in the age of virtual reality, 33, 37-39.
37. Sun, Y., Armengol-Urpi, A., Kantareddy, S. N. R., Siegel, J., & Sarma, S. (2019). Magichand: Interact with iot devices in augmented reality environment. In 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (pp. 1738-1743). IEEE.
38. Song, Y., Guan, M., Tay, W. P., Law, C. L., & Wen, C. (2019). UWB/LiDAR fusion for cooperative range-only SLAM. In 2019 International Conference on Robotics and Automation (ICRA) (pp. 6568-6574). IEEE.
39. Stotko, P., Krumpen, S., Schwarz, M., Lenz, C., Behnke, S., Klein, R., & Weinmann, M. (2019). A VR system for immersive teleoperation and live exploration with a mobile robot. In 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (pp. 3630-3637). IEEE.
40. Sun, D., Kiselev, A., Liao, Q., Stoyanov, T., & Loutfi, A. (2020). A new mixed-reality-based teleoperation system for telepresence and maneuverability enhancement. IEEE Transactions on Human-Machine Systems, 50(1), 55-67.
41. Uitto, M., & Heikkinen, A. (2022). Evaluating 5G uplink performance in low latency video streaming. In 2022 Joint European Conference on Networks and Communications & 6G Summit (EuCNC/6G Summit) (pp. 393-398). IEEE.
42. Wang, H., Zhang, X., Chen, H., Xu, Y., & Ma, Z. (2021). Inferring end-to-end latency in live videos. IEEE Transactions on Broadcasting, 68(2), 517-529.
43. Yao, L., Wu, Y. W. A., Yao, L., & Liao, Z. Z. (2017). An integrated IMU and UWB sensor based indoor positioning system. In 2017 International Conference on Indoor Positioning and Indoor Navigation (IPIN) (pp. 1-8). IEEE.
44. Youssef, K., Said, S., Al Kork, S., & Beyrouthy, T. (2023). Telepresence in the recent literature with a focus on robotic platforms, applications and challenges. Robotics, 12(4), 111.
45. … (2019). When IoT met augmented reality: Visualizing the source of the wireless signal in AR view. In Proceedings of the 17th Annual International Conference on Mobile Systems, Applications, and Services (pp. 117-129).
46. Zhang, J., Langbehn, E., Krupke, D., Katzakis, N., & Steinicke, F. (2018). A 360 video-based robot platform for telepresent redirected walking. In Proceedings of the 1st International Workshop on Virtual, Augmented, and Mixed Reality for Human-Robot Interactions (VAM-HRI) (pp. 58-62).
47. Zhang, H., Zhang, C., Yang, W., & Chen, C. Y. (2015). Localization and navigation using QR code for mobile robot in indoor environment. In 2015 IEEE International Conference on Robotics and Biomimetics (ROBIO) (pp. 2501-2506). IEEE.
48. oneM2M-TS-0001, oneM2M Functional Architecture Specification v2.10.0, Aug. 2016, http://www.onem2m.org/.
Advisor: Chih-Yuan Huang (黃智遠)    Date of approval: 2024-01-17

For questions about this thesis, please contact the Promotion Services Division, National Central University Library, TEL: (03)422-7151 ext. 57407, or by e-mail. - Privacy Policy Statement