Thesis 108522062: Detailed Record




Name  Ying-Ru Lee (李映儒)    Department  Computer Science and Information Engineering
Thesis Title  User Experience Comparison of Interaction Mode between Gesture and Controller in XRSPACE MANOVA
(比較XRSPACE MANOVA中手勢和控制器互動模式的用戶體驗)
Related Theses
★ Interpreting the Effect of Virtual Reality Distractions on Attention from EEG
★ A Fusion-Based Deep Learning Approach for ADHD Assessment Using a Virtual Classroom Game
★ Assessing Fine Motor Skills of Children with Developmental Delay Using Hierarchical Co-occurrence Networks
★ Tai Chi Master: An Attention Training Game Based on Tai Chi Chuan, Using Motion Recognition and Balance Analysis for Performance Evaluation
★ Clinical Classification of Aging Frailty from Skeleton-Based Gait Using Machine Learning
★ An Interpretable Multimodal Fusion Model for Diagnosing Attention Deficit Hyperactivity Disorder
★ Deep-Learning-Based Virtual Reality Concussion Detection and Fusion Methods
★ Stress Recognition of High-Pressure Driving in Virtual Reality Scenes Using Multiple Physiological Signals
★ A New Medical Device Software Development Process Based on the V-Model, Medical Device Standards, and FDA Guidance: An ADHD Virtual Reality Classroom as a Case Study
Full Text  Not open for viewing in the system (permanently restricted)
Abstract (Chinese)  This study investigates whether users accept a new type of gesture recognition that requires no worn sensors or equipment: the headset camera alone detects the motion and position of the hands, so gestures can be used in the virtual world as intuitively and naturally as using one's hands in real life, exploring the feasibility of blurring the boundary between the virtual and the real. To understand whether users can accept this kind of near-real immersive experience, we compared how this gesture mode and the controller affect users' interactive experience and perception. We conducted 2 (control mode) × 2 (gender), 2 (control mode) × 2 (academic group), and 2 (control mode) × 3 (task completion time) factorial-design experiments with undergraduate and graduate students, investigating how control mode (gesture/controller), gender, academic group, and task completion time affect the user experience of XRSPACE MANOVA. We assessed users' subjective perceptions and used their feedback to corroborate the results. Paired-sample t-tests, one-way ANOVA, Wilcoxon signed-rank tests, and Kruskal-Wallis tests were used to analyze the main effects and interactions in the gesture/controller user-experience evaluation. The results show that after experiencing XRSPACE MANOVA, users' acceptance leaned toward the controller rather than gesture recognition. When asked afterwards why they disliked gesture recognition, users reported serious flaws in its detection; many users indicated that if these flaws were fixed, their acceptance of gestures would be high.
Abstract (English)  The purpose of this research is to understand whether users accept a new gesture recognition whose advantage is that no additional sensor or controller is needed; the headset camera alone detects the gestures. The aim is to explore the feasibility of blurring the line between the real and the virtual, so that operating in the virtual world feels as natural as acting in real life. This kind of gesture interaction could change people's lifestyles in the future and may even replace traditional controllers, for example when playing basketball or greeting and interacting with other people. We therefore explored how gestures and controllers affect the user's interactive experience and perception. We examined the effects of different control modes (gesture/controller), gender, major, and task completion time on the user experience of XRSPACE MANOVA, conducted a subjective perception assessment of users, and provide empirical evidence of the user experience. Paired-sample t-tests, one-way ANOVA, Wilcoxon signed-rank tests, and Kruskal-Wallis tests were used to analyze the main effects and interactions in the gesture/controller user-experience evaluation. The results show that short-term use of gestures performed well in Perceptions of Internal Control and Behavioral Intention to Use and, although the differences were not significant, also received good evaluations in Perceived Enjoyment (PE), Satisfaction (SAT), and Confirmation (CON), meaning that users have positive enjoyment of and expectations for this novel technology. The controller received good subjective assessments in Perceived Usefulness (PU), Perceived Ease of Use (PEOU), Attitude to Use (ATU), and Perceptions of Internal Control (POIC), meaning that in terms of operating performance users are more accepting of the controller. In the experiment we identified several negative aspects of the gesture mode (which are also advantages of the controller) that cause long-term use to lower the evaluation of gestures, and we discuss how to transfer the controller's advantages to gestures so that users can operate functions as well with gestures as with the controller, which may raise the acceptability of gestures.
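The abstracts name four statistical procedures: the paired-sample t-test, one-way ANOVA, the Wilcoxon signed-rank test, and the Kruskal-Wallis test. As a minimal, purely illustrative sketch (not the thesis's actual analysis script), the Python snippet below shows how such tests could be run on questionnaire scores with SciPy; the arrays named gesture and controller and the task-completion-time grouping are hypothetical placeholders.

# Illustrative sketch of the statistical tests named in the abstract.
# All data here are hypothetical placeholders, not the thesis data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 30  # hypothetical number of participants

# Hypothetical per-participant questionnaire means (7-point Likert scale)
gesture = rng.uniform(3, 6, n)
controller = rng.uniform(4, 7, n)

# Within-subject comparison of the two control modes
t_stat, t_p = stats.ttest_rel(controller, gesture)   # paired-sample t-test
w_stat, w_p = stats.wilcoxon(controller, gesture)    # Wilcoxon signed-rank test

# Between-group comparison across hypothetical task-completion-time groups
groups = rng.integers(0, 3, n)  # 0: under 15 min, 1: 15-20 min, 2: over 20 min
samples = [gesture[groups == g] for g in range(3)]
f_stat, f_p = stats.f_oneway(*samples)               # one-way ANOVA
h_stat, h_p = stats.kruskal(*samples)                # Kruskal-Wallis test

print(f"paired t-test:        t={t_stat:.2f}, p={t_p:.3f}")
print(f"Wilcoxon signed-rank: W={w_stat:.2f}, p={w_p:.3f}")
print(f"one-way ANOVA:        F={f_stat:.2f}, p={f_p:.3f}")
print(f"Kruskal-Wallis:       H={h_stat:.2f}, p={h_p:.3f}")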
Keywords (Chinese)  ★ immersion (沉浸式)
★ virtual reality (虛擬實境)
★ interactivity (互動性)
★ acceptance (接受度)
Keywords (English)  ★ immersion
★ Virtual Reality (VR)
★ interactivity
★ acceptance
Table of Contents
Chinese Abstract............................I
Abstract...................................II
Acknowledgements...........................IV
Table of Contents...........................V
List of Figures...........................VIII
List of Tables..............................X
1. Introduction........................1
2. Related Works.......................3
2.1 Gesture recognition compared with controllers..3
2.2 Experience evaluation...................4
3. Research Hypotheses.................5
4. Method..............................7
4.1 XRSPACE MANOVA Device and Operation.....7
4.1.1 Gesture.............................8
4.1.2 Controller............................8
4.2 Task Design.............................10
4.3 Experimental Design.....................11
4.4 Data Collection and Measurement of Variables ............................................12
4.5 Questionnaire design....................12
4.6 Number of Participants and Experiment Process...........13
5. Result..............................15
5.1 Perceived Usefulness (PU)...........15
5.2 Perceived Ease of Use (PEOU)........16
5.3 Attitude to Use (ATU)...............16
5.4 Perceptions of Internal Control (POIC)......................................17
5.5 Perceived Enjoyment (PE)............18
5.6 Confirmation (CON)..................18
5.7 Satisfaction (SAT)..................19
5.8 Behavioral Intention to Use (BIU)...19
6. Discussion..........................21
6.1 Feedback on subjective perception.......................21
6.2 Negative aspects of the gesture mode and how to transfer the controller's advantages to gestures....26
6.2.1 Inconsistency between designers' and users' understanding of gesture postures.................26
6.2.2 Limitation that the hand must stay within the headset camera's field of view to be detected.......27
6.2.3 Problems with the gesture aiming function........28
7. Conclusion..........................28
Reference...................................31
List of Figures
Fig. 1 Using the controller (left) and gestures (right) to operate the system..........7
Fig. 2 Basic gesture, Adapted from [29]…………………………………………8
Fig. 3 Controller button function, Adapted from [30]............................................9
Fig. 4 Controller basic button function, Adapted from [31].................................. 9
Fig. 5 Experiment process....................................................................................14
Fig. 6 (a) PU evaluation (b) PEOU evaluation (c) ATU evaluation (d) ATU evaluation by gesture task completion time (e) POIC evaluation (f) POIC evaluation by gesture task completion time (g) PE evaluation (h) CON evaluation (i) SAT evaluation (j) BIU evaluation for the controller by gesture task completion time (k) BIU evaluation for gestures by gesture task completion time (asterisks denote *p<0.05, **p<0.01, ***p<0.001).......................20
Fig. 7 PU and PEOU user feedback of gesture (above) and controller (below)....22
Fig. 8 ATU user feedback of gesture (above) and controller (below)..................22
Fig. 9 POIC user feedback of gesture (above), where red marks users who completed the task within 15 minutes and green marks users who completed it in 15-20 minutes; POIC user feedback of controller (below)...........................23
Fig. 10 PE user feedback of gesture (above) and controller (below)...................24
Fig. 11 SAT and CON user feedback of gesture..................................................25
Fig. 12 SAT and CON user feedback of gesture and controller............................25
List of Tables
Table 1. Group factorial design experiment of gender and gesture/controller.... 11
Table 2. Group factorial design experiment of major and gesture/controller.... 11
Table 3. Group factorial design experiment of complete task time and gesture/controller................................................................................................. 12
Table 4. Merged question-measurement table for both control modes, showing the reliability coefficient (Cronbach's alpha) for the controller (left) and for the gesture mode (right); a computational sketch follows this list...................13
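Table 4 reports reliability as Cronbach's alpha. The following is a minimal sketch of how this coefficient could be computed from a respondents-by-items score matrix, assuming hypothetical Likert-scale data rather than the thesis's actual questionnaire responses.

# Minimal sketch of Cronbach's alpha on a hypothetical score matrix
# (rows = respondents, columns = questionnaire items); not the thesis data.
import numpy as np

def cronbach_alpha(scores):
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 7-point Likert responses: 30 respondents x 4 items
rng = np.random.default_rng(1)
base = rng.integers(3, 7, size=(30, 1))
scores = np.clip(base + rng.integers(-1, 2, size=(30, 4)), 1, 7).astype(float)
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")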

References
[1] C. George, P. Tamunjoh and H. Hussmann, "Invisible Boundaries for VR: Auditory and Haptic Signals as Indicators for Real World Boundaries," in IEEE Transactions on Visualization and Computer Graphics, vol. 26, no. 12, pp. 3414-3422, Dec. 2020, doi: 10.1109/TVCG.2020.3023607.
[2] D. J. Chalmers, "The Virtual and the Real," Disputatio, vol. 9, no. 46, pp. 309-352, 2018.
[3] Roberta Cozza, Anthony Mullen, Annette Jump and Tuong Nguyen, Predicts 2018: Immersive Technologies and Devices Will Transform Personal and Business Interactions, Dec 2017.
[4] D. Zhao, Y. Liu, Y. Wang and T. Liu, "Analyzing the Usability of Gesture Interaction in Virtual Driving System," 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 2019, pp. 1277-1278, doi: 10.1109/VR.2019.8797713.
[5] M. Meier, P. Streli, A. Fender and C. Holz, "Demonstrating the Use of Rapid Touch Interaction in Virtual Reality for Prolonged Interaction in Productivity Scenarios," 2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 2021, pp. 761-762, doi: 10.1109/VRW52623.2021.00263.
[6] J. Schioppo, Z. Meyer, D. Fabiano and S. Canavan, "Sign Language Recognition in Virtual Reality," 2020 15th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2020), 2020, pp. 917-917, doi: 10.1109/FG47880.2020.00027.
[7] C.-C. Tsai, C.-C. Kuo and Y.-L. Chen, "3D Hand Gesture Recognition for Drone Control in Unity*," 2020 IEEE 16th International Conference on Automation Science and Engineering (CASE), 2020, pp. 985-988, doi: 10.1109/CASE48305.2020.9216807.
[8] F. D. Davis, "User acceptance of information technology: System characteristics, user perceptions, and behavioral impacts," International Journal of Man-Machine Studies, vol. 38, no. 3, pp. 475-487, 1993.
[9] Shanthakumar, V.A., Peng, C., Hansberger, J. et al. Design and evaluation of a hand gesture recognition approach for real-time interactions. Multimed Tools Appl 79, 17707–17730 (2020).
[10] J. Voigt-Antons, T. Kojic, D. Ali and S. Möller, "Influence of Hand Tracking as a Way of Interaction in Virtual Reality on User Experience," 2020 Twelfth International Conference on Quality of Multimedia Experience (QoMEX), 2020, pp. 1-4, doi: 10.1109/QoMEX48832.2020.9123085.
[11] Y. Li, J. Huang, F. Tian, H.-A. Wang and G.-Z. Dai, "Gesture interaction in virtual reality," Virtual Reality & Intelligent Hardware, vol. 1, no. 1, pp. 84-112, 2019, ISSN 2096-5796, doi: 10.3724/SP.J.2096-5796.2018.0006.
[12] F. Fahmi et al., "Comparison study of user experience between virtual reality controllers, Leap Motion controllers, and Senso glove for anatomy learning systems in a virtual reality environment," IOP Conference Series: Materials Science and Engineering, IOP Publishing, 2020, p. 012024.
[13] D. Navarro and V. Sundstedt, "Evaluating player performance and experience in virtual reality game interactions using the HTC Vive controller and Leap Motion sensor," in VISIGRAPP (2: HUCAPP), 2019, pp. 103-110.
[14] A. Onishi, S. Nishiguchi, Y. Mizutani and W. Hashimoto, "A Study of Usability Improvement in Immersive VR Programming Environment," 2019 International Conference on Cyberworlds (CW), 2019, pp. 384-386, doi: 10.1109/CW.2019.00073.
[15] M. E. Latoschik, F. Kern, J. -P. Stauffert, A. Bartl, M. Botsch and J. -L. Lugrin, "Not Alone Here?! Scalability and User Experience of Embodied Ambient Crowds in Distributed Social Virtual Reality," in IEEE Transactions on Visualization and Computer Graphics, vol. 25, no. 5, pp. 2134-2144, May 2019, doi: 10.1109/TVCG.2019.2899250.
[16] L. Lin et al., "The Effect of Hand Size and Interaction Modality on the Virtual Hand Illusion," 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 2019, pp. 510-518, doi: 10.1109/VR.2019.8797787.
[17] S. Kim, A. Jing, H. Park, G. A. Lee, W. Huang and M. Billinghurst, "Hand-in-Air (HiA) and Hand-on-Target (HoT) Style Gesture Cues for Mixed Reality Collaboration," in IEEE Access, vol. 8, pp. 224145-224161, 2020, doi: 10.1109/ACCESS.2020.3043783.
[18] C. Chang, J. Heo, S. Yeh, H. Han and M. Li, "The Effects of Immersion and Interactivity on College Students’ Acceptance of a Novel VR-Supported Educational Technology for Mental Rotation," in IEEE Access, vol. 6, pp. 66590-66599, 2018, doi: 10.1109/ACCESS.2018.2878270.
[19] F. D. Davis, R. P. Bagozzi, and P. R. Warshaw, ‘‘User acceptance of computer technology: A comparison of two theoretical models,’’ Manage. Sci., vol. 35, pp. 982–1003, Aug. 1989.
[20] I. Ajzen, ‘‘The theory of planned behavior,’’ Org. Behav. Hum. Decis. Process., vol. 50, no. 2, pp. 179–211, 1991.
[21] A. Isil, ‘‘Computer self-efficacy, computer anxiety, performance and personal outcomes of Turkish physical education teachers,’’ Educ. Res. Rev., vol. 10, no. 3, pp. 328–337, 2015
[22] R. P. Bagozzi, F. D. Davis, and P. R. Warshaw, "Extrinsic and intrinsic motivation to use computers in the workplace," J. Appl. Soc. Psychol., vol. 22, no. 14, pp. 1111–1132, Jul. 1992.
[23] S.-C. Yeh, J.-L. Wang, C.-Y. Wang, P.-H. Lin, G.-D. Chen, and A. Rizzo, ‘‘Motion controllers for learners to manipulate and interact with 3D objects for mental rotation training,’’ Brit. J. Educ. Technol., vol. 45, no. 4, pp. 666–675, 2014.
[24] A. Bhattacherjee, ‘‘Understanding information systems continuance: An expectation-confirmation model,’’ MIS Quart., vol. 25, no. 3, pp. 351–370, 2001.
[25] A. C. Zapf, L. A. Glindemann, K. Vogeley, and C. M. Falter, ‘‘Sex differences in mental rotation and how they add to the understanding of autism,’’ PLoS ONE, vol. 10, no. 4, p. e0124628, 2015
[26] H. A. Pham, "The challenge of hand gesture interaction in the virtual reality environment: Evaluation of in-air hand gestures using the Leap Motion Controller," 2018.
[27] Petri, K., Feuerstein, K., Folster, S., Bariszlovich, F., & Witte, K. (2020). Effects of Age, Gender, Familiarity with the Content, and Exposure Time on Cybersickness in Immersive Head-mounted Display Based Virtual Reality. American Journal of Biomedical Sciences, 12(2).
[28] C.-W. Chang et al., "Examining the effects of HMDs/FSDs and gender differences on cognitive processing ability and user experience of the Stroop task-embedded virtual reality driving system (STEVRDS)," IEEE Access, vol. 8, pp. 69566-69578, 2020.
[29] Basic gesture, from XRSPACE, accessed 30 May 2021, https://support.xrspace.io/hc/zh-tw
[30] Controller button function, from XRSPACE, accessed 30 May 2021, https://support.xrspace.io/hc/zh-tw
[31] Controller basic button function, from XRSPACE, accessed 30 May 2021, https://support.xrspace.io/hc/zh-tw
[32] Munsinger, Brita, and John Quarles. "Augmented reality for children in a confirmation task: Time, fatigue, and usability." 25th ACM Symposium on Virtual Reality Software and Technology. 2019.
[33] Kharoub, Hind, Mohammed Lataifeh, and Naveed Ahmed. "3d user interface design and usability for immersive vr." Applied Sciences 9.22 (2019): 4861.
[34] Voigt-Antons, Jan-Niklas, et al. "Influence of hand tracking as a way of interaction in virtual reality on user experience." 2020 Twelfth International Conference on Quality of Multimedia Experience (QoMEX). IEEE, 2020.
[35] Yang, L. I., et al. "Gesture interaction in virtual reality." Virtual Reality & Intelligent Hardware 1.1 (2019): 84-112.
[36] Tanjung, K., et al. "The use of virtual reality controllers and comparison between Vive, Leap Motion and Senso gloves applied in the anatomy learning system." Journal of Physics: Conference Series. Vol. 1542. No. 1. IOP Publishing, 2020.
Advisors  Shih-Ching Yeh (葉士青), Eric Hsiao-Kuang Wu (吳曉光)    Date of Approval  2021-08-16