Abstract (English) |
Today, people need new ways to interact directly with devices such as computers, smartphones, and notebooks. Researchers are constantly studying how to make it easier for users to communicate with these devices, and users hope to carry out common tasks on them, such as music editing, Internet browsing, gaming, or video editing. Because I found that using only a mouse and keyboard is not intuitive, I want to interact with devices more naturally.
In this paper, I propose a virtual 3D plucked-instrument performance system. The user only needs to sit in front of a table and raise a hand toward the camera; the system loads the Leap hand model, detects the user's hand once it starts executing, and displays it in Unity 3D. The user can then interact with a plucked-instrument model that I designed myself. I use the Leap Motion as the input device; compared with other devices, it focuses more on the accuracy of the fingers. I develop the relevant programs and interface in Unity 3D.
For performance, I need to design the placement of the 3D plucked instrument so that it is easier to operate. For playing the virtual plucked instrument with the fingertips, I implement a finger-data collection system whose goal is to determine the range of stable finger motion. The user can therefore avoid unstable gestures when practicing on the virtual plucked instrument.
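The abstract does not give the algorithm behind the finger-data collection system. As a minimal illustrative sketch (all function names here are hypothetical, not taken from the thesis), the stable range of one axis of fingertip motion could be estimated from recorded samples as the mean plus or minus a few standard deviations, and later gestures flagged when they fall outside that band:

```python
import statistics

def stable_range(samples, k=2.0):
    """Estimate the stable motion range of one fingertip axis from
    recorded positions, as mean +/- k population standard deviations."""
    mean = statistics.fmean(samples)
    sd = statistics.pstdev(samples)
    return (mean - k * sd, mean + k * sd)

def is_stable(value, bounds):
    """True if a new fingertip position lies inside the stable band."""
    lo, hi = bounds
    return lo <= value <= hi

# Example: recorded y-positions (mm) of an index fingertip while plucking.
ys = [182.0, 185.5, 184.2, 183.1, 186.0, 184.8, 183.9, 185.1]
bounds = stable_range(ys)
print(is_stable(184.0, bounds))   # inside the stable band
print(is_stable(150.0, bounds))   # far outside: an unstable gesture
```

In a live system the same check would run on each Leap Motion frame, warning the user when a practice gesture drifts outside the collected stable range.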
Based on the physical harp and the guqin of earlier eras, I propose a new kind of virtual instrument. The strings are designed according to the physical harp and guqin; each string is preset to its natural pitch, and different audio files are used to change the pitch. For volume control, I divide each string into five regions and assign each a different volume; in addition, the volume of each region can be changed. The limitation of the virtual instrument is playing speed: although it cannot play songs with a high BPM, for slow songs it is stable and could be used in a professional music performance. |
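The abstract only names the five-region volume scheme; a minimal sketch of one way to realize it (class and method names are my own illustration, not the thesis implementation) is to split a string's length into five equal regions, give each region an adjustable volume, and map each pluck position to its region's volume:

```python
class VirtualString:
    """One virtual string divided into five equal-length regions,
    each region holding its own adjustable volume (0.0 - 1.0)."""

    REGIONS = 5

    def __init__(self, length, volumes=None):
        self.length = length
        # Assumed default: volume rises toward one end of the string.
        self.volumes = list(volumes) if volumes else [0.2, 0.4, 0.6, 0.8, 1.0]

    def region_of(self, pluck_pos):
        """Map a pluck position along the string to a region index 0-4."""
        idx = int(pluck_pos / self.length * self.REGIONS)
        return min(max(idx, 0), self.REGIONS - 1)

    def volume_at(self, pluck_pos):
        """Volume that a pluck at this position would sound with."""
        return self.volumes[self.region_of(pluck_pos)]

    def set_region_volume(self, region, volume):
        """Each region's volume can be changed independently."""
        self.volumes[region] = volume

s = VirtualString(length=50.0)
print(s.volume_at(5.0))    # position in region 0 -> quietest region
print(s.volume_at(49.0))   # position in region 4 -> loudest region
```

In Unity the resulting value would simply scale the playback volume of the audio clip chosen for that string's pitch.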
References |
[1]
Peggy Wright, Diane Moser-Wooley, and Bruce Wooley, "Techniques & Tools for Using Color in Computer Interface Design," ACM Crossroads, Special Issue on Human-Computer Interaction, Mar. 1997, pp. 3-6.
[2]
Jihyun Han and Nicolas Gold, "Lessons Learned in Exploring the Leap Motion Sensor for Gesture-based Instrument Design," Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), 2014.
[3]
Ching-Hua Chuan, Eric Regina, and Caroline Guardino, "American Sign Language Recognition Using Leap Motion Sensor," 2014 13th International Conference on Machine Learning and Applications (ICMLA), IEEE, pp. 541-544.
[4]
Daniel Plemmons and David Holz, "Creating Next-Gen 3D Interactive Apps with Motion Control and Unity3D," Proceedings of ACM SIGGRAPH 2014 Studio, Article No. 24.
[5]
Wonsun Lee, Chulhee Lee, Sungin Hong, and Seongah Chin, "Dementia-Prevention Serious Game Techniques Using Finger Motion, Object Deformation, and Particle Mapping with Leap-Motion," Proceedings of the International Conference on Advances in Computing, Electronics and Electrical Technology (CEET), 2014.
[6]
Pongphan Pongpanitanont and Warakorn Charoensuk, "Leap Motion Signal Preservation and Medical Training System," 2014 IEEE Biomedical Engineering International Conference (BMEiCON), pp. 1-4.
[7]
Leigh Ellen Potter, Jake Araullo, and Lewis Carter, "The Leap Motion Controller: A View on Sign Language," Proceedings of the 25th Australian Computer-Human Interaction Conference: Augmentation, Application, Innovation, Collaboration, pp. 175-178.
[8]
Joanna C. Coelho and Fons J. Verbeek, "Pointing Task Evaluation of Leap Motion Controller in 3D Virtual Environment," Proceedings of the CHI Sparks 2014 Conference, Apr. 2014, pp. 78-85.
[9]
Leigh Ellen Potter, Jake Araullo, and Lewis Carter, "The Leap Motion Controller: A View on Sign Language," Proceedings of the 25th Australian Computer-Human Interaction Conference (OzCHI '13), pp. 175-178.
[10]
Hongzhe Liu, Yulong Xi, Wei Song, Kyhyun Um, and Kyungeun Cho, "Gesture-based NUI Application for Real-time Path Modification," 2013 IEEE 11th International Conference on Dependable, Autonomic and Secure Computing, pp. 446-449.
[11]
P. Chophuk, S. Chumpen, S. Tungjitkusolmun, and P. Phasukkit, "Hand Postures for Evaluating Trigger Finger Using Leap Motion Controller," 2015 Biomedical Engineering International Conference (BMEiCON-2015), pp. 1-4.
[12]
Giulio Marin, Fabio Dominio, and Pietro Zanuttigh, "Hand Gesture Recognition with Leap Motion and Kinect Devices," 2014 IEEE International Conference on Image Processing (ICIP), pp. 1565-1569.
[13]
Liberios Vokorokos, Juraj Mihalov, and Lubor Lescisin, "Possibilities of Depth Cameras and Ultra Wide Band Sensor," 2016 IEEE 14th International Symposium on Applied Machine Intelligence and Informatics (SAMI), pp. 57-61.
[14]
Imad Afyouni, Faizan Ur Rehman, Ahmad Qamar, Akhlaq Ahmad, Mohamed Abdur Rahman, and Saleh Basalamah, "A GIS-based Serious Game Recommender for Online Physical Therapy," Proceedings of the Third ACM SIGSPATIAL International Workshop on the Use of GIS in Public Health (HealthGIS '14), pp. 1-10.
[15]
Arun Kulshreshth and Joseph J. LaViola, "Exploring the Usefulness of Finger-based 3D Gesture Menu Selection," Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '14), pp. 1093-1102.
[16]
Nur Atiqah Sia Abdullah, Nur Ida Aniza Rusli, and Mohd Faisal Ibrahim, "Mobile Game Size Estimation: COSMIC FSM Rules, UML Mapping Model and Unity3D Game Engine," 2014 IEEE Conference on Open Systems (ICOS), pp. 42-47.
[17]
Joyce Horn Fonteles, Edimo Sousa Silva, and Maria Andréia Formico Rodrigues, "Gesture-Driven Interaction Using the Leap Motion to Conduct a 3D Particle System: Evaluation and Analysis of an Orchestral Performance," SBC Journal on Interactive Systems, vol. 6, no. 2, 2015.
[18]
Nandasiri, K.G.M.P., Nawarathna, N.H.C.E.M., Mohamad, M.M.R., Herath, H.M.C.K., Kasthuriarachchi, K.T.S., and Wijendra, D., "Advance Technology for Kids to Improve Knowledge and Skills Using Motion Gesture Recognition - Leap Mania," NCTM - SLIIT, December 2014. |