dc.description.abstract | With the advent of new technologies, the interaction between people and computers has become increasingly inseparable. Human-computer interaction (HCI) technology improves operations in people's daily lives. In the past, users typically needed a mouse and keyboard to operate a system. In recent years, to make this technology more intuitive and natural to operate, human hand movements, voice, and fingerprints have been adopted as computer input. To bring human-computer interaction closer to reality, AR/VR has been developed to connect virtual applications with the real world. Gesture recognition is a basic operation and one of the hot research topics in this area. Gestures are an intuitive and easy-to-learn interaction method in which users employ their hands directly as an input device for a computer. In this paper, we develop a human-computer interactive gesture recognition system for virtual theaters using the wearable Myo armband. In traditional drama, staff control the stage objects behind the scenes. We hope that this system can be free from the restrictions of space, lighting, and occlusion, allowing performers to manipulate stage objects directly and, ultimately, providing a new way to combine technology and art. The proposed system uses a deep learning method to classify dynamic gestures and then sends instructions to the virtual theater. In the design of the virtual stage, we use Maya to build 3D models and create different scenes with the development kit provided by Unity. The recognition results are transmitted to Unity through a TCP/IP socket. With this system, users can easily control the scenes and objects in the theater developed in Unity, making the virtual stage more enriched during the performance. | en_US |