With the advance of technology, interaction between people and computers has become increasingly inseparable from daily life, and human-computer interaction (HCI) technology has made everyday operations more convenient. In the past, users typically operated a system with only a mouse and keyboard; in recent years, to make interaction more intuitive and natural, inputs based on human hand movements, fingerprints, and voice have been developed as computer commands. To bring the technology even closer to reality, AR/VR has emerged to connect virtual applications with the real world. Gesture-based HCI systems are among the most active research topics: gestures are an intuitive and easy-to-learn means of interaction that use the human hand directly as a computer input device.

In this paper, we develop a gesture recognition system for human-computer interaction based on the wearable Myo armband and apply it to a virtual theater. In traditional stage plays, the stage is usually controlled by staff behind the scenes. With the proposed system, we hope performances can be freed from the restrictions of venue, lighting, and occlusion, allowing performers to manipulate stage objects directly and combining art with technology. The system uses the electromyographic (EMG) signals of the hand as static gestures and the three-axis motion of the arm as dynamic gestures. Wearing the device, the performer issues commands to the computer: the gesture trajectory data is transmitted to the computer over Bluetooth in real time, and a deep learning method classifies the current gesture.

For the virtual stage, we build 3D models in Maya and create different scenes with the development kit provided by Unity, and the gesture recognition results are transmitted to Unity through a TCP/IP Socket. With this system, users can easily control the scenes and objects in Unity and enrich the stage effects during a performance.
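To make the recognition-to-stage pipeline concrete, the following is a minimal Python sketch of the TCP/IP Socket bridge described above. The host, port, and message format are illustrative assumptions rather than the thesis implementation: it assumes a Unity-side TCP listener on localhost:5005 that expects one newline-terminated gesture label per message.

import socket

# Hypothetical endpoint of the Unity scene controller; the real
# address, port, and wire format are not specified in the abstract.
UNITY_HOST = "127.0.0.1"
UNITY_PORT = 5005

def send_gesture(label: str) -> None:
    """Forward one classified gesture label to the Unity listener."""
    with socket.create_connection((UNITY_HOST, UNITY_PORT), timeout=1.0) as conn:
        # One newline-terminated label per message (assumed protocol).
        conn.sendall((label + "\n").encode("utf-8"))

if __name__ == "__main__":
    # e.g. the deep-learning classifier has just labeled the current
    # EMG/motion window as "wave_left"; push it to the virtual stage.
    send_gesture("wave_left")

In practice a persistent connection would likely be kept open for the duration of a performance rather than reconnecting per gesture; the per-call connection here is only for brevity.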