This paper proposes a real-time virtual clothing tracking and projection system, intended as a new projection application that broadens the possibilities of stage performance while saving performers the cost of costumes, the inconvenience of carrying them, and the time needed to change. Combining projection technology with performance makes it possible to present new styles, reduce high stage expenses, and produce costume effects easily and in real time. We use a Kinect as the sole sensor for human detection: it tracks the user and detects their skeleton, a fast and efficient coordinate transformation maps virtual 3D coordinates into the 3D coordinates of the projection space, and the skeleton is linked to the virtual clothing. For each body part, the rotation angle of the child joint is computed from its kinematics, so that natural joint poses are simulated and the virtual clothing follows and conforms to the user. The system provides a variety of costumes and different interaction conditions, such as raising a hand or using specific gestures to change costumes or trigger stage effects in time, giving users ample possibilities in a wide range of performances. The results met expectations, a demonstration video was produced, and the system is simple and convenient to use.

Human-computer interaction has become a particularly popular topic in recent years, and its application in Mixed Reality is a technology that will be inseparable from our lives in the future. Users want to interact with computers more naturally, so Human-Computer Interaction (HCI) has become an important issue in computer science, and VR is a priority for future development. HCI makes it easier for people to communicate with computers; the goal is to present as rich an interactive experience as possible.
In many performances, costumes are often seen as one of the important factors affecting the quality of the show, so many productions require numerous costumes to be prepared and frequently changed between acts. Facing the audience, actors rely on the outline of the story and add their own creative improvisation to complete a perfect performance. Recently, with the professionalization of the stage, demand has grown for performance applications combined with projection-mapping light effects, but the application of light has always been limited. To stage a show, one must set up the stage and prepare multiple costumes, which is very troublesome for the performers. Young people like fashionable things and hope to change styles quickly. Therefore, this paper extends the application to other levels for the brilliance and fluency of performance. If clothes must be changed for every show, then changing costumes during the performance and even carrying them are both problems. We aim to improve on these problems and increase performance variability.
In this paper, we propose a method that lets users project virtual clothing onto themselves using a computer and a projector. A Kinect captures the user\textquoteright{}s body and skeleton, along with the position and orientation of each joint. The skeleton is mapped to each part of the virtual garment, and the rotation angle of each child joint is calculated by kinematics to simulate the natural posture of the articulated joints, so that the virtual garment follows and conforms to the user. We implement a fast and efficient three-step coordinate transformation from camera coordinates to real-world three-dimensional coordinates in order to project and control virtual clothing in real time. Users can choose different costumes in our system, and anyone can easily wear virtual costumes during a show.
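The two computations described above — mapping a tracked joint from the camera's coordinate frame into the projection space, and deriving a child joint's rotation angle from the skeleton — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the $4\times 4$ calibration matrix `T`, the function names, and the example joint positions are all hypothetical, and the real system performs its transformation in three calibrated steps rather than a single matrix multiply.

```python
import numpy as np

def camera_to_projection(point_cam, T):
    """Map a 3D point from Kinect camera coordinates into the
    projection space using a 4x4 calibration matrix T (assumed
    obtained from a prior camera-projector calibration)."""
    p = T @ np.append(point_cam, 1.0)   # homogeneous coordinates
    return p[:3] / p[3]

def joint_rotation_angle(parent, joint, child):
    """Rotation angle (radians) at `joint`, measured between the
    bone vectors joint->parent and joint->child."""
    u = parent - joint
    v = child - joint
    cos_a = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos_a, -1.0, 1.0))

# Example: identity calibration and an elbow bent at a right angle.
T = np.eye(4)
shoulder = np.array([0.0, 1.4, 2.0])
elbow    = np.array([0.3, 1.4, 2.0])
wrist    = np.array([0.3, 1.1, 2.0])

print(camera_to_projection(elbow, T))                 # [0.3 1.4 2. ]
print(joint_rotation_angle(shoulder, elbow, wrist))   # pi/2 (90 degrees)
```

In practice each tracked joint would be transformed this way every frame, and the resulting angles drive the corresponding segments of the virtual garment so that it follows the user's pose.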