dc.description.abstract | A traditional war simulation, or war game, uses a game table, a map, and pieces representing different forces to build a combat simulation environment: staff officers move the pieces around the table to deduce, analyze, and record the changes resulting from simulated military strategies for tactical planning purposes. Because conventional war games are very time consuming and their operating environments have many limitations, computer-based war games have now largely replaced them. Although computer-based war games outperform traditional ones, they require staff members with the expertise to operate the war game systems, and they lack intuitive physical actions during the deduction of military operations.
This thesis integrates a Microsoft Kinect 3D depth camera with a projector to create a mixed-reality-based interactive war simulation (war game) platform. The platform offers many appealing characteristics, such as intuitive embodied control, large-scale touch screen functionality, and real-time provision of geographic map information, to meet the requirements of different military exercises. The proposed system automatically records the entire war game deduction procedure and then replays it as an animation at the end of the war game assignment; the recorded historical data serve as the basis for reviewing the military exercises and the troop-deployment deductions. In this environment, the proposed interactive platform allows users to accomplish a military exercise deduction through embodied control. In addition, we use the Kinect depth camera as a touch sensor so that the game table itself serves as a large-scale touch panel. This arrangement not only avoids the inconvenience of relocating a large-scale touch panel but also achieves the development goal on a small budget.
Finally, several experiments were designed to evaluate the functionalities of the proposed mixed-reality-based war simulation platform. In the touch experiments, the instruction recognition rate is 97.1%; in the experiment with objects at different angles, the correct rate is 97.42%; and in the shift experiments, the correct rate is 87.49%. In addition, we evaluated the system with the System Usability Scale, obtaining a score of 77.14. | en_US |