dc.description.abstract | In recent years, the development of virtual reality (VR) has become a focal point of public
attention. As more VR products and applications have emerged, the performance of VR
devices has steadily improved and their cost has dropped significantly, making them
increasingly common consumer devices. VR provides an immersive experience that
not only offers users visual enjoyment but also enables modes of interaction
that differ from traditional input methods. Through VR technology, users can engage
in various activities in virtual environments, such as gaming, meetings, education, and healthcare.
The significance of VR technology continues to increase, and its applications
in the virtual instrument field, such as virtual pianos and virtual jazz drums,
are becoming increasingly widespread. The emergence of these virtual instruments
not only allows users to experience the pleasure of playing instruments
in virtual environments but also lowers the barrier to learning different instruments.
With a VR device, users can enjoy playing instruments anytime and anywhere,
free from constraints of venue, equipment, and technical expertise.
Consequently, virtual concerts, including virtual spatial audio simulations and 3D
reconstructions of historical performances, have gained more attention.
However, previous studies of virtual guitars were mostly conducted in non-VR environments
and focused primarily on recognizing air guitar chords;
systematic research on virtual air guitar systems within VR has been lacking.
Moreover, current commercial virtual guitar games recognize hand gestures
with limited accuracy: they can detect only finger bending and simple strumming actions,
and cannot accurately identify chords or varied strumming techniques.
Therefore, in this study, we propose a virtual air guitar system that allows users
to play guitar using only a VR device. By leveraging the recognition capabilities
of deep learning models and the visual feedback advantages of VR, our system can recognize
up to 30 different chords and implement various strumming techniques using a joystick device.
Furthermore, we apply a black-box approach by combining WaveNet and FiLM to simulate
the effects of an electric guitar pedal at different knob settings. Additionally,
we introduce a Knob Difference Loss to improve the accuracy of the simulated effects.
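The conditioning idea behind this black-box approach can be sketched in minimal form. The abstract does not give exact formulations, so both functions below are illustrative assumptions: FiLM scales and shifts intermediate features using parameters derived from the knob settings, and the hypothetical loss penalizes errors in the *difference* between outputs at two knob settings, rather than the outputs alone.

```python
# Illustrative sketch only; the thesis' actual FiLM layers and Knob
# Difference Loss may be formulated differently.

def film(features, gamma, beta):
    """Feature-wise Linear Modulation: scale and shift each feature,
    where gamma/beta would be predicted from the pedal's knob settings."""
    return [g * h + b for h, g, b in zip(features, gamma, beta)]

def mse(a, b):
    """Mean squared error between two equal-length sequences."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def knob_difference_loss(pred_k1, pred_k2, target_k1, target_k2):
    """Assumed form: penalize mismatch between the predicted and target
    output differences across two knob settings k1 and k2."""
    pred_diff = [p1 - p2 for p1, p2 in zip(pred_k1, pred_k2)]
    target_diff = [t1 - t2 for t1, t2 in zip(target_k1, target_k2)]
    return mse(pred_diff, target_diff)
```

A loss of this shape encourages the model to track how the effect *changes* as a knob turns, not just to match each setting in isolation.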
In terms of the network architecture, we propose the Kernel Dilation technique,
which doubles the forward speed of WaveNet used in previous studies without sacrificing
accuracy. This enables real-time simulation of electric guitar effects within
the VR environment even on consumer-grade hardware:
an Intel Core i7-11700K processor (released in 2021)
and an NVIDIA GeForce GTX 1060 graphics card (released in 2016). | en_US
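For context, a minimal sketch of the causal dilated 1-D convolution at the core of WaveNet follows; the Kernel Dilation technique modifies this building block. Larger dilations widen the receptive field at no extra per-sample cost, which is the lever such speedups exploit. This is an illustrative re-implementation with arbitrary example values, not the thesis code.

```python
# Causal dilated 1-D convolution (WaveNet building block), pure-Python sketch.

def causal_dilated_conv(x, weights, dilation):
    """y[t] = sum_k weights[k] * x[t - k*dilation], zero-padded on the left
    so each output depends only on current and past samples (causal)."""
    out = []
    for t in range(len(x)):
        acc = 0.0
        for k, w in enumerate(weights):
            idx = t - k * dilation
            if idx >= 0:
                acc += w * x[idx]
        out.append(acc)
    return out
```

With a unit impulse as input, the output shows two taps spaced `dilation` samples apart, making the widened receptive field visible directly.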