NCU Institutional Repository: Item 987654321/77587


    Please use this identifier to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/77587


    Title: 基於深度資訊與應用肌電訊號深度學習之虛擬大提琴設計;The design of virtual cello based on depth image and electromyography using deep learning
    Authors: 王振庭;Wang, Cheng-Ting
    Contributors: 資訊工程學系;Department of Computer Science and Information Engineering
    Keywords: 手部追蹤;手勢辨識;肌電訊號;虛擬樂器;卷積類神經網絡;MIDI;Hand tracking;Gesture recognition;Electromyography;Convolutional neural network(CNN);Virtual instrument;MIDI
    Date: 2018-07-20
    Issue Date: 2018-08-31 14:49:07 (UTC+8)
    Publisher: 國立中央大學;National Central University
    Abstract: Nowadays, new Human-Computer Interaction (HCI) technologies are being developed to deliver the user's commands to the computer.
    Developing natural and intuitive interaction techniques is an important goal in HCI.
    Typically, users interact with the computer using a mouse and keyboard.
    Current research is therefore directed towards interaction techniques that provide more natural, direct, and effective communication with the computer.
    Users should be able to interact with the computer through the hands, head, facial expressions, voice, and electromyography signals.
    An interface based on hand interaction is a natural and intuitive way to interact with the computer, and such an interface could be used in AR/VR environments.


    This paper applies the hand tracking and gesture recognition used in virtual reality (VR) to a real-time musical-instrument application, a virtual cello.
    The proposed application can play realistic sounds of the musical instrument.
    The user only needs to sit in front of the table and raise a hand toward the camera. The system starts playing the virtual cello after capturing the user's hand gesture.
    The program is flexible: the user can adjust parameters such as key notes, chords, pitch up/down, and tone mode.
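    As a rough illustration of these adjustable parameters, the sketch below collects them in a single settings object; the field names and default values are hypothetical and are not taken from the thesis.

        # Hypothetical grouping of the adjustable performance parameters.
        from dataclasses import dataclass

        @dataclass
        class CelloSettings:
            key: str = "C"            # base key of the fingerboard positions (assumed name)
            chord: str = "none"       # chord mode, e.g. "none", "major", "minor"
            pitch_shift: int = 0      # transpose in semitones (pitch up/down)
            tone_mode: str = "arco"   # timbre preset, e.g. "arco" or "pizzicato"

        # Example: transpose up a whole tone and switch to a plucked timbre.
        settings = CelloSettings(pitch_shift=2, tone_mode="pizzicato")
        print(settings)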
    We use the Intel RealSense camera and the Myo armband sensor to capture the user's hand information in the application.
    The RealSense is responsible for hand tracking to trigger the sound.
    The Myo sensor is responsible for hand gesture recognition to control MIDI functions.
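    A minimal sketch of a depth-based trigger using the pyrealsense2 binding is shown below. It only checks whether something crosses a fixed depth plane at one assumed pixel; the thesis's hand tracking from the depth image is more involved, and the threshold and pixel location here are placeholders.

        # Poll the depth stream and fire a trigger when the hand crosses a virtual "string" plane.
        import pyrealsense2 as rs

        pipeline = rs.pipeline()
        cfg = rs.config()
        cfg.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
        pipeline.start(cfg)

        STRING_PLANE_M = 0.40   # assumed distance of the virtual string from the camera (metres)
        px, py = 320, 240       # assumed pixel where the bowing hand is expected

        try:
            for _ in range(300):                     # poll roughly ten seconds at 30 fps
                frames = pipeline.wait_for_frames()
                depth = frames.get_depth_frame()
                if not depth:
                    continue
                d = depth.get_distance(px, py)       # metres; 0 means no valid reading
                if 0.0 < d < STRING_PLANE_M:
                    print("note-on trigger")         # here the system would emit a MIDI note
        finally:
            pipeline.stop()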
    We use a convolutional neural network (CNN) to recognize and analyze four static hand gestures from the surface electromyography signals.
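    The following is a minimal sketch of such a classifier in Keras, assuming fixed-length windows of the 8-channel Myo sEMG stream as input; the window length, layer sizes, and training setup are illustrative assumptions, not the architecture reported in the thesis.

        # Small 1-D CNN that maps sEMG windows (40 samples x 8 channels) to 4 gesture classes.
        import numpy as np
        import tensorflow as tf
        from tensorflow.keras import layers, models

        WINDOW, CHANNELS, N_GESTURES = 40, 8, 4     # assumed window length; Myo has 8 EMG channels

        model = models.Sequential([
            layers.Input(shape=(WINDOW, CHANNELS)),
            layers.Conv1D(32, kernel_size=5, activation="relu"),
            layers.MaxPooling1D(2),
            layers.Conv1D(64, kernel_size=3, activation="relu"),
            layers.GlobalAveragePooling1D(),
            layers.Dense(64, activation="relu"),
            layers.Dropout(0.5),
            layers.Dense(N_GESTURES, activation="softmax"),
        ])
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])

        # Placeholder data standing in for labelled sEMG windows (labels 0..3).
        x = np.random.randn(256, WINDOW, CHANNELS).astype("float32")
        y = np.random.randint(0, N_GESTURES, size=256)
        model.fit(x, y, epochs=5, batch_size=32, validation_split=0.2)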
    In the system, we use OpenGL to draw the user interface and display the 3D hand model on the screen, and OpenCV to assist with image processing.
    We use the RtMidi library to generate MIDI messages and transfer them to a digital audio workstation (DAW).
    A Virtual Studio Technology (VST) plugin loaded in the DAW makes the instrument sound more realistic.
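    A minimal sketch of this MIDI output step, using the python-rtmidi binding of the RtMidi library, might look as follows; the port selection, note number, and timing are illustrative, and the DAW with a cello VST loaded is assumed to listen on the opened port.

        # Send a note-on/note-off pair to the DAW through an RtMidi output port.
        import time
        import rtmidi

        midiout = rtmidi.MidiOut()
        if midiout.get_ports():
            midiout.open_port(0)                          # first available MIDI output
        else:
            midiout.open_virtual_port("Virtual Cello")    # virtual port the DAW can connect to (not on Windows)

        NOTE_ON, NOTE_OFF = 0x90, 0x80
        note, velocity = 48, 100                          # C3, as if triggered by the hand tracking

        midiout.send_message([NOTE_ON, note, velocity])
        time.sleep(1.0)                                   # hold the note for one second
        midiout.send_message([NOTE_OFF, note, 0])
        midiout.close_port()
        del midiout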
    Although the system cannot play fast songs, it is stable for slow songs and could be used in a professional music performance.
    In the experiments, we trained the convolutional neural network model with surface electromyography (sEMG) data.
    The experimental results show an accuracy of 94.3% on the specific-user dataset and 86.1% on the general dataset.
    In addition, we used dynamic time warping (DTW) to evaluate the virtual cello.
    This evaluation directly measures the melodic similarity between the MIDI generated by the virtual instrument and the reference standard MIDI.
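    One way to compute such a DTW-based melodic similarity score is sketched below; the pitch sequences are placeholders and the local cost function is an assumption, since the thesis does not specify it here. A lower distance indicates a closer match between the performed melody and the reference.

        # Classic dynamic time warping over two MIDI pitch sequences.
        import numpy as np

        def dtw_distance(a, b):
            """DTW with absolute pitch difference as the local cost."""
            n, m = len(a), len(b)
            D = np.full((n + 1, m + 1), np.inf)
            D[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = abs(a[i - 1] - b[j - 1])
                    D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
            return D[n, m]

        performance = [48, 50, 52, 53, 55, 55, 57]   # MIDI note numbers from the virtual cello (placeholder)
        reference   = [48, 50, 52, 53, 55, 57]       # notes from the standard MIDI file (placeholder)
        print("DTW distance:", dtw_distance(performance, reference))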
    Appears in Collections:[Graduate Institute of Computer Science and Information Engineering] Electronic Thesis & Dissertation

    Files in This Item:

    File          Size    Format
    index.html    0Kb     HTML


    All items in NCUIR are protected by copyright, with all rights reserved.

