
    Please use this identifier to cite or link to this item: http://ir.lib.ncu.edu.tw/handle/987654321/69477

    Title: 基於分析關鍵動量相關性之動態紋理轉換;Dynamic Texture Transformation by Strategic Motion Coherence Analysis
    Authors: 金功勳;Wattanachote,Kanoksak
    Contributors: 資訊工程學系
    Keywords: 動態紋理;動量相關性分析;動量模板配對;影片編輯;動態紋理轉換;特殊效果;Dynamic textures;Motion coherence analysis;Motion template matching;Video editing;Dynamic texture transformation;Special effects
    Date: 2016-01-14
    Issue Date: 2016-03-17 20:44:47 (UTC+8)
    Publisher: 國立中央大學
    Abstract: 改變影片中的動態紋理在動量、顏色等特性之分佈,很可能會產生不一樣的視覺效果,例如,影片中的瀑布紋理可以轉換成火焰紋理等等。本論文利用關鍵動量分析,針對兩個動態紋理之間的轉換提出一個新的方法。
    有複雜的形狀或動量的動態紋理,難以用具體化模型表示,也難以預測,特別是要把它轉換成新的動量紋理。本論文基於小區域像素差異以及動量的相關性,對連續影像的動態紋理轉換建構一個演算法。本論文針對設計的演算法提供了互動式介面,成功演示所提出之演算法套用於豐富特殊效果的影片上。動態紋理轉換的過程中我們處理了如何創造3D元素、分析動量關聯和元素間配對的問題。本篇論文主要的貢獻包含兩個議題,第一個是對動量相關性的評估提出新的測量方式,並用實際測試驗證此方式是否有用,亦即更接近人眼能接受的程度。第二個是對自動化動態紋理轉換建構了一個演算法,此最佳化的演算法只需要使用者先選擇不同的臨界值在第一個影格標記出紋理區域,剩下的轉換過程就會自動完成。實驗結果顯示出動量相關性在元素配對和轉換中,能有效找出有關聯的動量區域。
    此外,每個動態紋理動量相關性的辨別在本論文中也有進行觀察和分析。從區別動量相關性得到的貢獻,也許能被其他動量相關的研究開發所利用。舉例來說,將整合動量分析模型應用到現有的監視系統中,將為下一代數位保安系統帶來更多優點並大幅度改善現有缺點。;Changing dynamic texture appearances can create new looks in both the motion and color appearance of videos. For instance, a waterfall texture in a video scene can appear as a fire texture, or vice versa. This dissertation proposes a novel method for dynamic texture transformation between two dynamic textures by strategic motion coherence analysis.
    Dynamic textures with sophisticated shape and motion appearance are difficult to represent with physical models and hard to predict, especially when transforming them into a new motion texture. This study proposes dynamic texture transformation algorithms for video sequences based on differences in pixel intensity and on the motion coherence of patches. Using an interactive tool developed for this purpose, the research successfully applies the algorithms to many special-effect videos. Our study addresses the issues of 3D patch creation, motion coherence analysis, and patch matching for dynamic texture transformation. The main contributions address two issues. The first is a new metric for evaluating motion coherence, validated with tests showing that it agrees closely with human visual perception. The second is an algorithm for automatic dynamic texture transformation: the optimized algorithm only requires the user to mark the texture on the first frame, using an adjustable threshold to delimit the texture area; the rest of the process completes automatically. The experimental results show that the motion coherence index effectively identifies coherent motion regions for patch matching and transformation.
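    The abstract does not specify how the motion coherence metric is computed. As a rough illustration only (a hypothetical formulation, not the dissertation's actual metric), a patch-level coherence score can be sketched as the directional agreement of per-pixel motion vectors, such as those produced by optical flow:

    ```python
    import numpy as np

    def motion_coherence(flow, eps=1e-8):
        """Hypothetical coherence score for a patch of motion vectors.

        flow: (H, W, 2) array of per-pixel motion vectors (dx, dy).
        Returns a value in [0, 1]; 1.0 means all vectors point the
        same way, values near 0 mean directions cancel out.
        """
        v = flow.reshape(-1, 2)
        norms = np.linalg.norm(v, axis=1, keepdims=True)
        unit = v / np.maximum(norms, eps)       # unit direction per pixel
        mean_dir = unit.mean(axis=0)            # average of unit directions
        return float(np.linalg.norm(mean_dir))  # length of the mean vector
    ```

    A patch of uniform flow (e.g., all vectors pointing right) scores 1.0, while a patch split between opposing directions scores near 0; a threshold on such a score could then gate which patches are considered coherent enough for matching.
    
    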
    In addition, the distinctive motion coherence of each dynamic texture is observed and analyzed in this research. These findings on distinguishing motion coherence may be leveraged in other motion-related system development. For instance, integrating a motion analysis module into existing surveillance systems could reinforce the strengths and significantly remedy the shortcomings of current closed-circuit television (CCTV) systems, benefiting next-generation digital security.
    Appears in Collections:[資訊工程研究所] 博碩士論文

