NCU Institutional Repository (theses and dissertations, past exam papers, journal articles, research projects): Item 987654321/98515


    Please use this permanent URL to cite or link to this item: https://ir.lib.ncu.edu.tw/handle/987654321/98515


    Title: Generative AI Based on Diffusion Models; A Mathematical Study of Generative AI Models
    Author: HSU, CHAN-JUNG (許展榮)
    Contributor: Department of Mathematics
    Keywords: generative AI model; diffusion model; consistency model
    Date: 2025-07-08
    Upload time: 2025-10-17 12:52:30 (UTC+8)
    Publisher: National Central University
    Abstract: Diffusion models have recently emerged as a powerful class of generative techniques, offering state-of-the-art sample quality in image generation tasks. This thesis provides a comprehensive review of three representative diffusion-based frameworks: (1) Denoising Diffusion Probabilistic Models (DDPM), which employ discrete Markov chains to model forward noising and reverse denoising; (2) Score-Based Diffusion Models, which generalize this approach via continuous-time stochastic differential equations (SDEs) and score matching; and (3) Consistency Models, which aim to collapse iterative sampling into a single or a few learned steps through consistency training. We conduct experiments on MNIST, CIFAR-10, and a Cat Faces dataset, evaluating generation fidelity with the Fréchet Inception Distance (FID) and measuring sampling efficiency by the number of function evaluations (NFE). Our findings reveal that DDPM consistently achieves the lowest FID under equal step counts. Consistency Models dramatically accelerate sampling, reducing NFE by orders of magnitude, but only attain comparable fidelity when trained with a perceptual LPIPS loss.
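The DDPM forward (noising) process summarized in the abstract can be sketched in a few lines. The linear beta schedule, T = 1000, and toy image shape below are illustrative assumptions for the sketch, not values taken from the thesis:

```python
import numpy as np

# Illustrative noise schedule (linear betas, T=1000 are assumed values).
T = 1000
betas = np.linspace(1e-4, 0.02, T)       # beta_t
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)          # abar_t = prod_{s<=t} alpha_s

def forward_noise(x0, t, rng):
    """Sample x_t ~ q(x_t | x_0) = N(sqrt(abar_t) * x_0, (1 - abar_t) * I)."""
    abar = alpha_bars[t]
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(abar) * x0 + np.sqrt(1.0 - abar) * eps, eps

rng = np.random.default_rng(0)
x0 = rng.standard_normal((28, 28))       # a toy "image"
xt, eps = forward_noise(x0, T - 1, rng)  # near t = T, x_t is close to N(0, I)
```

A denoising network would then be trained to predict `eps` from `(xt, t)`, and the reverse chain would invert this corruption step by step; Consistency Models compress that reverse chain into one or a few evaluations.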
    Appears in Collections: [Graduate Institute of Mathematics] Theses & Dissertations

    Files in This Item:

    File: index.html (0 KB, HTML, 28 views)


    All items in NCUIR are protected by copyright, with all rights reserved.

