NCU Institutional Repository (中大機構典藏): Item 987654321/98515


    Please use this identifier to cite or link to this item: https://ir.lib.ncu.edu.tw/handle/987654321/98515


    Title: 基於擴散模型的生成式 AI (Generative AI Based on Diffusion Models); A Mathematical Study of Generative AI Models
    Authors: 許展榮;HSU, CHAN-JUNG
    Contributors: Department of Mathematics
    Keywords: generative AI model; diffusion model; consistency model
    Date: 2025-07-08
    Issue Date: 2025-10-17 12:52:30 (UTC+8)
    Publisher: National Central University
    Abstract: Diffusion models have recently emerged as a powerful class of generative techniques, offering state-of-the-art sample quality in image generation tasks. This thesis provides a comprehensive review of three representative diffusion-based frameworks: (1) Denoising Diffusion Probabilistic Models (DDPM), which employ discrete Markov chains to model forward noising and reverse denoising; (2) Score-Based Diffusion Models, which generalize this approach via continuous-time stochastic differential equations (SDEs) and score matching; and (3) Consistency Models, which aim to collapse iterative sampling into a single or a few learned steps through consistency training. We conduct experiments on MNIST, CIFAR-10, and a Cat Faces dataset, evaluating generation fidelity with the Fréchet Inception Distance (FID) and measuring sampling cost by the number of function evaluations (NFE). Our findings show that DDPM consistently achieves the lowest FID under equal step counts. Consistency Models dramatically accelerate sampling, reducing NFE by orders of magnitude, but attain comparable fidelity only when trained with a perceptual LPIPS loss.
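
    For reference, the three frameworks named in the abstract have standard formulations in the literature (Ho et al., 2020; Song et al., 2021; Song et al., 2023). The following is a brief sketch of those textbook forms, not equations quoted from the thesis itself.

    DDPM forward (noising) process, a discrete Markov chain with variance schedule \beta_1, \dots, \beta_T:
    \[ q(x_t \mid x_{t-1}) = \mathcal{N}\big(x_t;\ \sqrt{1-\beta_t}\,x_{t-1},\ \beta_t I\big), \qquad t = 1, \dots, T. \]

    Score-based generalization: sampling integrates the reverse-time SDE, which requires the score \nabla_x \log p_t(x) learned via score matching,
    \[ \mathrm{d}x = \big[ f(x,t) - g(t)^2\, \nabla_x \log p_t(x) \big]\,\mathrm{d}t + g(t)\,\mathrm{d}\bar{w}, \]
    where f and g are the drift and diffusion coefficients of the forward SDE and \bar{w} is reverse-time Brownian motion.

    Consistency models: a network f_\theta(x_t, t) maps any point of a probability-flow ODE trajectory back to its origin, so sampling needs only one (or a few) network evaluations; training enforces the self-consistency property
    \[ f_\theta(x_t, t) = f_\theta(x_{t'}, t') \quad \text{for all } t, t' \in [\epsilon, T] \text{ on the same trajectory}, \qquad f_\theta(x_\epsilon, \epsilon) = x_\epsilon. \]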
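
    The two evaluation quantities in the abstract also have standard definitions (again quoted from the literature, not from the thesis): NFE counts the forward passes through the denoising network needed to draw one sample, and FID compares Inception-feature statistics of real (r) and generated (g) images under a Gaussian assumption:
    \[ \mathrm{FID} = \lVert \mu_r - \mu_g \rVert_2^2 + \operatorname{Tr}\big( \Sigma_r + \Sigma_g - 2\,(\Sigma_r \Sigma_g)^{1/2} \big). \]
    Lower FID indicates generated statistics closer to the real data; lower NFE indicates cheaper sampling, which is where Consistency Models gain their reported speedup.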
    Appears in Collections: [Graduate Institute of Mathematics] Electronic Thesis & Dissertation

    Files in This Item:

    File          Description    Size    Format
    index.html                   0Kb     HTML


    All items in NCUIR are protected by copyright, with all rights reserved.

