中大機構典藏 - NCU Institutional Repository - Item 987654321/98515


    Please use this permanent URL to cite or link to this item: https://ir.lib.ncu.edu.tw/handle/987654321/98515


    Title: 基於擴散模型的生成式 AI;A Mathematical Study of Generative AI Models
    Authors: 許展榮;HSU, CHAN-JUNG
    Contributors: Department of Mathematics
    Keywords: generative AI model;diffusion model;consistency model
    Date: 2025-07-08
    Date Uploaded: 2025-10-17 12:52:30 (UTC+8)
    Publisher: National Central University
    Abstract: Diffusion models have recently emerged as a powerful class of generative techniques, offering state-of-the-art sample quality in image generation tasks. This thesis provides a comprehensive review of three representative diffusion-based frameworks: (1) Denoising Diffusion Probabilistic Models (DDPM), which employ discrete Markov chains to model forward noising and reverse denoising; (2) Score-Based Diffusion Models, which generalize this approach via continuous-time stochastic differential equations (SDEs) and score matching; and (3) Consistency Models, which aim to collapse iterative sampling into a single or a few learned steps through consistency training. We conduct experiments on MNIST, CIFAR-10, and a Cat Faces dataset, evaluating generation fidelity with the Fréchet Inception Distance (FID) and measuring sampling efficiency by the number of function evaluations (NFE) in the reverse sampling process. Our findings show that DDPM consistently achieves the lowest FID under equal step counts. Consistency Models dramatically accelerate sampling, reducing NFE by orders of magnitude, but attain comparable fidelity only when trained with a perceptual LPIPS loss.
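    For orientation, a minimal mathematical sketch of the three frameworks named in the abstract, written in the standard notation of the diffusion-model literature rather than the thesis's own equations; the symbols β_t, g(t), and f_θ are illustrative placeholders:

    % DDPM forward (noising) kernel: a discrete Markov chain with variance schedule \beta_t
    q(x_t \mid x_{t-1}) = \mathcal{N}\!\left(x_t;\ \sqrt{1-\beta_t}\,x_{t-1},\ \beta_t I\right)

    % Score-based diffusion: reverse-time SDE driven by the score \nabla_x \log p_t(x)
    \mathrm{d}x = \left[f(x,t) - g(t)^2 \nabla_x \log p_t(x)\right]\mathrm{d}t + g(t)\,\mathrm{d}\bar{w}

    % Consistency model: a map f_\theta that is constant along each sampling trajectory,
    % with boundary condition f_\theta(x_\epsilon, \epsilon) = x_\epsilon
    f_\theta(x_t, t) = f_\theta(x_{t'}, t') \quad \text{for all } t, t' \in [\epsilon, T]

    % FID between the real (r) and generated (g) Inception-feature Gaussians
    \mathrm{FID} = \lVert \mu_r - \mu_g \rVert_2^2 + \operatorname{Tr}\!\left(\Sigma_r + \Sigma_g - 2(\Sigma_r \Sigma_g)^{1/2}\right)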
    Appears in Collections: [Graduate Institute of Mathematics] Theses & Dissertations

    Files in This Item:

    File         Description    Size    Format    Views
    index.html                  0Kb     HTML      28


    All items in NCUIR are protected by copyright, with all rights reserved.
