

Please use this permanent URL to cite or link to this item: https://ir.lib.ncu.edu.tw/handle/987654321/98754


Title: DisCode-NILM: A Data-Efficient Non-Intrusive Load Monitoring with Discrete Encoding and Generative Transformer
Authors: Wu, Yi-Lu
Contributors: Department of Mechanical Engineering
Keywords: Non-intrusive load monitoring; discrete codebook; data augmentation; generative model; load disaggregation
Date: 2025-08-22
Upload date: 2025-10-17 13:15:36 (UTC+8)
Publisher: National Central University
Abstract: Non-intrusive load monitoring (NILM) aims to infer individual appliance power usage from a single aggregate measurement. With the growth of smart grids and IoT, NILM enables users to understand detailed energy consumption and improve efficiency. However, conventional NILM approaches face challenges such as overlapping appliance signals, limited labeled data, and non-stationary load characteristics. To address these issues, this thesis proposes a deep learning architecture based on a discrete codebook for data augmentation and load disaggregation in NILM. We first perform event-based segmentation of appliance activation periods and develop a single-appliance data generator using a Vector Quantized Variational Autoencoder (VQ-VAE) combined with a Masked Generative Transformer (MaskGIT). This generator learns a discrete codebook of appliance power patterns and can synthesize realistic appliance load sequences for data augmentation. Next, for load disaggregation, we design a multi-appliance model consisting of a shared wavelet transform encoder, appliance-specific adaptation layers and quantizers, and classifiers for appliance state prediction. By leveraging shared time-frequency feature extraction and enforcing appliance-specific codebook constraints, the proposed model achieves accurate disaggregation of the total load. Experiments on the UK-DALE, REDD, and REFIT household datasets demonstrate that our method outperforms traditional baseline models in appliance recognition accuracy, achieving about 10–15% higher F1-scores, and maintains robust performance under limited data. Ablation studies further confirm the contributions of the discrete codebook, data augmentation, and wavelet encoder to the model's effectiveness. In summary, this research shows that integrating discrete representation learning with generative models can effectively enhance data augmentation and disaggregation in NILM, offering a novel solution for smart energy analytics.
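The quantization step at the core of the VQ-VAE described above, mapping each encoder output to its nearest discrete codebook entry, can be sketched as follows. This is a minimal illustration only: the codebook size, latent dimension, window length, and the `quantize` helper are assumptions for demonstration, not the thesis's actual configuration.

```python
import numpy as np

# Illustrative sizes, not the thesis's actual hyperparameters.
rng = np.random.default_rng(0)
codebook = rng.standard_normal((64, 8))   # 64 learned codes, 8-dim latents

def quantize(latents, codebook):
    """Map each latent vector to its nearest codebook entry (VQ-VAE lookup).

    latents:  (T, D) encoder outputs for one appliance-activation window
    returns:  (indices, quantized), where quantized[t] == codebook[indices[t]]
    """
    # Squared Euclidean distance from every latent to every code: (T, K)
    d = ((latents[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    idx = d.argmin(axis=1)                # nearest code per time step
    return idx, codebook[idx]

latents = rng.standard_normal((16, 8))    # 16 time steps of a load window
idx, zq = quantize(latents, codebook)
```

The discrete indices `idx` are what a MaskGIT-style transformer would model: it masks a subset of them and iteratively predicts the masked tokens, after which the VQ-VAE decoder maps the completed token sequence back to a synthetic load curve.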
Appears in Collections: [Graduate Institute of Mechanical Engineering] Master's and Doctoral Theses

Files in This Item: index.html (0 Kb, HTML)


All items in NCUIR are protected by original copyright.

