Master's/Doctoral Thesis 112453001 – Detailed Record




Name Hao-Wei Wu (吳浩瑋)   Department In-service Master Program, Department of Information Management
Title A Study on Predicting Carbon Emissions of Desktop Computer Components Using Deep Learning Models
(original title: 深度學習模型於桌上型電腦零件碳排預測之研究)
Related Theses
★ Trend Analysis of the Taiwan 50 Index: Prediction Based on a Multiple LSTM Model Architecture
★ Gold Price Prediction and Analysis Based on Multiple Recurrent Neural Network Models
★ Incremental Learning for Defect Detection in Industry 4.0
★ A Study on Recurrent Neural Networks for Predicting Sales Prices of Computer Components
★ A Study on LSTM Neural Networks for Phishing Website Prediction
★ A Study on Deep Learning-Based Frequency-Hopping Signal Recognition
★ Opinion Leader Discovery in Dynamic Social Networks
★ Application of Deep Learning Models to Virtual Metrology in Industry 4.0
★ A Novel NMF-Based Movie Recommendation with Time Decay
★ Category-Based Sequence-to-Sequence Model for POI Travel Itinerary Recommendation
★ A DQN-Based Reinforcement Learning Model for Neural Network Architecture Search
★ Neural Network Architecture Optimization Based on Virtual Reward Reinforcement Learning
★ Generative Adversarial Network Architecture Search
★ Neural Architecture Search Optimization via Progressive Genetic Algorithms
★ Enhanced Model Agnostic Meta Learning with Meta Gradient Memory
★ A Study on Stock Price Prediction Using Recurrent Neural Networks Combined with Leading Industrial Wastewater Indicators
Files Electronic full text may be browsed in the system (available after 2030-07-01)
Abstract In the face of global sustainability and carbon-neutrality goals, enterprises urgently need to adopt low-carbon strategies during the product design stage. Desktop computers, as high-carbon electronic products, generate emissions throughout material selection and manufacturing. Current carbon footprint assessments, however, are mostly conducted post-production and cannot predict emissions early in the design phase.
This study proposes a deep learning predictive model that integrates an autoencoder for feature selection, a neural network for carbon emission regression, and knowledge distillation to enhance model accuracy and efficiency. A carbon emission dataset of desktop computer components was constructed, and the model was validated on high-performance, business, and lightweight product categories.
Experimental results show that the proposed model can effectively predict component-level carbon emissions during the early design phase, helping enterprises optimize product design and achieve low-carbon design goals. The approach offers strong practical value for promoting sustainability in the electronics industry.
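For the knowledge-distillation step described above, a common formulation for regression blends the ground-truth labels with the teacher model's predictions via a weight alpha. The sketch below is a minimal illustration of that idea using hypothetical toy data and a linear least-squares "student"; the actual thesis model, its dataset, and its loss details are not public here, so everything below (data values, `alpha=0.7`, the blended-MSE formulation) is an assumption for demonstration only.

```python
import numpy as np

def distillation_targets(y_true, y_teacher, alpha=0.5):
    """Blend ground-truth labels with teacher predictions.

    For regression, one common KD objective is
        L = alpha * MSE(student, y_true) + (1 - alpha) * MSE(student, y_teacher),
    which, per sample, is minimized by the blended target returned here.
    """
    return alpha * y_true + (1.0 - alpha) * y_teacher

# Hypothetical toy data: 4 samples, 2 features (NOT the thesis dataset).
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, 1.0]])
y_true = np.array([1.0, 2.0, 3.0, 4.0])
y_teacher = np.array([1.1, 1.9, 3.2, 3.8])  # stand-in teacher predictions

# Fit a linear "student" to the blended targets via least squares.
y_blend = distillation_targets(y_true, y_teacher, alpha=0.7)
w, *_ = np.linalg.lstsq(X, y_blend, rcond=None)
y_student = X @ w
```

In a neural-network student, the same blended-MSE loss would simply replace the plain MSE during training; the temperature parameter mentioned in the table of contents typically softens teacher outputs and has no direct analogue in this linear toy case.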
Keywords ★ Deep Learning
★ Carbon Emission Prediction
★ Neural Network
★ Knowledge Distillation
★ ESG Sustainability
Table of Contents Abstract (Chinese)
Abstract (English)
Acknowledgements
Table of Contents
List of Tables
List of Figures
Chapter 1 Introduction
1.1 Research Background
1.2 Research Motivation and Objectives
1.3 Research Contributions
1.4 Thesis Organization
Chapter 2 Literature Review
2.1 Carbon Emission Prediction and Its Impact on Enterprises
2.2 Applications of Neural Networks (NN) in Carbon Emission Prediction
2.3 Applications of Knowledge Distillation (KD) in Machine Learning and Regression
Chapter 3 Research Methodology
3.1 Data Transformation and Preprocessing
3.2 Model Architecture
3.2.1 Feature Selection (Autoencoder)
3.2.2 Neural Network
3.2.3 Knowledge Distillation
Chapter 4 Experimental Results and Analysis
4.1 Dataset and Preprocessing
4.2 Baseline Models
4.3 Model Performance Validation – Teacher Model (All Projects)
4.4 Model Performance Validation – NNCKD (Individual Projects)
4.5 Knowledge Distillation Parameters (Alpha, Temperature)
4.6 Hyperparameter Settings (Epochs, Batch Size, Learning Rate)
4.7 Ablation Study
4.8 Case Study Validation
Chapter 5 Conclusions and Future Research Directions
5.1 Conclusions
5.2 Research Limitations
5.3 Future Research Directions
References
References [1] A. Baratta, A. Cimino, F. Longo, V. Solina, and S. Verteramo, "The Impact of ESG
Practices in Industry with a Focus on Carbon Emissions: Insights and Future
Perspectives," Sustainability, vol. 15, no. 8, 2023, doi: 10.3390/su15086685.
[2] T.-T. Li, K. Wang, T. Sueyoshi, and D. D. Wang, "ESG: Research Progress and Future
Prospects," Sustainability, vol. 13, no. 21, 2021, doi: 10.3390/su132111663.
[3] I. Gallego-Álvarez, L. Segura, and J. Martínez-Ferrero, "Carbon emission reduction:
the impact on the financial and operational performance of international companies,"
Journal of Cleaner Production, vol. 103, pp. 149-159, 2015, doi:
10.1016/j.jclepro.2014.08.047.
[4] F. Ganda and K. S. Milondzo, "The Impact of Carbon Emissions on Corporate
Financial Performance: Evidence from the South African Firms," Sustainability, vol.
10, no. 7, 2018, doi: 10.3390/su10072398.
[5] M. F. Victoria and S. Perera, "Parametric embodied carbon prediction model for early
stage estimating," Energy and Buildings, vol. 168, pp. 106-119, 2018, doi:
10.1016/j.enbuild.2018.02.044.
[6] J. H. Nguyen, "Carbon risk and firm performance: Evidence from a quasi-natural
experiment," Australian Journal of Management, vol. 43, no. 1, pp. 65-90, 2018, doi:
10.1177/0312896217709328.
[7] D. Jin, Z. Sun, Z. Sun, and Y. You, "The Impact of Carbon Emission Restriction Policy
on Small and Medium Enterprises."
[8] Y.-J. Ding, P.-C. Wu, and Y.-H. Lian, "Time Series Analysis for the Dynamic
Relationship between an Enterprise’s Business Growth and Carbon Emission in
Taiwan," Sustainability, vol. 12, no. 14, p. 5560, 2020.
[9] I. G. Alvarez, "Impact of CO2 emission variation on firm performance," Business
Strategy and the Environment, vol. 21, no. 7, pp. 435-454, 2012.
[10] C. Saka and T. Oshika, "Disclosure effects, carbon emissions and corporate value,"
Sustainability Accounting, Management and Policy Journal, vol. 5, no. 1, pp. 22-45,
2014.
[11] L. K. Se and JeonSungil, "The Effects of Carbon Emission Information on Firm
Value," Journal of Environmental Policy and Administration, 2019.
[12] H. Yan, X. Li, Y. Huang, and Y. Li, "The impact of the consistency of carbon
performance and carbon information disclosure on enterprise value," Finance
Research Letters, vol. 37, p. 101680, 2020.
[13] B. Choi and L. Luo, "Does the market value greenhouse gas emissions? Evidence
from multi-country firm data," The British Accounting Review, vol. 53, no. 1, p.
100909, 2021.
[14] E. F. M. Garcés, J. Herrera, G. Mafla, and A. Caiza, "Artificial Neuronal Networks to
Predict the Emissions of Carbon Dioxide (CO2) using a multilayer network with the
Levenberg-Marquadt training method," WSEAS Transactions on Environment and
Development, vol. 16, pp. 346-54, 2019.
[15] D. Lian, S. Q. Yang, W. Yang, M. Zhang, and W. R. Ran, "Carbon peaking prediction
scenarios based on different neural network models: A case study of Guizhou
Province," PloS one, vol. 19, no. 6, p. e0296596, 2024.
[16] W. Feng et al., "Application of Neural Networks on Carbon Emission Prediction: A
Systematic Review and Comparison," Energies, vol. 17, no. 7, p. 1628, 2024.
[17] G. Ji, "Application of BP neural network model in the prediction of China’s carbon
emissions based on grey correlation analysis," Math Practice Theory, vol. 44, no. 14,
pp. 243-249, 2014.
[18] J. Gou, X. Xiong, B. Yu, L. Du, Y. Zhan, and D. Tao, "Multi-target knowledge
distillation via student self-reflection," International Journal of Computer Vision, vol.
131, no. 7, pp. 1857-1874, 2023.
[19] X. Wu, Q. Yuan, C. Zhou, X. Chen, D. Xuan, and J. Song, "Carbon emissions
forecasting based on temporal graph transformer-based attentional neural network,"
Journal of Computational Methods in Science and Engineering, vol. 24, no. 3, pp.
1405-1421, 2024.
[20] P. Liu, G. Zhang, X. Zhang, and S. Cheng, "Carbon emissions modeling of china using
neural Network," in 2012 Fifth International Joint Conference on Computational
Sciences and Optimization, 2012: IEEE, pp. 679-682.
[21] 蘇以諾, "基於DS-NP-LSTM神經網路之物聯冰水主機效能與碳排放預測," Master's
thesis, Department of Computer Science and Information Engineering, National Taipei
University of Technology, Taipei, 2023. [Online]. Available:
https://hdl.handle.net/11296/chfw8p
[22] 張曉妮, "應用CNN-LSTM深度學習模型於預測台灣二氧化碳排放量," Master's
thesis, Department of Industrial Engineering and Management, Yuan Ze University,
Taoyuan, 2024. [Online]. Available: https://hdl.handle.net/11296/b4rjpf
[23] S. Yan, Y. Zhang, H. Sun, and A. Wang, "A real-time operational carbon emission
prediction method for the early design stage of residential units based on a
convolutional neural network: A case study in Beijing, China," Journal of Building
Engineering, vol. 75, p. 106994, 2023.
[24] S. Bunsan, W.-Y. Chen, H.-W. Chen, Y. H. Chuang, and N. Grisdanurak, "Modeling
the dioxin emission of a municipal solid waste incinerator using neural networks,"
Chemosphere, vol. 92, no. 3, pp. 258-264, 2013.
[25] Y. Himeur et al., "Applications of knowledge distillation in remote sensing: A survey,"
Information Fusion, p. 102742, 2024.
[26] S. Abbasi, M. Hajabdollahi, N. Karimi, and S. Samavi, "Modeling teacher-student
techniques in deep neural networks for knowledge distillation," in 2020 International
Conference on Machine Vision and Image Processing (MVIP), 2020: IEEE, pp. 1-6.
[27] M. Wang, R. Liu, N. Hajime, A. Narishige, H. Uchida, and T. Matsunami, "Improved
knowledge distillation for training fast low resolution face recognition model," in
Proceedings of the IEEE/CVF International Conference on Computer Vision
Workshops, 2019, pp. 0-0.
[28] J. Gou, B. Yu, S. J. Maybank, and D. Tao, "Knowledge distillation: A survey,"
International Journal of Computer Vision, vol. 129, no. 6, pp. 1789-1819, 2021.
[29] M. Takrouri, N. M. Cuadrado, and M. Takáč, "Knowledge Distillation from Large
Language Models for Household Energy Modeling," arXiv preprint
arXiv:2502.03034, 2025.
[30] W. Jooste, A. Way, R. Haque, and R. Superbo, "Knowledge distillation for sustainable
neural machine translation," in Proceedings of the 15th Biennial Conference of the
Association for Machine Translation in the Americas (Volume 2: Users and Providers
Track and Government Track), 2022, pp. 221-230.
[31] L. Wang and K.-J. Yoon, "Knowledge distillation and student-teacher learning for
visual intelligence: A review and new outlooks," IEEE transactions on pattern
analysis and machine intelligence, vol. 44, no. 6, pp. 3048-3068, 2021.
[32] Y. Liu, C. Zou, and Y. Wang, "Distil Knowledge from Natural Language," in
Proceedings of the 8th International Conference on Computing and Artificial
Intelligence, 2022, pp. 181-186.
[33] M. Takamoto, Y. Morishita, and H. Imaoka, "An efficient method of training small
models for regression problems with knowledge distillation," in 2020 IEEE
Conference on Multimedia Information Processing and Retrieval (MIPR), 2020: IEEE,
pp. 67-72.
[34] K. Rafat et al., "Mitigating carbon footprint for knowledge distillation based deep
learning model compression," Plos one, vol. 18, no. 5, p. e0285668, 2023.
[35] X. Ding, Y. Wang, Z. Xu, Z. J. Wang, and W. J. Welch, "Distilling and transferring
knowledge via cGAN-generated samples for image classification and regression,"
Expert Systems with Applications, vol. 213, p. 119060, 2023.
[36] A. Jafari, M. Rezagholizadeh, and A. Ghodsi, "Improved knowledge distillation by
utilizing backward pass knowledge in neural networks," arXiv preprint
arXiv:2301.12006, 2023.
[37] J. Yi, J. Tao, Z. Wen, and Y. Li, "Distilling Knowledge from an Ensemble of Models
for Punctuation Prediction," presented at the Interspeech 2017, 2017.
[38] M. Kang and S. Kang, "Knowledge distillation with insufficient training data for
regression," Engineering Applications of Artificial Intelligence, vol. 132, p. 108001,
2024, doi: 10.1016/j.engappai.2024.108001.
Advisor Yi-Cheng Chen (陳以錚)   Date of Approval 2025-06-25

For questions regarding this thesis, please contact the Promotion Services Division, National Central University Library, TEL: (03)422-7151 ext. 57407, or by e-mail. – Privacy Policy