References
[1] Taiwan Power Company, “Historical Electricity Generation and Purchase Volumes and Structure” (歷年發購電量及結構). Accessed: May 24, 2021. [Online]. Available: https://reurl.cc/ar52bD
[2] N. Paulauskas and A. Baskys, “Application of Histogram-Based Outlier Scores to Detect Computer Network Anomalies,” Electronics, vol. 8, p. 1251, Nov. 2019.
[3] H. Ye, H. Kitagawa, and J. Xiao, “Continuous Angle-based Outlier Detection on High-dimensional Data Streams,” in 19th International Database Engineering & Applications Symposium, New York, NY, USA, Jul. 2015, pp. 162–167.
[4] I. Ullah, H. Hussain, I. Ali, and A. Liaquat, “Churn Prediction in Banking System Using K-means, LOF, and CBLOF,” in 2019 International Conference on Electrical, Communication, and Computer Engineering, Jul. 2019, pp. 1–6.
[5] A. Likas, N. Vlassis, and J. J. Verbeek, “The Global K-means Clustering Algorithm,” Pattern Recognition, vol. 36, no. 2, pp. 451–461, 2003.
[6] Z. He, X. Xu, and S. Deng, “Discovering Cluster-Based Local Outliers,” Pattern Recognition Letters, vol. 24, no. 9, pp. 1641–1650, Jun. 2003.
[7] Z. Cheng, C. Zou, and J. Dong, “Outlier Detection Using Isolation Forest and Local Outlier Factor,” in Conference on Research in Adaptive and Convergent Systems, New York, NY, USA, Sep. 2019, pp. 161–168.
[8] Z. Li, Y. Zhao, N. Botta, C. Ionescu, and X. Hu, “COPOD: Copula-Based Outlier Detection,” arXiv:2009.09463 [cs, stat], Sep. 2020.
[9] T. T. Dang, H. Y. T. Ngan, and W. Liu, “Distance-Based K-Nearest Neighbors Outlier Detection Method in Large-Scale Traffic Data,” in 2015 IEEE International Conference on Digital Signal Processing, Jul. 2015, pp. 507–510.
[10] Y. Chen, D. Miao, and H. Zhang, “Neighborhood Outlier Detection,” Expert Systems with Applications, vol. 37, no. 12, pp. 8745–8749, Dec. 2010.
[11] H. Wang, M. J. Bah, and M. Hammad, “Progress in Outlier Detection Techniques: A Survey,” IEEE Access, vol. 7, pp. 107964–108000, 2019.
[12] Y. Hu, H. Chen, G. Li, H. Li, R. Xu, and J. Li, “A Statistical Training Data Cleaning Strategy for the PCA-Based Chiller Sensor Fault Detection, Diagnosis and Data Reconstruction Method,” Energy and Buildings, vol. 112, pp. 270–278, Jan. 2016.
[13] F. W. Yu, W. T. Ho, K. T. Chan, and R. K. Y. Sit, “Critique of Operating Variables Importance on Chiller Energy Performance Using Random Forest,” Energy and Buildings, vol. 139, pp. 653–663, 2017.
[14] O. Renaud and M. P. Victoria-Feser, “A Robust Coefficient of Determination for Regression,” Journal of Statistical Planning and Inference, vol. 140, no. 7, pp. 1852–1862, 2010.
[15] Y. Fan, X. Cui, H. Han, and H. Lu, “Feasibility and Improvement of Fault Detection and Diagnosis Based On Factory-Installed Sensors for Chillers,” Applied Thermal Engineering, vol. 164, p. 114506, Jan. 2020.
[16] S. Arlot and A. Celisse, “A Survey of Cross-Validation Procedures for Model Selection,” Statistics Surveys, vol. 4, pp. 40–79, 2010.
[17] W. S. Noble, “What Is a Support Vector Machine?,” Nature Biotechnology, vol. 24, no. 12, pp. 1565–1567, 2006.
[18] H. Han, B. Gu, J. Kang, and Z. R. Li, “Study on a Hybrid SVM Model for Chiller FDD Applications,” Applied Thermal Engineering, vol. 31, no. 4, pp. 582–592, Mar. 2011.
[19] C.-F. Chien et al., “AI and Big Data Analytics for Wafer Fab Energy Saving and Chiller Optimization to Empower Intelligent Manufacturing,” in 2018 e-Manufacturing Design Collaboration Symposium, Sep. 2018, pp. 1–4.
[20] S. Zhang, X. Zhu, B. Anduv, X. Jin, and Z. Du, “Fault Detection and Diagnosis for the Screw Chillers Using Multi-Region XGBoost Model,” Science and Technology for the Built Environment, vol. 27, no. 5, pp. 608–623, May 2021.
[21] T. Chen and C. Guestrin, “XGBoost: A Scalable Tree Boosting System,” in 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794, Aug. 2016.
[22] G. Ke et al., “LightGBM: A Highly Efficient Gradient Boosting Decision Tree,” Advances in Neural Information Processing Systems, vol. 30, pp. 3146–3154, 2017.
[23] J. Snoek, H. Larochelle, and R. P. Adams, “Practical Bayesian Optimization of Machine Learning Algorithms,” arXiv:1206.2944 [cs, stat], Aug. 2012.
[24] K. B. Abou Omar, “XGBoost and LGBM for Porto Seguro’s Kaggle Challenge: A Comparison,” Preprint Semester Project, 2018.
[25] T. Laharika, V. Ksk, M. Sushruta, M. M. Kumar, and S. Saurabh, “Invoice Deduction Classification Using LGBM Prediction Model,” Lecture Notes in Electrical Engineering, vol. 709, pp. 127–137, 2021.
[26] M. Massaoudi, S. S. Refaat, I. Chihi, M. Trabelsi, F. S. Oueslati, and H. Abu-Rub, “A Novel Stacked Generalization Ensemble-Based Hybrid LGBM-XGB-MLP Model for Short-Term Load Forecasting,” Energy, vol. 214, p. 118874, Jan. 2021.
[27] X. Dong, Z. Yu, W. Cao, Y. Shi, and Q. Ma, “A Survey on Ensemble Learning,” Frontiers of Computer Science, vol. 14, no. 2, pp. 241–258, Apr. 2020.
[28] S. Lim and S. Chi, “XGBoost Application on Bridge Management Systems for Proactive Damage Estimation,” Advanced Engineering Informatics, vol. 41, p. 100922, Aug. 2019.
[29] S. M. Lundberg and S.-I. Lee, “A Unified Approach to Interpreting Model Predictions,” Advances in Neural Information Processing Systems, vol. 30, pp. 4765–4774, 2017.
[30] K. E. Mokhtari, B. P. Higdon, and A. Başar, “Interpreting Financial Time Series with SHAP Values,” in 29th Annual International Conference on Computer Science and Software Engineering, USA, Nov. 2019, pp. 166–172.
[31] L. E. Peterson, “K-Nearest Neighbor,” Scholarpedia, vol. 4, no. 2, p. 1883, 2009.
[32] X. Yu, S. Ergan, and G. Dedemen, “A Data-Driven Approach to Extract Operational Signatures of HVAC Systems and Analyze Impact on Electricity Consumption,” Applied Energy, vol. 253, p. 113497, Nov. 2019.
[33] 黃仲翊, “Application of Nonlinear Autoregressive Models with Exogenous Inputs to Chiller Energy Consumption Analysis” (外部輸入非線性自動迴歸模型應用於冰水主機耗能分析), Master’s thesis, Department of Energy and Refrigerating Air-Conditioning Engineering, National Taipei University of Technology, Taipei, 2017.
[34] J.-H. Kim, N.-C. Seong, and W. Choi, “Modeling and Optimizing a Chiller System Using a Machine Learning Algorithm,” Energies, vol. 12, no. 15, Art. no. 15, Jan. 2019.
[35] S. Qiu, Z. Li, Z. Li, and X. Zhang, “Model-Free Optimal Chiller Loading Method Based on Q-Learning,” Science and Technology for the Built Environment, vol. 26, no. 8, pp. 1100–1116, Sep. 2020.
[36] R. M. Schmidt, F. Schneider, and P. Hennig, “Descending Through a Crowded Valley -- Benchmarking Deep Learning Optimizers,” arXiv:2007.01547 [cs, stat], Feb. 2021.
[37] F. Acerbi, M. Rampazzo, and G. Nicolao, “An Exact Algorithm for the Optimal Chiller Loading Problem and Its Application to the Optimal Chiller Sequencing Problem,” Energies, vol. 13, Dec. 2020.
[38] 鄧翔運 (advisor: 王文俊), “Machine Learning-Based Energy-Saving Analysis and Root-Cause Diagnosis for Fabric Setting Machines and Chillers” (基於機器學習之織布定型機與冰機節能分析與肇因診斷), Master’s thesis, Department of Electrical Engineering, National Central University, Taoyuan, 2020.
[39] 陳輝俊, “Air-Conditioning Energy-Saving Technologies and Energy Management” (空調節能技術與能源管理), Taipower Air-Conditioning Technology Application Seminar, 2016.
[40] T. Hartman, “All-Variable Speed Centrifugal Chiller Plants,” ASHRAE Journal, vol. 56, no. 6, pp. 68–79, 2014.
[41] Y. Zhao, Z. Nasrullah, and Z. Li, “PyOD: A Python Toolbox for Scalable Outlier Detection,” Journal of Machine Learning Research, vol. 20, pp. 1–7, May 2019.
[42] G. Varoquaux, L. Buitinck, G. Louppe, O. Grisel, F. Pedregosa, and A. Mueller, “Scikit-learn: Machine Learning Without Learning the Machinery,” GetMobile: Mobile Computing and Communications, vol. 19, no. 1, pp. 29–33, Jun. 2015.
[43] M. J. Azur, E. A. Stuart, C. Frangakis, and P. J. Leaf, “Multiple Imputation by Chained Equations: What Is It and How Does It Work?,” International Journal of Methods in Psychiatric Research, vol. 20, no. 1, pp. 40–49, 2011.
[44] J. Neyman, “On the Two Different Aspects of the Representative Method: The Method of Stratified Sampling and the Method of Purposive Selection,” in Breakthroughs in Statistics: Methodology and Distribution, S. Kotz and N. L. Johnson, Eds. New York, NY: Springer, 1992, pp. 123–150.
[45] T. Pevný, “Loda: Lightweight On-line Detector of Anomalies,” Machine Learning, vol. 102, no. 2, pp. 275–304, Feb. 2016.
[46] “Bayesian optimization with skopt — scikit-optimize 0.8.1 documentation.” https://scikit-optimize.github.io/stable/auto_examples/bayesian-optimization.html (accessed Jun. 02, 2021).
[47] T. M. Cover, “Hypothesis Testing with Finite Statistics,” The Annals of Mathematical Statistics, vol. 40, no. 3, pp. 828–835, 1969.
[48] A. Ross and V. L. Willson, “Paired Samples T-Test,” in Basic and Advanced Statistical Tests: Writing Results Sections and Creating Tables and Figures, A. Ross and V. L. Willson, Eds. Rotterdam: SensePublishers, 2017, pp. 17–19.
[49] M. Rosenblatt, “A Central Limit Theorem and a Strong Mixing Condition,” Proceedings of the National Academy of Sciences of the United States of America, vol. 42, no. 1, pp. 43–47, Jan. 1956.
[50] D. Pelleg and A. Moore, “X-means: Extending K-means with Efficient Estimation of the Number of Clusters,” in 17th International Conference on Machine Learning, 2000, pp. 727–734.
[51] A. Vince, “A Framework for the Greedy Algorithm,” Discrete Applied Mathematics, vol. 121, no. 1–3, pp. 247–260, 2002.