Master's/Doctoral Thesis 107426028: Detailed Record




Author: Yi-Jhen Su (蘇翊甄)    Graduate Institute: Institute of Industrial Management
Thesis Title: Anomaly Detection of the Coating Machine Based on LSTM/GRU Approaches
(基於 LSTM/GRU 於塗佈機之異常偵測)
Related Theses:
★ Applying Grey Theory to the Management of Organic Agricultural Products: Demand Forecasting and Key Success Factors
★ A Decision Model under Risk Analysis for Forecasting NAND Flash Prices and Shipment Volumes
★ Optimal Inventory Policy for Lead-Free Chipsets in Industrial Computers: A Case Study of Company A
★ Optimal Inventory Management of Epitaxial Wafers at a GaAs Foundry: A Case Study of Company W
★ Collaborative Production-Sales Relationships under Information Sharing and Decision Making: A Case Study of the IC Design Industry
★ Applying the Analytic Hierarchy Process to Supplier Selection Criteria for Outsourcing in the Electronic Chemicals Industry
★ Applying Data Mining to Analyze Slow-Moving Inventory of Automotive After-Sales Parts: A Case Study of Company C
★ Multi-Objective Programming for the Optimal Six Sigma Level: A Case Study of TFT-LCD Manufacturer C's Production Process
★ An Empirical Study of Data Mining for Customers Returning for Scheduled Maintenance
★ A Value-Chain Perspective on Selecting a Brand Company's Key Organizational Processes: A Case Study of Company S
★ Process Improvement through Collaborative Planning in the Chemical Fiber Industry: Current-State Improvement and Benefit Analysis
★ The Influence of Power Models and Cooperative Relationships on Quotation Strategy: A Case Study of Semiconductor Company A
★ Applying Data Mining to Inventory Cause Analysis in the Automobile Manufacturing Industry
★ Forecasting OEM Fee Quotations with Artificial Neural Networks: A Case Study of Small-and-Medium-Sized Panel Manufacturer C
★ Inventory Improvement in the Printed Circuit Board Industry: A Case Study of Company N
★ Using Six Sigma to Improve Spare Parts Inventory Management
Files: The full text is permanently restricted (never released for online access).
Abstract (Chinese): In recent years, the rapid development of artificial intelligence has driven the evolution of Industry 4.0, and most factories have gradually introduced smart manufacturing systems, which makes machine stability critical to production. Factories traditionally relied on corrective or preventive maintenance; both approaches carry high maintenance costs and can still leave equipment to stop abnormally without warning. To avoid such unexpected failures, practice has in recent years shifted toward predictive maintenance. Motivated by this problem, this study aims to detect anomalies accurately before equipment fails and to issue alerts so that equipment personnel can perform maintenance, thereby preventing machine anomalies from halting production.
This study uses coating machine sensor data provided by Company A. Supervised Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) models are used to build the machine anomaly detection system, with a many-to-many sliding-window method for training and prediction that speeds up model training and reduces computational complexity. Hyperparameters are tuned for each model, and evaluation metrics are then used to identify the best model configuration.
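To make the many-to-many sliding-window preparation concrete, here is a minimal sketch in Python. It assumes a generic multichannel sensor array with one normal/anomalous label per time step; the window length, stride, channel count, and synthetic data are illustrative assumptions, not values taken from the thesis.

```python
# Minimal sketch of many-to-many sliding-window preparation (assumed layout,
# synthetic data standing in for the coating-machine sensor readings).
import numpy as np

def make_windows(series: np.ndarray, labels: np.ndarray,
                 window: int = 60, stride: int = 1):
    """Slice a (timesteps, features) array into overlapping windows.

    Returns X with shape (n_windows, window, features) and
    y with shape (n_windows, window): one label per time step in each
    window, i.e. a many-to-many training target.
    """
    X, y = [], []
    for start in range(0, len(series) - window + 1, stride):
        X.append(series[start:start + window])
        y.append(labels[start:start + window])
    return np.asarray(X), np.asarray(y)

rng = np.random.default_rng(0)
sensors = rng.normal(size=(1000, 8))           # 1000 time steps, 8 sensor channels (assumed)
flags = (rng.random(1000) < 0.05).astype(int)  # 1 = anomalous time step (synthetic)
X, y = make_windows(sensors, flags, window=60, stride=1)
print(X.shape, y.shape)                        # (941, 60, 8) (941, 60)
```

Overlapping windows reuse each time step in several training samples, which is one reason the windowed formulation keeps sequences short and the per-batch computation bounded.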
Abstract (English): In recent years, the rapid development of artificial intelligence has promoted the evolution of Industry 4.0. Most factories have now gradually introduced smart manufacturing systems, which makes machine stability essential to factory capacity. In the past, equipment maintenance in factories was either corrective or preventive; both methods are quite expensive and may still allow equipment to shut down abnormally without any warning. To prevent equipment from shutting down without warning, predictive maintenance has gradually developed in recent years.
Given the problems stated above, our motivation is to detect anomalies before the machine shuts down and to alert equipment engineers to carry out maintenance. We use Company A's coating machine sensor data to analyze anomaly detection.
We use supervised learning based on Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) models to build the anomaly detection system, applying a many-to-many sliding-window approach to speed up training and reduce computational complexity. We experiment with various hyperparameters for each model and use confusion-matrix-based evaluation to find the most suitable hyperparameters and optimize the models.
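As a rough illustration of the supervised LSTM/GRU detector described above, the sketch below builds a many-to-many recurrent classifier that outputs one anomaly probability per time step. The framework (tf.keras), layer sizes, optimizer settings, and the 0.5 decision threshold are assumptions for illustration, not values reported in the thesis, which tunes such hyperparameters and compares configurations with confusion-matrix-based metrics.

```python
# Sketch of a many-to-many LSTM/GRU anomaly classifier (tf.keras assumed;
# units, learning rate, and threshold are illustrative hyperparameters).
import tensorflow as tf

def build_model(cell: str, window: int, n_features: int, units: int = 64) -> tf.keras.Model:
    """Recurrent classifier that scores every time step as normal/anomalous."""
    rnn = tf.keras.layers.LSTM if cell == "lstm" else tf.keras.layers.GRU
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(window, n_features)),
        rnn(units, return_sequences=True),  # keep the full output sequence (many-to-many)
        tf.keras.layers.TimeDistributed(
            tf.keras.layers.Dense(1, activation="sigmoid")),  # per-step anomaly probability
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
                  loss="binary_crossentropy",
                  metrics=[tf.keras.metrics.Precision(), tf.keras.metrics.Recall()])
    return model

# X has shape (n_windows, window, n_features) and y has shape (n_windows, window, 1)
# from a sliding-window preparation step.
model = build_model("gru", window=60, n_features=8)
model.summary()
# model.fit(X_train, y_train, epochs=20, batch_size=64, validation_split=0.2)
# y_pred = (model.predict(X_test) > 0.5).astype(int)  # threshold scores into class labels
```

Switching `cell` between "lstm" and "gru" gives the two model families being compared; thresholded predictions can then be tabulated against the true labels in a confusion matrix for precision/recall-style evaluation.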
Keywords: ★ Deep Learning
★ Time Series
★ Long Short-Term Memory
★ Gated Recurrent Unit
★ Anomaly Detection
★ Predictive Maintenance
Table of Contents:
Chinese Abstract
Abstract
Table of Contents
List of Figures
List of Tables
1. Introduction
1-1 Research Background and Motivation
1-2 Research Objectives
1-3 Research Framework
2. Literature Review
2-1 Deep Learning
2-2 Long Short-Term Memory (LSTM)
2-3 Activation Functions
2-4 Gated Recurrent Unit (GRU)
2-5 Evaluation Metrics
3. Research Methodology
3-1 Problem Definition
3-2 Data Processing
3-3 Model Design
3-3-1 Feature Scaling
3-3-2 Neural Network Architecture
3-3-3 Loss Function
3-3-4 Optimizer
4. Experiments and Analysis
4-1 Experimental Environment and Development Tools
4-2 Experimental Analysis
4-2-1 Dataset Description
4-2-2 Experimental Design
4-2-3 Experimental Results
5. Conclusion and Future Work
References
Advisor: Jen-Ming Chen (陳振明)    Date of Approval: 2020-07-02