Graduate Thesis 108323078: Detailed Record




Author: Chung-Yi Huang (黃鍾易)    Department: Department of Mechanical Engineering
Thesis title: Images Encoded by Sensor Signals and Processing Parameters for Transfer Learning of Laser Cutting
Related theses
★ Development of a two-photon photopolymerization microfabrication system
★ Laser machining path generation for a five-axis two-photon photopolymerization microfabrication system
★ A pedicle screw positioning algorithm and an automated design workflow for guiding fixtures
★ Exposure time optimization based on energy-uniform ellipsoids in two-photon polymerization microfabrication
★ A chord-height-error-based slicing algorithm for two-photon photopolymerization microfabrication
★ Strengthening microstructures with helical laser scanning paths in two-photon photopolymerization microfabrication
★ Improving the fabrication quality of three-dimensional structures made by two-photon polymerization microfabrication
★ Generation and quality improvement of three-dimensional triangular mesh models reconstructed from multiple two-dimensional images
★ Development of an automated fabrication workflow for a freeze-form manufacturing system for tissue engineering
★ Automatic camera calibration and two-dimensional image contour extraction
★ Fabrication of high-aspect-ratio structures with a four-axis microfabrication system based on two-photon photopolymerization
★ Machine design for freeze-form additive manufacturing and parameter tuning for tissue engineering scaffold fabrication
★ Integration of the spatial coordinate systems of multi-view camera groups for three-dimensional model reconstruction from two-dimensional image contours
★ Development of a camera calibration procedure with multiple two-dimensional calibration boards for three-dimensional model reconstruction of large objects
★ Generation of solid-water support structures in freeze-form additive manufacturing for tissue engineering
★ Development of an additive manufacturing system for polyether ether ketone (PEEK)
Full text: available in the system after 2026-12-31
Abstract (Chinese): In recent years, laser cutting has had a major impact in industry: finished parts are produced by adjusting the laser processing parameters for different materials. However, searching for the relationship between the parameters and the kerf consumes a large amount of material, and when cutting precious materials such as sapphire or diamond the losses are considerable. To reduce material consumption, transfer learning is a viable option: a model is first built with an inexpensive material and then transferred to a model for the precious material.
Building on the major success of convolutional neural networks (CNNs) in image recognition and on precedents of transfer learning, the laser power sensor signal and the other processing parameters are converted into images to build the dataset. In this study, a fiber laser cutting machine is used to cut the cheaper stainless steel and the more expensive silicon steel sheet. A training model for stainless steel is built first, and transfer learning is then used to train the silicon steel model, which saves a large number of trainable parameters while also reducing the waste of silicon steel material. A CNN-based prediction model of cutting quality takes the converted image dataset as input and outputs the mean width and standard deviation of the laser kerf inferred by the network. For the network itself, three architectures are evaluated to identify a suitable model, and pretrained architectures (VGG16, ResNet50, GoogLeNet) are applied within them for comparison.
The experiments use 24 combinations of cutting parameters, producing 120 cut lines in stainless steel and 48 cut lines in silicon steel sheet. The trained neural network predicts the kerf quality of these 24 combinations well, with a mean squared error of about 0.0263 on the validation data, and the transferred model also performs well, with a mean squared error of about 0.0162 on the validation data. These experiments show that the laser kerf width prediction model established in this work can be transferred effectively to a different material, reducing the waste of new materials during experiments and lowering cost.
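The abstract does not spell out the encoding step, but references [12], [23], [24] and the pyts library [37] point to time-series-to-image transforms such as the Gramian Angular Field. The following is a minimal sketch of such an encoding, assuming pyts and an illustrative way of folding the scalar parameters into extra image channels; the function name, image size, and channel layout are assumptions, not the thesis's actual pipeline.
```python
# Minimal sketch: turn a 1-D laser power signal plus two scalar parameters
# into a 3-channel image using a Gramian Angular Field (pyts, reference [37]).
# Names, sizes, and the channel layout are illustrative assumptions only.
import numpy as np
from pyts.image import GramianAngularField

def encode_sample(power_signal, repetition_rate, cutting_speed, image_size=64):
    gaf = GramianAngularField(image_size=image_size, method="summation")
    # pyts expects shape (n_samples, n_timestamps)
    power_img = gaf.fit_transform(power_signal.reshape(1, -1))[0]
    # Broadcast the (assumed pre-normalized) scalars into constant channels
    rate_img = np.full((image_size, image_size), repetition_rate, dtype=float)
    speed_img = np.full((image_size, image_size), cutting_speed, dtype=float)
    return np.stack([power_img, rate_img, speed_img], axis=0)  # (3, H, W)

# Example with a synthetic signal
sample = encode_sample(np.sin(np.linspace(0, 10, 500)), 0.5, 0.3)
print(sample.shape)  # (3, 64, 64)
```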
Abstract (English): In recent years, laser cutting has had a great influence in industry. Products can be made from different materials by changing the processing parameters, but searching for the best parameters often consumes a lot of material and becomes very expensive when cutting costly materials such as diamond and sapphire. To reduce this cost, transfer learning is an available option: train a model on a cheap material and then transfer it to a model for the expensive material.
Because CNNs have achieved significant success in image recognition and there are precedents for transfer learning, we encode the laser power, repetition rate, and cutting speed into images to form a dataset. In this study, a fiber laser cutting machine was used to cut stainless steel and silicon steel sheets, the former being cheaper than the latter. A stainless steel model is trained first and is then transferred to the silicon steel model through transfer learning, which reduces both the required training data and the consumption of silicon steel sheets. A CNN is used to construct a cutting quality prediction model that takes the image dataset as input and outputs the average width and standard deviation of the cutting kerf. For the neural network, three architectures are evaluated to identify models suitable for this case, and publicly available pretrained models (VGG16, ResNet50, GoogLeNet) are applied within these architectures to accelerate training.
The experiment uses 24 different combinations of cutting parameters and cuts 120 stainless steel lines and 48 silicon steel lines. In the training part, the neural network predicts the cutting quality of the 24 combinations well, with a mean squared error of about 0.0263 on the validation data. The transfer learning model also gives good results, with a mean squared error of about 0.0162 on the validation data. These experiments show that the laser kerf width prediction model established in this work can predict effectively across different materials, reducing the waste of new materials during experiments and lowering cost.
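As one way to picture the prediction setup described above, the following is a minimal PyTorch sketch (PyTorch is cited as reference [33]) that puts a two-output regression head for mean kerf width and standard deviation on a pretrained VGG16 and trains it with the MSE metric reported in the abstract. The freezing policy, learning rate, and tensor shapes are illustrative assumptions; the thesis additionally compares ResNet50 and GoogLeNet, and the same pattern of loading previously trained weights and fine-tuning would apply when transferring the stainless-steel model to silicon steel.
```python
# Minimal transfer-learning sketch, assuming a pretrained VGG16 backbone
# regressing two targets (mean kerf width, standard deviation).
# Freezing policy and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision import models

model = models.vgg16(pretrained=True)          # ImageNet weights as the source model
for p in model.features.parameters():          # freeze the convolutional backbone
    p.requires_grad = False
model.classifier[6] = nn.Linear(4096, 2)       # 2 outputs: mean width, std

criterion = nn.MSELoss()                       # matches the reported MSE metric
optimizer = torch.optim.Adam(
    filter(lambda p: p.requires_grad, model.parameters()), lr=1e-4)

def train_step(images, targets):
    """One optimization step on a batch of encoded images and kerf statistics."""
    optimizer.zero_grad()
    loss = criterion(model(images), targets)
    loss.backward()
    optimizer.step()
    return loss.item()

# Example with random tensors shaped like an encoded batch (batch, 3, 224, 224)
loss = train_step(torch.randn(4, 3, 224, 224), torch.rand(4, 2))
print(f"batch MSE: {loss:.4f}")
```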
Keywords: ★ Laser Cutting
★ Kerf Width
★ Deep Learning
★ Transfer Learning
★ Convolutional Neural Network
Table of Contents
Abstract (Chinese) I
Abstract (English) II
Acknowledgments III
List of Figures VI
List of Tables IX
Chapter 1 Introduction 1
1-1 Foreword 1
1-2 Literature Review 2
1-3 Research Motivation and Objectives 8
1-4 Thesis Organization 10
Chapter 2 Theoretical Background 11
2-1 Fundamentals of Lasers 11
2-2 Image Processing Procedure 12
2-3 Data-to-Image Conversion 16
2-4 Artificial Neural Networks 19
2-5 Deep Learning 28
Chapter 3 Research Methods 37
3-1 Experimental Equipment and Procedure 37
3-2 Material Selection 40
3-3 Data Collection and Processing of Laser Parameters 42
3-4 Kerf Width Measurement Procedure 45
3-5 Neural Network Model Architectures 50
Chapter 4 Experimental Results and Discussion 56
4-1 Data Analysis and Image Conversion Results 56
4-2 Kerf Width Calculation Results and Verification 57
4-3 Training Data and Evaluation of Results 59
Chapter 5 Conclusions and Future Work 76
5-1 Conclusions 76
5-2 Future Work 76
References 78
References
[1] A. Belhadj, P. Baudouin, F. Breaban, A. Deffontaine, M. Dewulf and Y. Houbaert, “Effect of laser cutting on microstructure and on magnetic properties of grain non-oriented electrical steels,” J. Magn. Magn. Mater., vol. 256, no. 1-3, pp. 20-31, 2003.
[2] G. Loisos and A.J. Moses, “Effect of mechanical and Nd:YAG laser cutting on magnetic flux distribution near the cut edge of non-oriented steels,” J. Mater. Process. Technol., vol. 161, no. 1-2, pp. 151-155, 2005.
[3] S. Russell and P. Norvig, “Artificial intelligence: a modern approach,” New York, USA: Prentice Hall, 2009.
[4] M. Radovanović and M. Madić, “Experimental investigations of CO2 laser cut quality: a review,” Nonconvent. Technol. Rev., vol. 15, no. 4, pp. 35-42, 2011.
[5] 黃立仁 and 羅慶璋, “利用二氧化碳雷射切割S304品質評估” [Quality evaluation of S304 cut by a CO2 laser], 銲接與切割 (Welding and Cutting), vol. 11, no. 1, pp. 38-48, 2001 (in Chinese).
[6] N. Rajaram, J. Sheikh-Ahmad and S.H. Cheraghi, “CO2 laser cut quality of 4130 steel,” Int. J. Mach. Tools Manuf., vol. 43, no. 4, pp. 354-358, 2003.
[7] B.S. Yilbas, “Laser cutting quality assessment and thermal efficiency analysis,” J. Mater. Process. Technol., vol. 155-156, pp. 2106-2115, 2004.
[8] K. AbdelGhany and M. Newishy, “Cutting of 1.2 mm thick austenitic stainless steel sheet using pulsed and CW Nd:YAG laser,” J. Mater. Process. Technol., vol. 168, no. 3, pp. 438-447, 2005.
[9] S. Bayraktar and Y. Turgut, “Effects of different cutting methods for electrical steel sheets on performance of induction motors,” Proc. Inst. Mech. Eng., Part B, vol. 232, no. 7, pp. 1287-1294, 2016.
[10] T.H. Nguyen, C.K. Lin, P.C. Tung, N.V. Cuong and J.R. Ho, “An extreme learning machine for predicting kerf waviness and heat affected zone in pulsed laser cutting of thin non-oriented silicon steel,” Opt. Lasers Eng., vol. 134, Art. no. 106244, 2020.
[11] C.L. Liu, W.H. Hsaio and Y.C. Tu, “Time series classification with multivariate convolutional neural network,” IEEE Trans. Ind. Electron., 2019.
[12] Z. Wang and T. Oates, “Encoding time series as images for visual inspection and classification using tiled convolutional neural networks,” Twenty-Ninth AAAI Conf. Artif. Intell., 2015.
[13] N. Hatami, Y. Gavet and J. Debayle, “Classification of time-series images using deep convolutional neural networks,” 2017.
[14] R. Ke, W. Li, Z. Cui and Y. Wang, “Two-stream multi-channel convolutional neural network (TM-CNN) for multi-lane traffic speed prediction considering traffic volume impact,” Transp. Res. Rec., vol. 2674, no. 4, 2020.
[15] T. Zan, H. Wang, M. Wang, Z.H. Liu and X.S. Gao, “Application of multi-dimension input convolutional neural network in fault diagnosis of rolling bearings,” Appl. Sci., vol. 9, no. 13, Art. no. 2690, 2019.
[16] 呂助增, “雷射原理及應用” [Principles and Applications of Lasers], Taipei: 聯經出版社, 1987 (in Chinese).
[17] 吳成柯, “數位影像處理” [Digital Image Processing], Taipei: 儒林圖書, 1995 (in Chinese).
[18] M. Stokes, M. Anderson, S. Chandrasekar and R. Motta, “A standard default color space for the internet – sRGB,” IEC, 1996.
[19] 鐘國亮, “影像處理與電腦視覺” [Image Processing and Computer Vision], Taipei: 東華出版社, 2015 (in Chinese).
[20] G. Bradski and A. Kaehler, “Learning OpenCV: computer vision with the OpenCV library,” Sebastopol, California, USA: O’Reilly, 2008.
[21] 曾竣煌, “熔融沉積成型技術之路徑規劃與提升製造效率研究” [Path planning and manufacturing efficiency improvement for fused deposition modeling], Master’s thesis, Department of Mechanical Engineering, National Central University, Taiwan, 2018 (in Chinese).
[22] D.H. Douglas and T. Peucker, “Algorithms for the reduction of the number of points required to represent a digitized line or its caricature,” Cartographica: The International Journal for Geographic Information and Geovisualization, vol. 10, pp. 112-122, 1973.
[23] Z. Wang and T. Oates, “Imaging time-series to improve classification and imputation,” Proc. Int. Joint Conf. Artif. Intell. (IJCAI), 2015.
[24] J.P. Eckmann, S.O. Kamphorst and D. Ruelle, “Recurrence plots of dynamical systems,” World Scientific Series on Nonlinear Science Series A, vol. 16, pp. 441-446, 1995.
[25] C. Nwankpa, W. Ijomah, A. Gachagan and S. Marshall, “Activation functions: comparison of trends in practice and research for deep learning,” arXiv preprint, 2018.
[26] E. Jang, S. Gu and B. Poole, “Categorical reparameterization with Gumbel-softmax,” Proc. Int. Conf. Learn. Represent. (ICLR), 2017.
[27] G.E. Hinton and R.R. Salakhutdinov, “Reducing the dimensionality of data with neural networks,” Science, vol. 313, no. 5786, pp. 504-507, 2006.
[28] Y. LeCun, L. Bottou, Y. Bengio and P. Haffner, “Gradient-based learning applied to document recognition,” Proc. IEEE, vol. 86, no. 11, pp. 2278-2324, 1998.
[29] A. Krizhevsky, I. Sutskever and G.E. Hinton, “ImageNet classification with deep convolutional neural networks,” Commun. ACM, vol. 60, no. 6, pp. 84-90, 2017.
[30] K. Simonyan and A. Zisserman, “Very deep convolutional networks for large-scale image recognition,” Proc. Int. Conf. Learn. Represent. (ICLR), 2015.
[31] C. Szegedy, W. Liu, Y. Jia, P. Sermanet, S. Reed, D. Anguelov, D. Erhan, V. Vanhoucke and A. Rabinovich, “Going deeper with convolutions,” Proc. IEEE Conf. Comput. Vis. Pattern Recognit. (CVPR), pp. 1-9, 2015.
[32] K. He, X. Zhang, S. Ren and J. Sun, “Deep residual learning for image recognition,” Proc. IEEE Conf. Comput. Vis. Pattern Recognit. (CVPR), pp. 770-778, 2016.
[33] PyTorch official website, https://pytorch.org
[34] IPG Photonics official website, https://www.ipgphotonics.com/en
[35] H. Tuthill, “Comprehensive information about metallurgy of stainless steel,” Food and Environmental Sanitation, 2005.
[36] SciPy official website, https://www.scipy.org/
[37] pyts documentation, https://pyts.readthedocs.io/
[38] ZEISS online shop, https://www.micro-shop.zeiss.com/en/de/shop/objectives/421031-9910-000/Objective-A-Plan-5x-0.12-Ph0-M27
[39] OpenCV official website, http://opencv.org
Advisor: Chao-Yaug Liao (廖昭仰)    Date of approval: 2021-10-22
