Master's/Doctoral Thesis 106226037 — Detailed Record




Author: WANG, HAN-HSIEN (王瀚賢)    Department: Department of Optics and Photonics
Title: Hybrid 3D Profilometry with Extended Measurement Depth Based on Binary Code and Phase Shift
(混合二進制編碼及相位位移法之高景深三維重建系統)
Related theses
★ Non-descanned parallel-detection two-photon fluorescence hyperspectral microscopy
★ Reconstruction of the point spread function of the human eye using a double-pass imaging scheme and an error-reducing iterative algorithm
★ Comparative analysis of low-level light therapy with laser and LED sources for mouse hair growth
★ Enhancing two-photon microscopy resolution with line-shaped structured illumination
★ Resolution improvement for scattering samples using coherent structured illumination microscopy
★ Scanning second-harmonic-generation structured illumination microscopy
★ Study of the cat self-pumped phase-conjugate mirror for digital optical phase conjugation and time differentiation
★ Mirror-assisted tomographic phase microscopy
★ Reconstruction of multi-wavelength annular beams by digital holography
★ Optical focusing inside scattering media using a phase-conjugate mirror
★ Two-photon fluorescence hyperspectral microscopy of multi-fluorophore biological samples
★ Super-resolution imaging by second-harmonic-generation non-fluorescent ground-state depletion microscopy
★ Application of chlorophyll two-photon fluorescence hyperspectral imaging to photosynthesis research
★ Two-photon scanning structured illumination microscopy
★ Micro-projection optical-sectioning hyperspectral microscopy
★ Observing hair-follicle growth-cycle changes in living mice with structured illumination microscopy
Files: full text available in the online system after 2025-01-16
Abstract (Chinese) From the earliest pinhole cameras to today's single-lens cameras, people have wanted to record the objects and events around them. A camera, however, can only record two-dimensional images and cannot capture the three-dimensional information of the real world, so tools for building three-dimensional models are needed to obtain that information.
Three-dimensional models are already used widely across many fields: self-driving cars and robot vacuum cleaners use them to measure distance and detect obstacles ahead; in film production, motion capture records an actor's facial changes to simplify post-production animation; and in medicine, models of teeth and other organs help monitor a patient's condition or build accurate replicas. How to record the three-dimensional information of real objects more precisely has therefore become an important topic.
Among today's many 3D-modeling techniques, structured-light methods usually suffer from insufficient depth of field and from image distortion caused by the viewing angle. This study is likewise based on structured light and aims to improve both problems, to measure without a reference plane, and to increase the mobility of the system. To improve mobility, the system combines a pico projector, which provides the structured light, with a mobile phone, which captures the structured-light images. The approach most commonly used in structured-light systems has been the three-step phase-shifting method, which reconstructs the surface from the retrieved phase difference but easily distorts regions viewed at large angles. This study therefore combines binary coding with phase shifting: the binary code precisely locates the fringes on the sample, and once their corresponding positions on the reference plane are found, the viewing angle between the sample and the camera is known and triangulation can correct the distortion at that angle. Because the relative viewing angle between the camera and the sample can be computed from previously recorded binary-coded fringes, the 3D information can be reconstructed without any reference-plane data. The phase-shifting method is then combined to obtain finer phase information once the fringe positions are known, yielding more complete 3D information.
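As a rough illustration of the three-step phase-shifting retrieval mentioned above, a minimal Python/NumPy sketch is given below. The image arrays I1, I2, I3 (three fringe images shifted by 2π/3) are hypothetical names, and the formula is the standard textbook form, not necessarily the thesis's exact implementation.

    import numpy as np

    def wrapped_phase(I1, I2, I3):
        """Wrapped phase from three fringe images shifted by 2*pi/3 (standard
        three-step formula). Returns values in (-pi, pi]; they still need to
        be unwrapped, e.g. with the fringe order from the binary code."""
        I1, I2, I3 = (np.asarray(I, dtype=float) for I in (I1, I2, I3))
        return np.arctan2(np.sqrt(3.0) * (I1 - I3), 2.0 * I2 - I1 - I3)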
After the system was built, a flat plate, balls, and a face mask were used as test samples. The flat plate tested the accuracy of the reconstructed absolute height: with the plate placed 20 cm to 60 cm from the reference plane, the reconstruction error was within 0.4 cm. A table-tennis ball and a contact ball served as known samples for testing the accuracy of reconstructing simple geometric surfaces; their radii were found by circular and spherical fitting, and with the balls placed 20 cm to 60 cm from the reference plane, the relative-height errors of the table-tennis ball and the contact ball were within 0.04 cm and 0.06 cm, respectively. Finally, the face mask showed that the system can reconstruct the 3D information of more complex structures.
Abstract (English) From the earliest pinhole camera to the modern monocular camera, humans have wanted to record the objects and events around them. However, a camera can only record two-dimensional (2D) images and can never capture the three-dimensional (3D) information of the real world, so we need instruments that build 3D models to obtain 3D information about the real world.
3D models have been widely used in various fields. Self-driving cars and robotic vacuum cleaners measure distance and detect obstacles ahead. Dynamic capture of the human face during film shooting records the actor's facial changes to facilitate post-processing. Modeling of the teeth and various organs in medical treatment can be used to observe the health of the patient or to establish an accurate model. How to record real objects more accurately has therefore become an important topic.
Nowadays, among the many different technologies for 3D modeling, modeling based on structured light usually faces the problems of limited depth of field and image distortion caused by the angle of view. This study is also based on structured light and aims to improve these two problems; in addition, the measurement is performed without a reference plane and the system's mobility is improved. The study combines a micro-projector, which provides the structured light, with a mobile phone, which captures the structured-light images. In the past, the algorithm most commonly used in structured-light systems was three-step phase shifting, which reconstructs the image after obtaining the phase difference but easily causes deformation at positions with a large angle of view. This study therefore combines binary coding with the phase-shifting method. Binary coding locates the fringes on the sample accurately and finds their positions corresponding to the reference plane, so that the angle of view between the sample and the camera is clearly known, and triangulation is used to correct the distortion at that angle of view. Because the relative angle of view between the camera and the sample can be calculated from the previously recorded binary-coded fringes, three-dimensional information can be reconstructed without any information about the reference plane. Finally, the phase-shifting method is combined to obtain more detailed phase information after the fringe positions are known, so as to obtain more complete three-dimensional information.
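To complement the phase-retrieval sketch above, the following Python/NumPy sketch shows how a stack of thresholded binary-pattern images could supply a fringe-order map used to unwrap the phase. The names bits (list of 0/1 arrays, most significant bit first) and phi (wrapped phase) are hypothetical, and a plain binary decoder is assumed; the thesis's actual coding scheme may differ.

    import numpy as np

    def decode_binary_code(bits):
        """Decode thresholded binary-pattern images (0/1 arrays, MSB first)
        into an integer fringe-order map k."""
        k = np.zeros_like(bits[0], dtype=int)
        for b in bits:
            k = (k << 1) | b.astype(int)  # shift in the next bit plane
        return k

    def unwrap_with_code(phi, k):
        """Absolute phase: wrapped phase plus 2*pi times the fringe order."""
        return phi + 2.0 * np.pi * k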
In this study, a flat plate, balls, and a face mask are used as samples after the system is completed. The plate is used to test the accuracy of the reconstructed absolute height: placing the plate 20 cm to 40 cm from the reference plane, the reconstruction error is within 0.5 cm. A table-tennis ball and a contact ball are used as known samples to test the accuracy of reconstructing simple geometric surfaces; their radii are found by circular and spherical fitting, and with the balls placed 20 cm to 40 cm from the reference plane, the fitting errors of the table-tennis ball and the contact ball are within 0.12 cm and 0.1 cm, respectively. Finally, the face mask proves that the system can reconstruct the three-dimensional information of more complex structures.
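For reference, a sphere fit of the kind used in the radius evaluation could be done with the linear least-squares sketch below (Python/NumPy). The point array pts is a hypothetical (N, 3) set of reconstructed surface points; this is a generic fit, not necessarily the exact procedure used in the thesis.

    import numpy as np

    def fit_sphere(pts):
        """Least-squares sphere fit via the linearization
        x^2 + y^2 + z^2 = 2ax + 2by + 2cz + d, where (a, b, c) is the
        center and r = sqrt(d + a^2 + b^2 + c^2). Returns (center, radius)."""
        pts = np.asarray(pts, dtype=float)
        A = np.hstack([2.0 * pts, np.ones((pts.shape[0], 1))])
        rhs = np.sum(pts ** 2, axis=1)
        sol, *_ = np.linalg.lstsq(A, rhs, rcond=None)
        center, d = sol[:3], sol[3]
        radius = np.sqrt(d + np.sum(center ** 2))
        return center, radius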
Keywords (Chinese) ★ structured light
★ binary code
★ phase shifting method
★ 3D reconstruction
★ extended depth of field
Keywords (English) ★ structured light
★ Binary Code
★ Phase Shift Profilometry
★ 3D Reconstruction
★ Extended Measurement Depth
Table of Contents
Chinese Abstract I
Abstract II
Table of Contents V
List of Figures VI
List of Tables IX
Chapter 1 Introduction 1
1-1 Research Motivation and Objectives 1
1-2 Literature Review 1
1-3 Structured-Light Profilometry 4
1-4 Thesis Organization 9
Chapter 2 Experimental Principles and Analysis Methods 10
2-1 Principles of Structured-Light Techniques 10
Chapter 3 System Design and Construction 17
3-1 System Design 17
3-2 System Architecture 18
3-3 Fringe Coding Design 23
3-4 Algorithm Flow 29
Chapter 4 Experimental Results and Analysis 33
4-1 System Error Analysis 33
4-2 Face Reconstruction 43
Chapter 5 Conclusions 45
References 46
Advisor: Chen, Szu-Yu (陳思妤)    Approval date: 2020-01-16