Master's/Doctoral Thesis 110226081 — Detailed Record




Author: Kuan-Hung Wang (王冠紘)   Department: Department of Optics and Photonics
Thesis title: Visible-Light, Wide-Field-of-View, High-Resolution Telescope Reconnaissance Optical System
Related theses
★ Design and Application of White-Light LEDs in Residential Lighting ★ Ultra-Wide-Angle Automotive Lens Design
★ Development of a Microlens-Array Integrator Optical System for Color-Sequential Pico Projectors ★ LED Color-Temperature Control Technology and Its Application to Color-Sequential Pico Projectors
★ Trajectory-Optimized Control of Optical Zoom ★ Simulation and Analysis of LED Sources and Hybrid LED/Sunlight Indoor Illumination
★ Optical Design Method and Implementation of Pico Projectors Using the Étendue Concept ★ Lens Design for Optical Microscopes
★ Design of Under-Display Fingerprint Recognition for Mobile Phones ★ Optical Path Design of a DLP Pico Projection System
★ High-Efficiency Blu-ray Disc Pickup Head ★ Design of a Modular Dual-Wavelength Optical Pickup Head and Its Application to Angle Measurement
★ Lens Design for Digital Cameras ★ Hybrid Optical Pickup Head with a Single Photodetector
★ Design of a 3-Megapixel 2.75× Optical-Zoom Mobile-Phone Lens ★ Effect of Prism Glass Selection on Chromatic Aberration and Its Correction
Files: full text available for browsing in the system after 2028-07-24 (embargoed)
Abstract (Chinese): The telescope reconnaissance optical system in this thesis consists of one monocentric objective lens, 2,132 identical eyepieces, and 2,132 identical digital camera lenses. The design approach uses the optical software CODE V to design the monocentric objective, the eyepiece, and the digital camera lens separately, and then assembles them: first the monocentric objective and the eyepiece are combined into a Keplerian telescope, and this telescope is then combined with a digital camera lens to complete one set of the telescope reconnaissance optical system. Because the system consists of one larger monocentric spherical objective and identical eyepiece/camera-lens pairs, each eyepiece combined with its digital camera lens is called a microcamera; the system therefore comprises one monocentric objective and 2,132 microcameras.
The visible-light telescope reconnaissance optical system is required to detect a J-20 fighter aircraft (target size 4.69 m × 21.2 m) at a distance of 125 km, over the visible band from 400 nm to 700 nm, with a total field of view of 120° (horizontal) × 72° (vertical), using 2,132 digital camera lenses for a total of 30.7 gigapixels. The chosen CMOS sensor is the Onsemi MT9F002, with a 4,384 × 3,288 pixel array and a 1.4 μm × 1.4 μm pixel size (14 megapixels per sensor). According to the Johnson criteria, detecting the J-20 at 125 km with 95% probability requires 2 line pairs across the target, from which a total system focal length of 149.25 mm is derived. Each set of the system has a field of view of 2.352° (horizontal) × 1.764° (vertical), giving an instantaneous field of view (IFOV) of 9.363 μrad, about 0.032 arc minutes, per pixel on the CMOS sensor. The J-20 at 125 km subtends 169.6 μrad; since a smaller IFOV resolves finer detail, the target can be detected and covers about 18.11 pixels on the sensor. Given that the human eye resolves about 1 arc minute, the system's resolution is 31 times that of the human eye.
Comparing microcamera arrangements, a hexagonal layout provides 1.1547 times the space-utilization efficiency of a square layout. Adopting the hexagonal arrangement therefore effectively shortens the row-to-row spacing, makes the spherical packing of microcameras more compact, and saves space.
To simulate the conversion efficiency of the visible-light telescope reconnaissance optical system under realistic conditions, a model was built in the optical software LightTools. The maximum solar illuminance measured with an illuminance meter was 166,100 lux (166,100 lm/m²). With a first-surface diameter of 72.447 mm for the monocentric spherical objective, the incident surface area is 1.648 × 10⁻² m², giving a total incident luminous flux of 2,738.782 lm. The flux actually received at the sensor is 998.764 lm, with an average deviation of 0.232, yielding a conversion efficiency of 36.467%.
The design is compared with the AWARE series, which also uses a monocentric multiscale design and the same CMOS sensor. Because the main goal here is detecting a J-20 fighter at 125 km, the overall focal length is longer, so each CMOS sensor covers only a small angle and the IFOV resolves finer detail. Since detection must cover both sky and sea, the overall field of view must be large; the small angle covered per camera therefore requires many cameras, and the total pixel count increases accordingly. Overall, the system outperforms the previously designed AWARE monocentric multiscale optical systems.
Abstract (English): This study presents a design for a long-range telescope reconnaissance optical system consisting of one monocentric objective lens and 2,132 sets of identical eyepieces and cameras. The design approach uses the CODE V optical design software to individually design the monocentric objective lens, the eyepieces, and the cameras, which are subsequently assembled together. Initially, the monocentric objective lens and the eyepieces are combined to form a Kepler telescope system, which is then integrated with the cameras to complete one set of the long-range telescope reconnaissance optical system. The system comprises a larger monocentric spherical objective lens and identical eyepieces combined with cameras, hence referred to as "microcameras." The proposed long-range telescope reconnaissance optical system therefore includes one monocentric objective lens and 2,132 microcameras.
The long-range visible light telescope reconnaissance optical system requires the detection of a J-20 fighter aircraft at a distance of 125 kilometers, with a target size of 4.69 m × 21.2 m. The detection wavelength ranges from 400 nm to 700 nm, with a field of view of 120° horizontally and 72° vertically, totaling 2,132 cameras and 30.7 gigapixels. The selected CMOS sensor is the MT9F002 from Onsemi, with an active pixel array of 4,384 (H) × 3,288 (V) and a pixel size of 1.4 μm × 1.4 μm, for 14 megapixels per sensor. Based on the Johnson criteria, the J-20 fighter aircraft at a distance of 125 kilometers requires 2 line pairs for detection with 95% probability. The derived total focal length of the optical system is 149.25 mm. Each set of the long-range telescope reconnaissance optical system has a field of view of 2.352° horizontally and 1.764° vertically. The angular resolution per pixel on the CMOS sensor, the instantaneous field of view (IFOV), is 9.363 μrad (0.032 arc minutes). The J-20 fighter aircraft at 125 kilometers subtends approximately 169.6 μrad, equivalent to around 18.11 pixels on the CMOS sensor. Compared to the human eye's resolution of about 1 arc minute, the visible light long-range telescope reconnaissance optical system offers roughly 31 times finer resolution.
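As a sanity check, the resolution figures quoted above can be reproduced with a few lines of Python (a sketch using the abstract's values; the variable names are illustrative, not from the thesis):

```python
import math

# Values quoted in the abstract
n_pixels_h = 4384         # MT9F002 horizontal pixel count
fov_h_deg = 2.352         # per-camera horizontal field of view (degrees)
target_length = 21.2      # m, J-20 length (critical dimension used here)
distance = 125e3          # m, detection range

# Instantaneous field of view per pixel (small-angle approximation)
ifov = math.radians(fov_h_deg) / n_pixels_h   # rad, ~9.363e-6
ifov_arcmin = math.degrees(ifov) * 60         # ~0.032 arc minutes

# Angular subtense of the target and the number of pixels it covers
subtense = target_length / distance           # 1.696e-4 rad = 169.6 urad
pixels_on_target = subtense / ifov            # ~18.11 pixels

# Resolution relative to the ~1 arc-minute human eye
eye_ratio = 1.0 / ifov_arcmin                 # ~31x
```

These reproduce, to within rounding, the quoted 9.363 μrad IFOV, the 18.11 pixels covered by the target, and the factor-of-31 comparison with the human eye.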
After conducting a comparison of various microcamera layouts, it was determined that a hexagonal arrangement offered 1.1547 times better space utilization compared to a square arrangement. Therefore, the hexagonal arrangement was adopted to effectively optimize space efficiency by reducing the spacing between rows and creating a more compact spherical arrangement.
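The 1.1547 factor is the classical packing-density ratio 2/√3 between hexagonal and square lattices of equal circles, which follows from the shortened row pitch of the hexagonal grid. A minimal Python check (illustrative only, not thesis code):

```python
import math

d = 1.0  # microcamera footprint diameter (arbitrary units)

# Plane area claimed per circle in each lattice
square_cell = d * d                     # square grid: rows a distance d apart
hex_cell = d * (math.sqrt(3) / 2 * d)   # hex grid: rows only d*sqrt(3)/2 apart

ratio = square_cell / hex_cell          # = 2/sqrt(3) ~ 1.1547
hex_row_spacing = math.sqrt(3) / 2 * d  # ~0.866 d, the shortened row pitch
```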
To simulate the conversion efficiency of the long-range visible light telescope reconnaissance optical system in real-world scenarios, a model was created using the optical software LightTools. The maximum solar illuminance, measured with an illuminance meter, was 166,100 lux (166,100 lm/m²). With a diameter of 72.447 mm for the first surface of the monocentric spherical objective lens, the incident surface area was calculated to be 1.648 × 10⁻² m². The total incident luminous flux was 2,738.782 lm, while the luminous flux actually received by the sensor was 998.764 lm, with an average difference of 0.232. From these values, the conversion efficiency was calculated to be 36.467%.
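The efficiency figure follows directly from the quoted fluxes; a short Python check using the abstract's numbers (variable names are illustrative):

```python
illuminance = 166_100   # lm/m^2, measured peak solar illuminance
flux_in = 2738.782      # lm, incident luminous flux reported above
flux_out = 998.764      # lm, flux reaching the sensor in the LightTools model

implied_area = flux_in / illuminance       # ~1.649e-2 m^2 (quoted 1.648e-2)
efficiency_pct = flux_out / flux_in * 100  # ~36.467 %
```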
The design results of this thesis are compared with the AWARE-series gigapixel cameras, which also employ a monocentric multiscale optical system but for a different purpose. Both designs use the same CMOS sensor, but the main objective of this study is to detect a J-20 fighter aircraft at a distance of 125 kilometers. The key difference is the longer overall focal length in this thesis, so a single CMOS sensor covers a smaller angle and the IFOV resolves finer detail. Since the detection task includes both sky and sea, the overall optical system needs a wide field of view; the small angle covered per camera therefore requires many more cameras, which in turn increases the system's total pixel count. Overall, the optical system designed in this study outperforms the previously designed monocentric multiscale optical systems of the AWARE series.
Keywords (Chinese) ★ visible-light telescope reconnaissance optical system
★ monocentric objective lens
★ eyepiece
★ camera lens
★ microcamera lens
★ J-20 fighter aircraft
★ sensor
★ instantaneous field of view
★ hexagonal packing
Keywords (English) ★ long-range visible light telescope reconnaissance optical system
★ monocentric objective lens
★ eyepiece lens
★ camera
★ microcamera
★ J-20 fighter aircraft
★ sensor
★ Instantaneous Field Of View
★ hexagonal packing
Table of Contents
Abstract (Chinese) i
Abstract (English) iv
Acknowledgments vii
Table of Contents ix
List of Figures xiii
List of Tables xxii
Chapter 1 Introduction 1
1-1 Research Motivation 1
1-2 Literature Review 3
1-2-1 Single-Camera Scanning Imaging 3
1-2-2 Sensor-Stitched Imaging 5
1-2-3 Multi-Camera Staring Imaging Techniques 14
1-2-4 Current Status of Multiscale System Research 16
1-3 Thesis Organization 28
Chapter 2 Imaging Principles of Monocentric Multiscale Optical Systems 30
2-1 Definitions of Optical Parameters 30
2-2 Introduction to Aberrations 34
2-3 Imaging Theory of Monocentric Multiscale Systems 37
2-3-1 Monocentric Multiscale Imaging Principle 38
2-3-2 Structure Selection for Monocentric Multiscale Systems 40
2-3-3 Telescope System Architecture 42
2-3-4 Aperture Characteristics of the Monocentric Multiscale Optical Structure 46
2-3-5 Monocentric Multiscale System Calculations 52
2-4 Layout Design of the Eyepiece Array and Digital Camera Lens Array 53
2-5 Field-of-View Overlap of the Microcamera Lenses 56
Chapter 3 Design of the Visible-Light Telescope Reconnaissance Optical System 60
3-1 First-Order Specifications of the Visible-Light Telescope Reconnaissance Optical System 62
3-1-1 Sensor Specifications 62
3-1-2 Total System Focal Length 63
3-1-3 Incident-Ray Field Angle and IFOV 66
3-1-4 First-Order Specifications of the Monocentric Objective 68
3-1-5 First-Order Specifications of the Eyepiece 68
3-1-6 First-Order Specifications of the Kepler Telescope 69
3-1-7 First-Order Specifications of the Digital Camera Lens 70
3-1-8 Specifications of the Visible-Light Telescope Reconnaissance Optical System 70
3-1-9 Number of Microcamera Lenses 71
3-1-10 RMS OPD and Strehl Ratio 71
3-1-11 MTF 74
3-1-12 |SMTF-TMTF| 75
3-1-13 Lateral Chromatic Aberration 75
3-1-14 Relative Illumination 75
3-1-15 Distortion 76
3-1-16 Design Targets 78
3-2 Monocentric Objective Design 79
3-2-1 Monocentric Objective Design Principles 80
3-2-2 Monocentric Objective Design Process 95
3-3 Eyepiece Design Principles 103
3-3-1 Eyepiece Design Process 104
3-4 Telescope Design 107
3-5 Digital Camera Lens Design 112
3-5-1 Digital Camera Lens Design Principles 112
3-5-2 Digital Camera Lens Design Process 113
3-6 Design of the Visible-Light Telescope Reconnaissance Optical System 120
3-6-1 MTF 124
3-6-2 Distortion 125
3-6-3 Lateral Chromatic Aberration 126
3-6-4 Relative Illumination 127
3-6-5 Summary of Design Results 128
Chapter 4 Tolerance Analysis 129
4-1 Centered Tolerances 130
4-2 Decentered Tolerances 137
4-3 Tolerance Analysis Process 142
4-4 Tolerance Analysis Results 143
Chapter 5 Optical System Simulation with LightTools 148
5-1 Model Construction of the Visible-Light Telescope Reconnaissance Optical System 148
5-2 Simulation of the Visible-Light Telescope Reconnaissance Optical System 155
Chapter 6 Conclusions and Future Work 164
6-1 Conclusions 164
6-2 Future Work 166
References 168
參考文獻 [1]環球網,“殲20性能數據首次披露:最大飛行速度2馬赫”。2022年8月25日,取自http://www.xinhuanet.com/mil/2021-09/28/c_1211385667.htm
[2]Lockheed Martin, “PAC-3 MSE,” retrieved 11 March 2023, from https://www.lockheedmartin.com/en-us/products/pac-3-advanced-air-defense-missile.html
[3]R. Sargent, C. Bartley, P. Dille, J. Keller, I. Nourbakhsh, and R. LeGrand, “Timelapse GigaPan: Capturing, sharing, and exploring timelapse gigapixel imagery,” presented at the Fine International Conference on Gigapixel Imaging for Science, Pittsburgh, Pennsylvania (2010).
[4]1474 megapixel panoramic photo of President Obama′s inauguration. Retrieved 3 July 2022, from http://gigapan.com/gigapans/15374/
[5]eoPortal Org, “Worldview-1,” retrieved 16 July 2022, from https://www.eoportal.org/satellite-missions/worldview-1#eop-quick-facts-section
[6]D. Ebbets, P. Atcheson, C. Stewart, P. Spuhler, J. Van Cleve, and S. Bryson "Optical performance of the 100-sq deg field-of-view telescope for NASA′s Kepler exoplanet mission", Proc. SPIE 8146, UV/Optical/IR Space Telescopes and Instruments: Innovative Technologies and Concepts V, 81460G (14 September 2011).
[7]D. Ebbets, V. Argabright, J. Stober, J. VanCleve, D. Caldwell, J. Kolodziejczak, “In-flight photometric performance of the 96Mpx focal plane array assembly for NASA′s Kepler exoplanet mission,” Proc. SPIE 8146, UV/Optical/IR Space Telescopes and Instruments: Innovative Technologies and Concepts V, 81460H (14 September 2011).
[8]Anouk Laborie, Robert Davancens, Pierre Pouny, Cyril Vétel, François Chassat, Philippe Charvet, Philippe Garé, Giuseppe Sarri, “The Gaia focal plane,” Proc. SPIE 6690, Focal Plane Arrays for Space Telescopes III, 66900A (12 September 2007).
[9]Ralf Kohley, Philippe Garé, Cyril Vétel, Denis Marchais, and François Chassat, “Gaia′s FPA: sampling the sky in silicon,” Proc. SPIE 8442, Space Telescopes and Instrumentation 2012: Optical, Infrared, and Millimeter Wave, 84421P (21 September 2012).
[10]Ivezić, Ž., Connolly, A.J., & Jurić, M. Everything we’d like to do with LSST data, but we don’t know (yet) how. Proceedings of the International Astronomical Union, 12, 93 – 102 (2016).
[11]Ivezić Ž, Kahn S M, Tyson J A, et al. LSST: From science drivers to reference design and anticipated data products. The Astrophysical Journal, 873: 111 (2019).
[12]Douglas Neill, George Angeli, Chuck Claver, Ed Hileman, Joseph DeVries, Jacques Sebag, and Bo Xin “Overview of the LSST active optics system,” Proc. SPIE 9150, Modeling, Systems Engineering, and Project Management for Astronomy VI, 91500G (4 August 2014).
[13]S. M. Kahn, N. Kurita, K. Gilmore, M. Nordby, P. O′Connor, R. Schindler, J. Oliver, R. Van Berg, S. Olivier, V. Riot, P. Antilogus, T. Schalk, M. Huffer, G. Bowden, J. Singal, and M. Foss “Design and development of the 3.2 gigapixel camera for the Large Synoptic Survey Telescope,” Proc. SPIE 7735, Ground-based and Airborne Instrumentation for Astronomy III, 77350J (10 August 2010).
[14]Mark Clampin, “Status of the James Webb Space Telescope Observatory,” Proc. SPIE 8442, Space Telescopes and Instrumentation 2012: Optical, Infrared, and Millimeter Wave, 84422A (21 September 2012).
[15]J. G. Hagopian et al., “Optical Alignment and Test of the James Webb Space Telescope Integrated Science Instrument Module,” 2007 IEEE Aerospace Conference, pp. 1-13(3-10 March 2007).
[16]J. Scott Knight, D. Scott Acton, Paul Lightsey, Adam Contos, Allison Barto, “Observatory alignment of the James Webb Space Telescope,” Proc. SPIE 8442, Space Telescopes and Instrumentation 2012: Optical, Infrared, and Millimeter Wave, 84422C (21 September 2012).
[17]B. Leininger, J. Edwards, J. Antoniades, D. Chester, D. Haas, E. Liu, M. Stevens, C. Gershfield, M. Braun, and J. D. Targove, “Autonomous real-time ground ubiquitous surveillance-imaging system (ARGUS-IS),” Proc. SPIE 6981, 69810H (2008).
[18]Nicholas M. Law, Octavi Fors, Jeffrey Ratzloff, Henry Corbett, Daniel del Ser, and Philip Wulfken, “The Evryscope: design and performance of the first full-sky gigapixel-scale telescope,” Proc. SPIE 9906, Ground-based and Airborne Telescopes VI, 99061M (8 August 2016).
[19]D. J. Brady and N. Hagen, “Multiscale lens design,” Opt. Express 17, 10659–10674 (2009).
[20]O. S. Cossairt, D. Miau, and S. K. Nayar, “Gigapixel computational imaging,” in Computational Photography (ICCP), 2011 IEEE International Conference on, (IEEE, 2011), 1–8.
[21]David S. Kittle, Daniel L. Marks, and David J. Brady, “Automated calibration and optical testing of the AWARE-2 gigapixel multiscale camera,” Proc. SPIE 8660, Digital Photography IX, 866006 (2013).
[22]D. L. Marks, H. S. Son, J. Kim, and D. J. Brady, “Engineering a gigapixel monocentric multiscale camera,” Opt. Eng. 51, 083202 (2012).
[23]H. S. Son, A. Johnson, R. A. Stack, J. M. Shaw, P. McLaughlin, D. L. Marks, D. J. Brady, and J. Kim, “Optomechanical design of multiscale gigapixel digital camera,” Appl. Opt. 52(8), 1541–1549 (2013).
[24]E. J. Tremblay, D. L. Marks, D. J. Brady, and J. E. Ford, “Design and scaling of monocentric multiscale imagers,” Appl. Opt. 51, 4691–4702 (2012).
[25]D. J. Brady, M. E. Gehm, R. A. Stack, D. L. Marks, D. S. Kittle, D. R. Golish, E. M. Vera, and S. D. Feller, "Multiscale gigapixel photography," Nature 486, 386-389 (2012).
[26]D. L. Marks, P. R. Llull, Z. Phillips, J. G. Anderson, S. D. Feller, E. M. Vera, H. S. Son, S.-H. Youn, J. Kim, M. E. Gehm, D. J. Brady, J. M. Nichols, K. P. Judd, M. D. Duncan, J. R. Waterman, R. A. Stack, A. Johnson, R. Tennill, and C. C. Olson, “Characterization of the AWARE 10 two-gigapixel wide-field-of-view visible imager,” Appl. Opt. 53, C54-C63 (2014).
[27]Patrick Llull, Lauren Bange, Zachary Phillips, Kyle Davis, Daniel L. Marks, and David J. Brady, “Characterization of the AWARE 40 wide-field-of-view visible imager,” Optica 2, 1086-1089 (2015).
[28]D. L. Marks, H. S. Son, Z. F. Phillips, S. D. Feller, J. Kim, and D. J. Brady, “Multiscale Camera Objective with sub 2 Arcsec Resolution, 36 degree Field-of-View,” in Classical Optics 2014, OSA Technical Digest (online) (Optica Publishing Group, 2014), paper CTh1C.3.
[29]Wikipedia, Convex lens. Retrieved 31 March 2023, from https://en.wikipedia.org/wiki/Image:Large_convex_lens.jpg
[30]Wikipedia, Concave lens. Retrieved 31 March 2023, from https://commons.wikimedia.org/wiki/File:Concave_lens.jpg
[31]W. J. Smith, Modern Optical Engineering: the Design of Optical Systems, 4th ed., Optical and Electro-Optical Engineering Series, Chap. 3, (McGraw-Hill, 2008).
[32]H. S. Son, A. Johnson, R. A. Stack, J. M. Shaw, P. McLaughlin, D. L. Marks, D. J. Brady, and J. Kim, “Optomechanical design of multiscale gigapixel digital camera,” Appl. Opt. 52(8), 1541–1549 (2013).
[33]Eric J. Tremblay, Daniel L. Marks, David J. Brady, Joseph E. Ford, “Design and scaling of monocentric multiscale imagers,” Appl. Opt. 51, 4691-4702 (2012).
[34]D. L. Marks, H. S. Son, J. Kim, and D. J. Brady, “Engineering a gigapixel monocentric multiscale camera,” Opt. Eng. 51, 083202 (2012).
[35]Eric J. Tremblay, Daniel L. Marks, David J. Brady, Joseph E. Ford, “Design and scaling of monocentric multiscale imagers,” Appl. Opt. 51, 4691-4702 (2012).
[36]E. J. Tremblay, D. L. Marks, D. J. Brady, and J. E. Ford, “Design and scaling of monocentric multiscale imagers,” Appl. Opt. 51, 4691–4702 (2012).
[37]B. W. Clare and D. L. Kepert, “The closest packing of equal circles on a sphere,” Proc. R. Soc. A 405, 329–344 (1986).
[38]Hui S. Son, Daniel L. Marks, Seo H. Youn, David J. Brady, and Jungsang Kim, “Alignment and assembly strategies for AWARE-10 gigapixel-scale cameras,” Proc. SPIE 8836, Optomechanical Engineering 2013, 88360B (2013).
[39]D. L. Marks, H. S. Son, J. Kim, and D. J. Brady, “Engineering a gigapixel monocentric multiscale camera,” Opt. Eng. 51, 083202 (2012).
[40]onsemi, “MT9F002,” retrieved 17 February 2022, from https://www.mouser.com/datasheet/2/308/MT9F002-D-606237.pdf
[41]D. Peric, ‘‘Thermal imager range: predictions, expectations, and reality,’’ Sensors Vol. 19 (2019).
[42]A. Daniels, “Filed guide to infrared systems,” SPIE Field Guides Volume FG09, SPIE Press, 2007.
[43]E. Hecht, Optics, Global Edition, 5e. Pearson Education. Chap. 6, (2017).
[44]W. J. Smith, Modern Optical Engineering: the Design of Optical Systems, 4th ed., Optical and Electro-Optical Engineering Series, Chap. 11, (McGraw-Hill, 2008).
[45]W. J. Smith, Modern Optical Engineering: the Design of Optical Systems, 4th ed., Optical and Electro-Optical Engineering Series, Chap. 3, (McGraw-Hill, 2008).
[46]W. J. Smith, Modern Optical Engineering: the Design of Optical Systems, 4th ed., Optical and Electro-Optical Engineering Series, Chap. 6, (McGraw-Hill, 2008).
[47]E. Hecht, Optics, Global Edition, 5e. Pearson Education. Chap. 6, (2017).
[48]黃前銘,四百萬畫素DLP大口徑投影機鏡頭設計與溫度、電視畸變、橫向色差、相對照度之探討,碩士論文,國立中央大學光電科學與工程學系,中華民國107年。
[49]Oliver S. Cossairt, Daniel Miau, Shree K. Nayar, “Scaling law for computational imaging using spherical optics,” J. Opt. Soc. Am. A 28, 2540-2553 (2011).
[50]Marks, D.L.; Son, H.S.; Kim, J.; Brady, D.J. Engineering a gigapixel monocentric multiscale camera. Opt. Eng. 2012, 51, 083202.
[51]Geunyoung Yoon, Ph.D., “Aberration theory”, retrieved 23 April 2023, from http://cfao.ucolick.org/pubs/presentations/eyedesign/05_aberrations_GY.pdf
[52]国広 浄保, "変倍接眼レンズ", 日本国特許庁, 特公昭58-27481, 1983.
[53]Wikipedia, Horace Lee′s asymmetric double-Gauss lens design. Retrieved 11 June 2023, from https://en.wikipedia.org/wiki/File:Lee_GB157040_(OPIC,_1920).svg#file
[54]池森 敬二, "リア・アタツチメントレンズ", 日本国特許庁, 特公昭61-13206, 1986.
[55]孫文信,2022 CODE V光學軟體實作(上冊),國立中央大學光電科學與工程學系,2022年。
[56]Daniel Malacara, Optical Shop Testing, 2nd ed. (Wiley, 1992).
[57]CODE V “Tolerancing Reference Manual” Version 2022.03 March 2022.
[58]Schott, “spherical-and-achromatic-lenses-product,” retrieved 4 May 2023, from https://mss-p-009-delivery.stylelabs.cloud/api/public/content/0ba1575fe22c4f39b4e3d3dddcf62f6d?v=b70f4a52&download=true
[59]Schott, “tie-04 test report for delivery lots of optical glass,” retrieved 4 May 2023, from https://mss-p-009-delivery.stylelabs.cloud/api/public/content/c6dabd8d3d65451a86902f326e22d0bc?v=78a28424&download=true
[60]楊家逢,模組化光學讀寫頭的設計與光學讀寫頭應用在角度量測的研究,碩士論文,國立中央大學光電科學與工程學系,中華民國96年。
[61]王士安,「有關玻璃鏡片特性探討與鏡片製造組裝公差對其鏡頭設計與成像品質分析」,碩士論文,國立中央大學光電科學與工程學系,中華民國112年。
Advisor: Wen-Shing Sun (孫文信)   Date of approval: 2023-07-24
