Master's/Doctoral Thesis 966203004: Detailed Record




Name: Shao-ang Chen (陳少昂)    Department: Institute of Space Science (太空科學研究所)
Thesis Title: 新的影像融合演算法應用於多光譜遙測影像 (A Novel Image Fusion Algorithm for Multispectral Remotely Sensed Image)
Related Theses
★ Anomaly Detection Using Hyperspectral Imagery
★ Automated Lineament Detection from Digital Elevation Models
★ Marine Oil Spill Detection Using the Spectral and Spatial Information of Multispectral Images Combined with Mathematical Morphology
★ Automatic Extraction of Control Points from Remotely Sensed Imagery
★ Real-Time Debris Flow Detection Using Fixed Cameras
★ Automatic Debris Flow Detection by Computer Vision
★ Analysis of Canopy Structure from Full-Waveform LiDAR Using a Multi-Layer Model
★ Radiometric Correction of Multispectral Images Using MHE and Its Application to Debris Flow Change Detection
★ Simulation and Experiments of Single-Transmit Multiple-Receive Synthetic Aperture Radar
★ Implementation of FORMOSAT-5 Satellite Image Compression
★ A Study of Electromagnetic Scattering Models for Rough Surfaces
★ Ultra-Wideband Ka-Band Scattering Measurements and Analysis of Trees
★ 3D Reconstruction by Automatic Feature Point Extraction from Stereo Image Pairs
★ Comparison of Change Detection Methods Based on the Spatial Chaotic Model for Synthetic Aperture Radar Imagery
★ Detecting the Orientation of Dish Antennas from Remotely Sensed Imagery
★ A Comparison of Super-Resolution Imaging Methods Applied to Optical Satellite Images
  1. Access permission for this electronic thesis: immediate open access has been granted.
  2. The open-access electronic full text is licensed to users only for personal, non-profit retrieval, reading, and printing for the purpose of academic research.
  3. Please comply with the relevant provisions of the Copyright Act of the Republic of China (Taiwan); do not reproduce, distribute, adapt, repost, or broadcast the work without authorization.

Abstract (Chinese) Optical satellite imagery generally includes a panchromatic image and a multispectral image. The panchromatic image has higher spatial resolution, while the multispectral image has lower spatial resolution. However, the multispectral image has more spectral bands, which can be used to describe the reflectance spectra of ground materials, whereas the panchromatic image has only a single band whose wider bandwidth allows it to provide higher spatial resolution.
Image fusion combines the multispectral and panchromatic images of the same area to produce a multispectral image with high spatial resolution, thereby enriching the spatial detail of the multispectral data. Many image fusion algorithms have been proposed in recent years, and the quality of their fused images varies considerably. This study proposes a new image fusion algorithm, carries out fusion experiments with SPOT-5 and FORMOSAT-2 images, and compares the quality of the resulting fused images with that of other algorithms. In addition, material classification is performed on the different fused images to compare each algorithm's classification results with those of the original image. Finally, a solution is proposed for the blocking effect that commonly appears in fused images.
The experimental results show that the proposed image fusion method is automatic and highly efficient, and its fused images perform well both visually and in terms of image quality.
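The abstract describes the general fusion framework (injecting the spatial detail of the panchromatic band into the resampled multispectral bands) without giving the proposed algorithm itself. As a rough illustration of that framework only, the sketch below implements the fast IHS baseline that the thesis reviews, not the proposed method; the function name, the equal-weight intensity, and the NumPy interface are assumptions for this example.

    import numpy as np

    def fast_ihs_fusion(ms, pan):
        """Baseline fast-IHS style pansharpening (illustrative sketch only).

        ms  : float array, shape (rows, cols, bands); multispectral cube already
              resampled to the panchromatic grid.
        pan : float array, shape (rows, cols); panchromatic band.
        Returns a fused cube of the same shape as `ms`.
        """
        # Intensity component: equal-weight average of the multispectral bands.
        intensity = ms.mean(axis=2)
        # Spatial detail carried by the panchromatic band relative to the intensity.
        detail = pan - intensity
        # Inject the same detail image into every band (generalized IHS form).
        return ms + detail[:, :, np.newaxis]

Wavelet-based and PCA-based schemes replace the intensity-substitution step with a different decomposition, but they follow the same idea of adding panchromatic detail to each multispectral band.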
Abstract (English) Multispectral sensors on satellites collect radiance from the ground in several frequency bands to form a co-registered image cube. This cube provides useful information for spectral analysis, but it usually has low spatial resolution. On the other hand, remote sensing satellites also collect panchromatic images with a wider bandwidth but higher spatial resolution.
The image fusion technique combines the panchromatic and multispectral images into a high spatial resolution multispectral image, which presents more spatial detail of the ground. In recent years, many image fusion methods have been proposed, including the fast intensity-hue-saturation method, Choi's method, the principal component analysis method, and wavelet-based methods. However, the images fused by these methods suffer from problems such as color distortion and contrast variation. In this study, we propose a new image fusion method and compare it with other methods using various image quality indexes. We also apply an endmember classifier to each fused image and measure the difference between the fused results and the original one. Finally, a deblocking method is provided to reduce the blocking effect in the fused image. In the experiments, SPOT-5 and FORMOSAT-2 image scenes are used to demonstrate the effectiveness of the proposed algorithm. It provides a procedure for automatic image fusion and yields good quality and performance.
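The abstract mentions comparison "by various image quality indexes" without naming them in this record. Purely as an example of such an index, the per-band correlation coefficient between a fused image and a reference multispectral image is sketched below; the function and argument names are assumptions, and this is not necessarily one of the indexes used in the thesis.

    import numpy as np

    def band_correlation(fused, reference):
        """Per-band Pearson correlation between a fused cube and a reference cube.

        Both inputs are float arrays of shape (rows, cols, bands) on the same grid.
        Returns one correlation value per band; values closer to 1 indicate that
        the fused band better preserves the reference band.
        """
        bands = fused.shape[2]
        cc = np.empty(bands)
        for b in range(bands):
            f = fused[:, :, b].ravel()
            r = reference[:, :, b].ravel()
            # Off-diagonal element of the 2x2 correlation matrix is Pearson's r.
            cc[b] = np.corrcoef(f, r)[0, 1]
        return cc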
Keywords (Chinese) ★ 影像融合 (Image fusion)
★ 多光譜影像 (Multispectral image)
★ 區塊效應 (Blocking effect)
★ 分類 (Classification)
Keywords (English) ★ Classification
★ Blocking effect
★ Multispectral image
★ Image fusion
Table of Contents
Abstract (Chinese)
Abstract (English)
Contents
List of Figures
List of Tables
Chapter 1 Introduction
1.1 General background and motivation
1.2 Literature review
1.3 Purpose of research
1.4 Overview
Chapter 2 Review of Image Fusion Methods
2.1 Intensity-Hue-Saturation image fusion method
2.1.1 Gonzalez and Woods IHS color model
2.1.2 The kernel-based IHS color model
2.2 IHS image fusion method with a tradeoff parameter
2.3 Wavelet image fusion method
2.3.1 Substitution Method
2.3.2 Additive Method
2.4 Principal component analysis image fusion method
Chapter 3 The Proposed Image Fusion Algorithm
3.1 Problems
3.2 The steps of the proposed method
Chapter 4 Experimental Results and Discussions
4.1 Data information
4.2 The method of experiment
4.2.1 The source of the ground truth image
4.2.2 The setting of the proposed image fusion method
4.2.3 The setting of the wavelet-based image fusion method
4.2.4 Image resampling processing
4.3 The flowchart of the experiment
4.4 The visual image fusion results
4.4.1 FORMOSAT-2 image fusion results
4.4.2 SPOT-5 image fusion results
4.5 Image quality analysis
4.6 Material classification results and analysis
4.6.1 Review of the classification methods
4.6.2 The classification result
4.7 The deblocking method for the fused image
4.8 The computing time
Chapter 5 Conclusion
References
References
[1] Zhang Y. and G. Hong, "An IHS and wavelet integrated approach to improve pan-sharpening visual quality of natural colour IKONOS and QuickBird images," Inf. Fusion, vol. 6, no. 3, pp. 225–234, Sep. 2005.
[2] Haydn, R., G. W. Dalke, and J. Henkel, “Application of the IHS color transform to the processing of multisensor data and image enhancement,” in Proc. Int. Symp. Remote Sens. Arid and Semi-Arid Lands, Cairo, Egypt, pp. 599–616, Jan. 1982.
[3] Edwards K. and P. A. Davis, "The use of intensity–hue–saturation transformation for producing color shaded relief images," Photogramm. Eng. Remote Sens., vol. 60, no. 11, pp. 1369–1374, 1994.
[4] Van Genderen J. L. and C. Pohl, “Image fusion: Issues, techniques and applications. Intelligent image fusion,” in Proc. EARSeL Workshop, Strasbourg, France, pp. 18–26, Sep. 1994.
[5] Pohl C. and J. L. Van Genderen, "Multisensor image fusion in remote sensing: Concepts, methods and applications," Int. J. Remote Sens., vol. 19, no. 5, pp. 823–854, 1998.
[6] Tu T. M., S. C. Su, H. C. Shyu, and P. S. Huang, "A new look at IHS-like image fusion methods," Inf. Fusion, vol. 2, no. 3, pp. 177–186, Sep. 2001.
[7] Pellemans A., R. Jordans, and R. Allewijn, “Merging multispectral and panchromatic SPOT images with respect to the radiometric properties of the sensor,” Photogramm. Eng. Remote Sens., vol. 59, no. 1, pp. 81–87, 1993.
[8] Wang Z., D. Ziou, C. Armenakis, D. Li, and Q. Li, “Comparative analysis of image fusion methods,” IEEE Trans. Geosci. Remote Sens., vol. 43, no. 6, pp. 1391–1402, Jun. 2005.
[9] Tu T. M., P. S. Huang, C.-L. Hung, and C.-P. Chang, “A fast intensity–hue–saturation fusion technique with spectral adjustment for IKONOS imagery,” IEEE Geosci. Remote Sens. Lett., vol. 1, no. 4, pp. 309–312, Oct. 2004.
[10] Choi M., "A new intensity–hue–saturation fusion approach to image fusion with a tradeoff parameter," IEEE Trans. Geosci. Remote Sens., vol. 44, no. 6, Jun. 2006.
[11] Yang T. L., "Multi-source remote sensing imagery fusion and color adjustment," Dept. of Communication Eng., National Central University, Taoyuan County, Taiwan, 2009.
[12] Liao W. H., "Image fusion application of multi-level wavelet," Master's thesis, Dept. of Civil Eng., National Central University, Taoyuan County, Taiwan, 2004.
[13] Zhang Y., “A new merging method and its spectral and spatial effects,” Int. J. Remote Sens., vol. 20, no. 10, pp. 2003–2014, Jul. 1999.
[14] Zhou J., D. L. Civco, and J. A. Silander, "A wavelet transform method to merge Landsat TM and SPOT panchromatic data," Int. J. Remote Sens., vol. 19, no. 4, pp. 743–757, Mar. 1998.
[15] Wilson T. A., S. K. Rogers, and M. Kabrisky, “Perceptual based image fusion for hyperspectral data,” IEEE Trans. Geosci. Remote Sens., vol. 35, no. 4, pp. 1007–1017, Jul. 1997.
[16] Aiazzi B., L. Alparone, S. Baronti, and A. Garzelli, “Context-driven fusion of high spatial and spectral resolution images based on oversampled multiresolution analysis,” IEEE Trans. Geosci. Remote Sens., vol. 40, no. 10, pp. 2300–2312, Oct. 2002.
[17] Starck J. L. and F. Murtagh, “Image restoration with noise suppression using the wavelet transform,” Astron. Astrophys., vol. 288, pp. 342–350, 1994.
[18] Chibani Y. and A. Houacine, “The joint use of IHS transform and redundant wavelet decomposition for fusing multispectral and panchromatic images,” Int. J. Remote Sens., vol. 23, no. 18, pp. 3821–3833, Sep. 2002.
[19] Lillo-Saavedra M. and C. Gonzalo, “Spectral or spatial quality for fused satellite imagery? A trade-off solution using the wavelet á trous algorithm,” Int. J. Remote Sens., vol. 27, no. 7, pp. 1453–1464, Apr. 2006.
[20] Núñez J., X. Otazu, O. Fors, A. Prades, V. Palà, and R. Arbiol, "Multiresolution-based image fusion with additive wavelet decomposition," IEEE Trans. Geosci. Remote Sens., vol. 37, no. 3, pp. 1204–1211, Mar. 1999.
[21] Núñez J., X. Otazu, O. Fors, and A. Prades, “Fusion and reconstruction of LANDSAT and SPOT images using wavelets,” in Proc. Fusion of Earth Data, Sophia Antipolis, France, pp. 103–108, Jan. 1998.
[22] Gonzalez R. C. and R. E. Woods, Digital Image Processing. Upper Saddle River, NJ: Prentice Hall, pp. 289–302, 2001.
[23] Heinz D. C. and C.-I Chang, "Fully constrained least squares linear spectral mixture analysis method for material quantification in hyperspectral imagery," IEEE Trans. Geosci. Remote Sens., vol. 39, no. 3, pp. 529–545, Mar. 2001.
[24] Chang C.-I, Hyperspectral Imaging: Techniques for Spectral Detection and Classification. Springer, 2003.
[25] Kalpoma K. A. and J. I. Kudoh, "Image fusion processing for IKONOS 1-m color imagery," IEEE Trans. Geosci. Remote Sens., vol. 45, no. 10, pp. 3075–3086, Oct. 2007.
Advisor: Hsuan Ren (任玄)    Date of Approval: 2009-07-23
