Complete Metadata Record for Thesis 110522168

DC Field | Value | Language
dc.contributor | 資訊工程學系 (Department of Computer Science and Information Engineering) | zh_TW
dc.creator | 陳家偉 | zh_TW
dc.creator | CHIA-WEI CHEN | en_US
dc.date.accessioned | 2024-01-25T07:39:07Z |
dc.date.available | 2024-01-25T07:39:07Z |
dc.date.issued | 2024 |
dc.identifier.uri | http://ir.lib.ncu.edu.tw:444/thesis/view_etd.asp?URN=110522168 |
dc.contributor.department | 資訊工程學系 (Department of Computer Science and Information Engineering) | zh_TW
dc.description | 國立中央大學 (National Central University) | zh_TW
dc.description | National Central University | en_US
dc.description.abstract | As the demand for audio-visual media continues to grow, the super-resolution field is becoming increasingly important. Transformer models in particular have attracted wide attention in computer vision for their excellent performance, and a growing number of studies apply them to this field. However, we found that although Transformers can address limited feature learning by adding attention of different mechanisms, some textures and structures may still be lost during training. To preserve the initial features and structures as much as possible, we propose a method that integrates Residual Connection, Attention Mechanism, and Upscaling Technique. To validate the performance of our method, we conducted multiple experiments on five different datasets and compared against existing advanced super-resolution models. The results show that our method outperforms the current state-of-the-art models in this field. | zh_TW
dc.description.abstract | As the demand for audio-visual media continues to grow, the significance of the super-resolution field is increasingly recognized. In particular, Transformer models have garnered widespread attention in computer vision due to their exceptional performance, leading to their growing application in this area. However, we observed that despite the ability of Transformers to address the issue of limited feature learning through various attention mechanisms, some textures and structures may be lost during the learning process. To maximally preserve the initial features and structures, we propose a system, named Integrated Attention Transformer (IAT), that integrates Residual Connection, Attention Mechanism, and Upscaling Technique. To confirm the efficacy of IAT, experiments were conducted on five different datasets and compared with current state-of-the-art (SOTA) super-resolution models. The results show that the proposed IAT surpasses the current SOTA models. | en_US
dc.subject | 超解析度 (Super-Resolution) | zh_TW
dc.subject | Super Resolution | en_US
dc.subject | Transformer | en_US
dc.title | Channel Spatial Attention-based Transformer for Image Super-Resolution | en_US
dc.language.iso | en_US | en_US
dc.type | 博碩士論文 (Graduate Thesis) | zh_TW
dc.type | thesis | en_US
dc.publisher | National Central University | en_US
