One of the biggest challenges in Natural Language Processing (NLP) tasks is the scarcity of training data. Intent detection is a classic NLP task that spans multiple domains and is a core component of natural language understanding in dialogue systems, yet it frequently suffers from this data scarcity. Previous studies have tackled the problem with text space-based data augmentation methods, such as back translation and Easy Data Augmentation (EDA) (Wei and Zou, 2019), and with feature space-based approaches, including CVAE, extrapolation, and Gaussian noise. Kumar et al. (2021) noted in their future work that their augmentation technique could be combined with latent space data augmentation, and that unifying augmentation methods from different perspectives might inspire more powerful, universal approaches. Motivated by this, we propose a hybrid architecture that integrates both text space and feature (latent) space data augmentation. The aim of this hybrid structure is to strengthen the model's generalization across domains and to improve the efficiency of data augmentation. In addition, our experiments examine the quality of the augmented data and its effect on intent-label classification performance. Compared with a baseline that applies only text space data augmentation, the proposed hybrid architecture yields consistent and stable improvements across all three datasets, confirming its strong generalization ability and effectiveness.
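To make the hybrid idea concrete, the following is a minimal illustrative sketch (not the architecture proposed in this work): an EDA-style word-dropout step operates in text space, and Gaussian noise on a sentence embedding operates in feature space. The encoder, function names, and parameters are illustrative assumptions; a real pipeline would use a pretrained sentence encoder and the paper's actual augmentation modules.

# Minimal sketch of combining text space and feature space augmentation.
# All names and parameters here are hypothetical, for illustration only.
import random
import numpy as np

def text_space_augment(sentence: str, p_drop: float = 0.1) -> str:
    """EDA-style random word deletion: drop each token with probability p_drop."""
    tokens = sentence.split()
    kept = [t for t in tokens if random.random() > p_drop]
    return " ".join(kept) if kept else sentence  # never return an empty utterance

def encode(sentence: str, dim: int = 32) -> np.ndarray:
    """Placeholder encoder producing a deterministic pseudo-embedding.
    A real setup would use a pretrained sentence encoder (e.g. BERT) instead."""
    rng = np.random.default_rng(abs(hash(sentence)) % (2**32))
    return rng.standard_normal(dim)

def feature_space_augment(embedding: np.ndarray, sigma: float = 0.05) -> np.ndarray:
    """Feature space augmentation: perturb the embedding with Gaussian noise."""
    return embedding + np.random.normal(0.0, sigma, size=embedding.shape)

# Hybrid usage: augment in text space first, then again in feature space,
# so one labelled utterance yields extra training points for the intent classifier.
utterance, intent = "play some relaxing jazz music", "PlayMusic"
augmented_text = text_space_augment(utterance)
augmented_embedding = feature_space_augment(encode(augmented_text))
print(intent, augmented_text, augmented_embedding.shape)

In such a setup, the intent classifier is trained on both the original embeddings and the perturbed ones, which is one simple way the two augmentation spaces can be chained rather than used in isolation.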