dc.description.abstract | With the rapid development of network technology and the proliferation of 5G networks and cloud services, the number of smartphones, smart wearables, and Internet of Things (IoT) devices is growing exponentially. The digitization of personal information, financial transactions, and payment methods has brought people significant convenience while also giving attackers more opportunities and means to strike. As a result, information security has become critically important in practice. To meet the high-speed, low-latency demands of modern networks, the response time of an Intrusion Detection System (IDS) is a crucial performance metric. Traditional detection methods rely on analyzing high-dimensional data, which is computationally expensive and fails to meet real-time requirements. The feasibility of deploying such complex models on edge devices also remains uncertain, because these devices typically lack strong computing power.
To address the high computational cost and long response time of traditional detection methods, this paper proposes an efficient hybrid model, the Encoder and Multi-head Attention (EMA) model. The model uses an autoencoder to reduce the dimensionality of the original network traffic, so that a low-dimensional representation captures the original data more efficiently and computational cost drops significantly. It then applies a multi-head attention mechanism that computes the relationships among features in the low-dimensional data to identify the key factors and strengthen their weights. Finally, residual connections augment the reduced representation, compensating for the significant information loss that dimensionality reduction can cause; a minimal code sketch of this pipeline is given after the abstract.
To validate the effectiveness of the proposed method, experiments were conducted on the UNSW-NB15 dataset. The results show that, compared with the best-performing GRU model among traditional intrusion detection methods, the accuracy-prioritized EMA model maintains an accuracy of 98.48% at low computational cost while reducing training time by 85.41%, prediction time by 60.24%, peak CPU usage by 15.20%, and average CPU usage by 42.31%. The speed-prioritized EMA model, sacrificing 2.10% accuracy, reduces training time by 93.13%, prediction time by 64.69%, peak CPU usage by 29.48%, and average CPU usage by 42.31%. These results substantially lower the computational cost and response time for which traditional detection methods are criticized, improve the feasibility of deploying the model on edge devices with limited computing power, and provide an efficient and practical solution for modern network security protection. | en_US |
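
The listing below is a minimal, hypothetical sketch (in PyTorch) of the encoder + multi-head attention + residual pipeline described in the abstract. The layer sizes (input_dim=196, latent_dim=32, d_model=8, num_heads=4), the token-per-feature attention layout, and the class names are illustrative assumptions for exposition only, not the paper's actual configuration.

    import torch
    import torch.nn as nn

    class EMASketch(nn.Module):
        """Sketch: autoencoder reduction + multi-head attention + residual connection."""

        def __init__(self, input_dim=196, latent_dim=32, d_model=8,
                     num_heads=4, num_classes=2):
            super().__init__()
            # Autoencoder encoder: compress raw traffic features to latent_dim.
            self.encoder = nn.Sequential(
                nn.Linear(input_dim, 64), nn.ReLU(),
                nn.Linear(64, latent_dim), nn.ReLU(),
            )
            # Decoder, used only for the reconstruction objective when
            # pre-training the autoencoder (not called in forward below).
            self.decoder = nn.Sequential(
                nn.Linear(latent_dim, 64), nn.ReLU(),
                nn.Linear(64, input_dim),
            )
            # Treat each latent feature as a token so the attention scores
            # capture pairwise relationships between the reduced features.
            self.embed = nn.Linear(1, d_model)
            self.attn = nn.MultiheadAttention(d_model, num_heads, batch_first=True)
            self.project = nn.Linear(d_model, 1)
            # Classifier head on the residual-enhanced representation.
            self.classifier = nn.Linear(latent_dim, num_classes)

        def forward(self, x):
            z = self.encoder(x)                            # (batch, latent_dim)
            tokens = self.embed(z.unsqueeze(-1))           # (batch, latent_dim, d_model)
            attended, _ = self.attn(tokens, tokens, tokens)
            weighted = self.project(attended).squeeze(-1)  # (batch, latent_dim)
            enhanced = z + weighted                        # residual connection
            return self.classifier(enhanced)

    # Example usage on a dummy batch of 4 flows with 196 features each.
    model = EMASketch()
    logits = model(torch.randn(4, 196))
    print(logits.shape)  # torch.Size([4, 2])

Under these assumptions, the residual addition `z + weighted` is what reintroduces the encoder output alongside the attention-weighted features, which is one plausible reading of how the abstract's residual connection offsets the information lost during dimensionality reduction.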