Thesis 109523070: Detailed Record




Name  Kuo-Pin Chou (周國斌)   Department  Department of Communication Engineering
Thesis Title  Deep Q-Learning Based Load Balancing Mechanism for Efficient Bandwidth Usage on Radio Unit in O-RAN Architecture
Related Theses
★ A Feature-Similarity-Based Search Method for Unstructured Peer-to-Peer Networks
★ A Mobile Peer-to-Peer Network Topology Based on Hierarchical Cluster Reputation
★ Design and Implementation of a Topic-Based Event Monitoring Mechanism over Online RSS News Streams
★ A Density-Aware Routing Method for Delay-Tolerant Networks
★ A Home Multimedia Gateway Integrating P2P and UPnP Content-Sharing Services: Design and Implementation
★ Design and Implementation of a Simple Seamless Audio/Video Streaming Playback Service for Home Networks
★ Message Delivery Time Analysis and High-Performance Routing Algorithm Design for Delay-Tolerant Networks
★ An Adjustable Allocation Method and Performance Measurement for Download-Side Network Resources in BitTorrent P2P File Systems
★ A Data Dissemination Mechanism Based on Message Coding and Reassembly Conditions in Delay-Tolerant Networks
★ A Routing Mechanism Based on Human Mobility Patterns in Delay-Tolerant Networks
★ A Packet Delivery Mechanism Using Data Aggregation to Improve Transmission Performance in Vehicular Networks
★ A Vehicle Clustering Method for Intersection Environments
★ A Message Broadcast Mechanism Assisted by Roadside Units in Vehicular Networks
★ Optimizing Message Delivery Performance with Static Relay Nodes (Throwboxes) in Delay-Tolerant Networks
★ A Message Delivery Mechanism Built on Dynamic Cluster Awareness in Delay-Tolerant Networks
★ Design and Implementation of a Cross-Device Multimedia Convergence Platform
  1. The author has agreed to make this electronic thesis openly available immediately.
  2. The open-access electronic full text is licensed to users solely for personal, non-profit retrieval, reading, and printing for the purpose of academic research.
  3. Please observe the relevant provisions of the Copyright Act of the Republic of China; do not reproduce, distribute, adapt, repost, or broadcast the work without authorization, so as not to break the law.

Abstract (Chinese)  In today's Industrial Internet of Things (IIoT) deployments, the rapid growth of IIoT devices has pushed network traffic demand up year after year, making traffic management extremely difficult. The O-RAN network architecture is well suited to this problem: compared with the traditional Radio Access Network (RAN), O-RAN modularizes the network, which lets network vendors design an architecture appropriate to each deployment environment. However, because an O-RAN deployment interconnects several base stations and network devices, its traffic is heavy, variable, and complex, and each network device has different quality-of-service (QoS) requirements. Traditional algorithms therefore struggle to control it, and bandwidth utilization becomes unevenly distributed across base stations. This study adopts the DQN model from reinforcement learning to solve this problem. The model dynamically and continuously adjusts its outputs through interaction with the environment; this study exploits that property to pair IIoT devices with O-RAN base stations according to the devices' QoS requirements, thereby controlling each base station's bandwidth utilization and achieving load balancing while QoS requirements are satisfied. The final experimental results show that, under the same connection environment, the proposed DQN-based O-RAN architecture distributes resource block usage across base stations more evenly than traditional algorithms oriented toward saving energy or saving bandwidth, preventing the latency that high traffic would otherwise cause. As for the device service rate, when the number of devices grows, the proposed method also allocates resource block usage properly, so the service rate degrades more gently than under the other methods.
Abstract (English)  In the Industrial Internet of Things (IIoT) field, the substantial increase in IIoT devices has driven network traffic upward and made it difficult to manage. The O-RAN architecture can effectively solve this problem. Compared with the traditional Radio Access Network (RAN), O-RAN's modularization of the network lets network vendors design corresponding network architectures for different environments. However, in the O-RAN architecture several base stations are interconnected with network equipment, the resulting traffic is heavy, variable, and complex, and each network device has different QoS requirements, so it is difficult to control with traditional algorithms, resulting in uneven allocation of bandwidth utilization among base stations. This research uses the DQN model in reinforcement learning to solve this problem. The model can dynamically and continuously adjust its output through interaction with the environment, and this research uses that property to meet the QoS requirements of IIoT devices: each device is paired with a corresponding O-RAN base station to control the bandwidth utilization of each base station, so as to achieve load balancing subject to QoS requirements. The results show that the O-RAN architecture with DQN proposed in this thesis outperforms a traditional energy-saving algorithm and a bandwidth-aware algorithm on indicators such as load balancing, bandwidth utilization, and device service rate.
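Note: the abstracts describe the mechanism only in prose. As a rough illustration, the following is a minimal, self-contained sketch of a DQN agent that pairs each arriving IIoT device with one of several base stations so that resource block (RB) usage stays balanced. It is written in Python with TensorFlow/Keras; the network sizes, state/action encoding, and reward shaping here are illustrative assumptions, not the author's actual design.

# Minimal DQN sketch (illustrative assumptions, not the thesis implementation).
# State: current RB usage of each base station plus the RB demand of the
# device being placed. Action: index of the base station to attach it to.
import random
from collections import deque

import numpy as np
import tensorflow as tf

N_BS = 4            # number of O-RAN base stations (assumed)
RB_CAPACITY = 100   # resource blocks available per base station (assumed)
GAMMA, EPSILON = 0.9, 0.1

def build_q_net():
    # Q-network: maps a state vector to one Q-value per candidate base station.
    return tf.keras.Sequential([
        tf.keras.Input(shape=(N_BS + 1,)),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(N_BS),
    ])

q_net = build_q_net()
target_net = build_q_net()                  # periodically synced copy
target_net.set_weights(q_net.get_weights())
optimizer = tf.keras.optimizers.Adam(1e-3)
replay = deque(maxlen=10_000)               # experience replay buffer

def reward(usage, action, demand):
    # Assumed shaping: reject overloads, otherwise prefer flat RB usage.
    usage = usage.copy()
    usage[action] += demand
    if usage[action] > RB_CAPACITY:
        return -10.0                        # QoS violated: station overloaded
    return -float(np.std(usage))            # flatter usage => higher reward

def select_action(state):
    # Epsilon-greedy policy over the Q-network's outputs.
    if random.random() < EPSILON:
        return random.randrange(N_BS)
    return int(tf.argmax(q_net(state[None, :], training=False)[0]))

def train_step(batch_size=32):
    # One standard DQN update: target = r + gamma * max_a' Q_target(s', a').
    if len(replay) < batch_size:
        return
    s, a, r, s2 = zip(*random.sample(replay, batch_size))
    s, s2 = np.float32(s), np.float32(s2)
    r, a = np.float32(r), np.int32(a)
    target = r + GAMMA * tf.reduce_max(target_net(s2), axis=1)
    with tf.GradientTape() as tape:
        q = tf.reduce_sum(q_net(s) * tf.one_hot(a, N_BS), axis=1)
        loss = tf.reduce_mean(tf.square(target - q))
    grads = tape.gradient(loss, q_net.trainable_variables)
    optimizer.apply_gradients(zip(grads, q_net.trainable_variables))

# Example: place one device demanding 8 RBs given per-station usage [30, 55, 20, 70].
state = np.float32([30, 55, 20, 70, 8])
action = select_action(state)
next_usage = state[:N_BS].copy()
next_usage[action] += state[-1]
next_state = np.float32(np.append(next_usage, 0.0))   # next demand arrives later
replay.append((state, action, reward(state[:N_BS], action, state[-1]), next_state))
train_step()

Penalizing the standard deviation of per-station RB usage is one simple stand-in for the load-balancing objective; the thesis's actual problem formulation and reward design are given in Chapter 3 (Sections 3.2 and 3.3).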
Keywords ★ Load Balancing
★ Bandwidth Resource Allocation
★ Reinforcement Learning
★ O-RAN
★ Edge Computing
★ Industrial IoT
★ Mobile Computing
Table of Contents
Abstract (Chinese)
Abstract (English)
Acknowledgements
List of Figures
List of Tables
1 Introduction
1.1 Preface
1.2 Research Motivation
2 Background and Related Work
2.1 The O-RAN Architecture
2.1.1 The SMO and Non-RT RIC Modules in O-RAN
2.1.2 The Near-RT RIC Module in O-RAN
2.1.3 A Simple Architecture Implementation on O-RAN
2.2 Reinforcement Learning Techniques Running on O-RAN
2.2.1 A Basic Introduction to Reinforcement Learning
2.2.2 The DQN Model in Reinforcement Learning
2.2.3 The Reinforcement Learning Workflow in O-RAN
2.2.4 Guidelines for Reinforcement Learning in O-RAN
2.3 Review of the O-RAN Literature
3 System Architecture
3.1 An IIoT System Design Based on the O-RAN Architecture
3.2 Problem Definition
3.3 A DQN-Based Load Balancing Algorithm for Resource Block Usage
4 Experiments and Result Analysis
4.1 Experimental Environment
4.2 Experiment Design
4.2.1 Data Preprocessing and Input
4.2.2 The DQN Model Training Process
4.2.3 Convergence of the DQN Model
4.2.4 Comparison of the DQN Method with a Global Algorithm
4.3 Experimental Results and Analysis
4.3.1 Simulation Results
5 Conclusions and Future Research
References
Advisor  Chih-Lin Hu (胡誌麟)   Date of Approval  2022-9-20
