Thesis 111522138 Detailed Record




Author: Yi-Hsun Huang (黃意勛)    Department: Computer Science and Information Engineering
Thesis Title: Planting a Forest in Sky: Harnessing Parallelism in Skyrmion Racetrack Memory for Efficient Random Forest Data Placement
Full text available in the system after 2027-6-1.
Abstract (Chinese) Skyrmion-based racetrack memory (Sky-RM) has attracted attention for its potential to offer high storage density, which is especially important today as data storage demands continue to grow. Sky-RM exploits the nanoscale size of skyrmions to significantly reduce physical space requirements compared with conventional memory technologies. However, because of its unique characteristics, including shifting, generation, and elimination, Sky-RM's overall performance can fall significantly below that of conventional dynamic random access memory (DRAM) without effective algorithms to manage these operations.
This thesis therefore develops a data placement scheme with high parallelism, low latency, low energy consumption, and high space utilization. In addition, we apply our method to a random forest-based machine learning framework to examine whether high parallelism and reductions in latency and energy consumption are achieved.
Abstract (English) Skyrmion-based Racetrack Memory (Sky-RM) has gained attention due to its potential to offer high storage density, which is increasingly critical as the demand for data storage continues to grow. Sky-RM leverages the nanoscale size of skyrmions, allowing for a significant reduction in physical space requirements compared to traditional memory technologies. However, due to its unique characteristics, including shifting, generating, and eliminating, Sky-RM can exhibit significantly poorer overall performance than traditional DRAM if it lacks a robust algorithm to manage these operations effectively. Therefore, this thesis is dedicated to developing a placement strategy with high parallelism, low latency, low energy consumption, and high space utilization. Furthermore, we integrate our method into a random forest-based machine learning framework to verify whether high parallelism and reductions in latency and energy consumption have been achieved.
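To make the cost of the shift operations described in the abstract concrete, here is a minimal, illustrative Python model (not the thesis's placement strategy): a single racetrack with one fixed access port, where reading a position costs one shift per cell of distance from the current alignment. The `Racetrack` class and `total_shifts` helper are hypothetical names introduced only for this sketch.

```python
# Illustrative sketch: why data placement matters on a shift-based racetrack.
# A single track with one fixed access port; reading position i requires
# shifting the track by |i - head| cells, leaving it aligned at i.

class Racetrack:
    def __init__(self, length=64):
        self.length = length
        self.head = 0          # current offset of the track under the port
        self.shift_count = 0   # total shift operations performed

    def read(self, pos):
        """Shift the track so `pos` sits under the access port, then read."""
        self.shift_count += abs(pos - self.head)
        self.head = pos

def total_shifts(access_pattern):
    track = Racetrack()
    for pos in access_pattern:
        track.read(pos)
    return track.shift_count

# Contiguous placement: each access moves the track by one cell.
sequential = total_shifts(range(8))                    # 7 shifts
# Scattered placement: the track ping-pongs end to end.
scattered = total_shifts([0, 63, 1, 62, 2, 61, 3, 60])  # 420 shifts
```

Under this toy cost model, reading the same eight values costs 7 shifts when they are placed contiguously but 420 when they are scattered, which is the kind of gap a placement strategy aims to close.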
Keywords (Chinese) ★ Skyrmion (斯格明子)
★ Racetrack memory (賽道記憶體)
★ Random forest (隨機森林)
★ High parallelism (高平行度)
Keywords (English) ★ Skyrmion
★ Racetrack memory
★ Random forest
★ Parallelism
Table of Contents
1 Introduction 1
2 Technical Background & Motivation 3
2.1 Skyrmion-based Racetrack Memory . . . 3
2.2 Random Forest-based Machine Learning . . . 5
2.3 Related Work . . . 5
2.4 Motivation . . . 7
3 Method 9
3.1 Method Overview . . . 9
3.2 Parallel Data Placement . . . 10
3.2.1 Parallel In Skyrmion Racetrack Memory Unit (PIU) . . . 10
3.2.2 Parallel On Skyrmion Racetrack Memory Unit (POU) . . . 11
3.3 Node instance sort . . . 11
3.4 Buffer management . . . 12
3.4.1 Buffer's architecture . . . 12
3.4.2 Data region . . . 13
3.4.3 PIU buffer basic set up . . . 13
3.4.4 POU buffer basic set up . . . 14
3.5 Data writing . . . 14
3.5.1 PIU data writing . . . 14
3.5.2 POU data writing . . . 15
3.6 Data Reading . . . 15
4 Experiments 16
4.1 Environmental Settings . . . 16
4.2 Findings and Results . . . 17
4.2.1 Execution Latency . . . 18
4.2.2 Execution Energy Consumption . . . 19
4.2.3 Space Utilization . . . 20
5 Concluding Remarks 22
Bibliography 23
References
[1] Ya-Hui Yang, Shuo-Han Chen, and Yuan-Hao Chang. “Evolving Skyrmion
Racetrack Memory as Energy-Efficient Last-Level Cache Devices”. In: ISLPED
’22. Boston, MA, USA: Association for Computing Machinery, 2022. ISBN:
9781450393546.
[2] Wang Kang et al. “A Comparative Cross-Layer Study on Racetrack Memories:
Domain Wall vs Skyrmion”. In: J. Emerg. Technol. Comput. Syst. 16.1 (2019). ISSN:
1550-4832.
[3] Fan Chen et al. “Process variation aware data management for magnetic
skyrmions racetrack memory”. In: 2018 23rd Asia and South Pacific Design Automation Conference (ASP-DAC). IEEE. 2018, pp. 221–226.
[4] Wang Kang et al. “Complementary skyrmion racetrack memory with voltage
manipulation”. In: IEEE Electron Device Letters 37.7 (2016), pp. 924–927.
[5] Gérard Biau and Erwan Scornet. “A random forest guided tour”. In: Test 25
(2016), pp. 197–227.
[6] Steven J Rigatti. “Random forest”. In: Journal of Insurance Medicine 47.1 (2017),
pp. 31–39.
[7] Yan-Yan Song and Ying Lu. “Decision tree methods: applications for classification and prediction”. In: Shanghai Archives of Psychiatry 27.2 (2015), p. 130.
[8] Yun-Shan Hsieh et al. “Shift-limited sort: Optimizing sorting performance on
skyrmion memory-based systems”. In: IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems 39.11 (2020), pp. 4115–4128.
[9] Tsun-Yu Yang et al. “Permutation-write: Optimizing write performance and energy for skyrmion racetrack memory”. In: 2020 57th ACM/IEEE Design Automation Conference (DAC). IEEE. 2020, pp. 1–6.
[10] Christian Hakert et al. “ROLLED: Racetrack memory optimized linear layout
and efficient decomposition of decision trees”. In: IEEE Transactions on Computers 72.5 (2022), pp. 1488–1502.
[11] Asif Ali Khan et al. “Shiftsreduce: Minimizing shifts in racetrack memory 4.0”.
In: ACM Transactions on Architecture and Code Optimization (TACO) 16.4 (2019),
pp. 1–23.
[12] Xianzhang Chen et al. “Efficient data placement for improving data access performance on domain-wall memory”. In: IEEE Transactions on Very Large Scale
Integration (VLSI) Systems 24.10 (2016), pp. 3094–3104.
[13] Marvin N Wright and Andreas Ziegler. “ranger: A fast implementation of
random forests for high dimensional data in C++ and R”. In: arXiv preprint
arXiv:1508.04409 (2015).
[14] Li Deng. “The MNIST database of handwritten digit images for machine learning research [Best of the Web]”. In: IEEE Signal Processing Magazine 29.6 (2012), pp. 141–142.
[15] Aadarsh Velu. AIDS Virus Infection Prediction. Website. https://www.kaggle.com/datasets/aadarshvelu/aids-virus-infection-prediction. 1996.
[16] Subhadeep Chakraborty. Wine Quality Data (Combined). Website. https://www.kaggle.com/datasets/subhajournal/wine-quality-datacombined. 2023.
[17] Pratham Tripathi. Calculate Concrete Strength. Website. https://www.kaggle.com/datasets/prathamtripathi/regression-with-neural-networking. 2020.
Advisor: Tseng-Yi Chen (陳增益)    Review Date: 2024-7-23
