Due to the rapid growth of computing data, DRAM-based main memory cannot accommodate all of the to-be-processed data from data-intensive applications (e.g., machine learning algorithms and recommendation systems). Therefore, data movement between main memory and the underlying storage device results in a significant performance issue. When a traditional NAND-based solid-state drive (SSD) is used in a computer architecture, this issue cannot be tackled because the storage drive cannot distinguish the types of data coming from the host system. However, a new type of storage medium, namely the open-channel SSD (OCSSD), has been proposed to provide a path for the host-side system to optimize data placement in the storage space. In this study, we develop a new data access model for a well-known data-intensive application, the deep learning recommendation model (DLRM), on an OCSSD storage drive. Our solution, called OC-DLRM, minimizes the I/O traffic to the flash memory storage device by taking the I/O unit of a flash memory drive into account and placing frequently accessed data together. According to our experimental results, OC-DLRM significantly reduces the amount of I/O traffic between memory and storage devices compared with the traditional virtual memory management solution.
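The core placement idea, packing the most frequently accessed data into the same flash I/O unit so that a single page read serves many hot accesses, can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual algorithm; the function name `place_rows`, the row identifiers, and the access counts are all assumptions.

```python
def place_rows(access_counts, rows_per_page):
    """Group data rows into flash-page-sized I/O units.

    access_counts: dict mapping row id -> observed access frequency
    rows_per_page: how many rows fit in one flash page (I/O unit)

    Returns a list of pages, each a list of row ids, with the hottest
    rows packed together so they share the same I/O unit.
    """
    # Sort row ids from most to least frequently accessed.
    hot_first = sorted(access_counts, key=access_counts.get, reverse=True)
    # Slice the sorted list into page-sized groups.
    return [hot_first[i:i + rows_per_page]
            for i in range(0, len(hot_first), rows_per_page)]

# Hypothetical access profile: rows 1, 3, and 5 are hot.
counts = {0: 5, 1: 90, 2: 7, 3: 80, 4: 1, 5: 60}
print(place_rows(counts, 2))  # → [[1, 3], [5, 2], [0, 4]]
```

With this layout, the two hottest rows (1 and 3) land on the same page, so repeated accesses to either of them hit the same I/O unit instead of spreading traffic across several flash pages, which is the effect the abstract attributes to OC-DLRM.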