Master's/Doctoral Thesis 110426006: Complete Metadata Record

DC field: value [language]

dc.contributor: 工業管理研究所 (Institute of Industrial Management) [zh_TW]
dc.creator: 游景翔 [zh_TW]
dc.creator: Ching-Siang You [en_US]
dc.date.accessioned: 2023-07-10T07:39:07Z
dc.date.available: 2023-07-10T07:39:07Z
dc.date.issued: 2023
dc.identifier.uri: http://ir.lib.ncu.edu.tw:88/thesis/view_etd.asp?URN=110426006
dc.contributor.department: 工業管理研究所 (Institute of Industrial Management) [zh_TW]
dc.description: 國立中央大學 (National Central University) [zh_TW]
dc.description: National Central University [en_US]
dc.description.abstract: Time series analysis and forecasting is an important area of data mining. Time series data consist of large numbers of values collected at uniform time intervals, such as yearly, monthly, weekly, or daily. By analyzing a time series, we can characterize how the data change and forecast future values. In recent years, time series forecasting has been a research focus, prompting a range of studies and developments in machine learning and artificial intelligence. With growing data availability and computing power, many deep-learning-based models have appeared, and the diversity of problems across domains has given rise to many different model designs. Time series trend forecasting remains an important topic whose results can underpin applications in many fields, such as the control and optimization of production planning. The Transformer model is a neural network architecture originally proposed for natural language processing. It uses a mechanism called attention, or self-attention, to detect the mutual influence and dependence among elements of a sequence. In this study, we apply the Transformer model to time series forecasting and investigate whether its parallel computation can overcome the limitations of the long short-term memory (LSTM) model in learning long sequences. In addition, we use different positional encoding mechanisms to supply the position of each observation within the sequence, and we examine how different positional encoding schemes affect Transformer-based time series forecasting. In our experiments, we use five real-world time series datasets to evaluate each model's predictions across different temporal trends. [zh_TW, translated]
dc.description.abstract: Time series analysis and forecasting are essential components of data mining. Time series data are collections of values gathered at regular time intervals, such as yearly, monthly, weekly, or daily. By analyzing time series data, we can model the changes occurring within a dataset and forecast future trends. Time series prediction has been a research hotspot over the past ten years. With the increase in data availability and improvements in computing power, many deep-learning-based models have emerged in recent years, and the diversity of time series problems across domains has produced many different model designs. Time series trend forecasting has always been an important topic, and its results can serve as the basis for applications in various fields, such as the control and optimization of production planning. The Transformer model is a neural network architecture that was originally applied to natural language processing when first proposed. It relies on a mechanism called attention, or self-attention, to detect how data elements in a sequence influence and depend on one another. In this study, we use the Transformer model to predict time series data and explore whether its parallel computation can overcome the sequence-length limitations of the long short-term memory (LSTM) model. In addition, we apply different positional encoding mechanisms to give time series data position information within the sequence, and we discuss how the choice of positional encoding for expressing temporal position affects Transformer-based time series forecasting. In Chapter 4, we use five kinds of real-world time series data to examine each model's ability to predict different time trends. [en_US]
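For context, the positional encoding variants the abstract alludes to build on the mechanism introduced with the original Transformer (Vaswani et al., 2017), whose fixed sinusoidal form is the standard baseline. The abstract does not specify which variants the thesis evaluates, so the following is only the reference formulation, not the thesis's own design:

  PE_{(pos,\,2i)} = \sin\!\left(\frac{pos}{10000^{2i/d_{\mathrm{model}}}}\right), \qquad
  PE_{(pos,\,2i+1)} = \cos\!\left(\frac{pos}{10000^{2i/d_{\mathrm{model}}}}\right)

Here pos is the time step's position in the input sequence, i indexes the embedding dimension, and d_model is the embedding size. Because each position is encoded by a deterministic function rather than by stepwise recurrence, all time steps can be processed in parallel; this is the parallelism the abstract contrasts with the LSTM's sequential computation.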
dc.subject: 資料探勘 (data mining) [zh_TW]
dc.subject: 深度學習 (deep learning) [zh_TW]
dc.subject: 時間序列 (time series) [zh_TW]
dc.subject: 變換器模型 (Transformer model) [zh_TW]
dc.subject: Data mining [en_US]
dc.subject: Deep learning [en_US]
dc.subject: Time series [en_US]
dc.subject: Transformer model [en_US]
dc.title: 時間序列預測中變換器架構之位置編碼設計 (positional encoding design for the Transformer architecture in time series forecasting) [zh_TW]
dc.language.iso: zh-TW
dc.title: Design of Transformer Architecture Based on Different Position Encoding in Time Series Forecasting [en_US]
dc.type: 博碩士論文 (master's/doctoral thesis) [zh_TW]
dc.type: thesis [en_US]
dc.publisher: National Central University [en_US]
