Master's/Doctoral Thesis 108423061: Complete Metadata Record

DC field | Value | Language
dc.contributor | 資訊管理學系 (Department of Information Management) | zh_TW
dc.creator | 柯伯叡 (Bo-Ruei Ke) | zh_TW
dc.creator | Bo-Ruei Ke | en_US
dc.date.accessioned | 2021-08-11T07:39:07Z
dc.date.available | 2021-08-11T07:39:07Z
dc.date.issued | 2021
dc.identifier.uri | http://ir.lib.ncu.edu.tw:88/thesis/view_etd.asp?URN=108423061
dc.contributor.department | 資訊管理學系 (Department of Information Management) | zh_TW
dc.description | 國立中央大學 (National Central University) | zh_TW
dc.description | National Central University | en_US
dc.description.abstract | 利用消息面預測未來股價趨勢的過往研究中,許多學者在自然語言的處理上多採用靜態表示方式的詞嵌入方法。為了瞭解動態表示方式的詞嵌入方法是否適用基於消息面訊息的股價預測任務上,本研究蒐集了兩間報社(Barron、Reuters)的資料並以蘋果公司(AAPL)及微軟公司(MSFT)為預測標的,搭配兩種動態表示方式的詞嵌入(Sentence-BERT、BERT)與三種靜態表示方式的詞嵌入方法(paragraph Vector、Word2Vec、TF-IDF),探討不同詞嵌入方法對於結果的影響。此外,有鑑於消息面中每個新聞事件對股價的影響力均不一致,本研究提出一個基於注意力機制與層標準化的長短期記憶模型(Attention mechanism and Layer normalization-based LSTM, AL_LSTM),將注意力集中在股票漲跌貢獻較大的新聞事件上,藉此幫助模型掌握關鍵訊息。本研究發現在整體平均下,詞嵌入方法Sentence-BERT表示消息面時在準確度上有正面的影響,並且最高準確度達69.07%。而本研究提出的AL_LSTM相較於深度學習模型LSTM和機器學習模型SVM,平均在準確度上能分別提升4.27%及6.32%,能有效預測未來股價趨勢的變化。 | zh_TW
dc.description.abstract | In previous research using news to predict future stock price trends, many scholars adopted static word embedding methods for natural language processing. To examine whether dynamic word embedding methods are suitable for news-based stock price prediction, we collected data from two news sources (Barron, Reuters) and used Apple (AAPL) and Microsoft (MSFT) as forecast targets, comparing two dynamic word embedding methods (Sentence-BERT, BERT) with three static word embedding methods (Paragraph Vector, Word2Vec, TF-IDF) to explore the impact of different word embedding methods on prediction performance. In addition, because each news event affects the stock price trend differently, this study proposes an Attention mechanism and Layer normalization-based LSTM (AL_LSTM) that focuses attention on the news events contributing most to the stock price movement, helping the model capture key information. This study found that, on overall average, representing news with Sentence-BERT has a positive effect on accuracy, with the highest accuracy reaching 69.07%. The proposed AL_LSTM improves average accuracy by 4.27% and 6.32% over the deep learning model LSTM and the machine learning model SVM, respectively, and can effectively predict future stock price trends. (A minimal code sketch of the news-embedding and AL_LSTM steps appears after this record.) | en_US
dc.subject | Sentence-BERT | zh_TW
dc.subject | 動態詞嵌入 (dynamic word embedding) | zh_TW
dc.subject | 注意力機制 (attention mechanism) | zh_TW
dc.subject | 層標準化 (layer normalization) | zh_TW
dc.subject | 股價預測 (stock price prediction) | zh_TW
dc.subject | Sentence-BERT | en_US
dc.subject | dynamic word embedding | en_US
dc.subject | attention mechanism | en_US
dc.subject | layer normalization | en_US
dc.subject | stock price prediction | en_US
dc.subject | attention | en_US
dc.title | 結合注意力機制與層標準化的神經網路於股價預測之研究 (A study of neural networks combining the attention mechanism and layer normalization for stock price prediction) | zh_TW
dc.language.iso | zh-TW | zh-TW
dc.title | Combining attention mechanism with layer normalized neural network in stock price forecasting: a case study of electronics industry | en_US
dc.type | 博碩士論文 (master's/doctoral thesis) | zh_TW
dc.type | thesis | en_US
dc.publisher | National Central University | en_US
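The abstract describes two technical steps: representing each news item as a fixed-size vector (e.g., with Sentence-BERT) and classifying sequences of those vectors with an attention- and layer-normalization-based LSTM (AL_LSTM). The sketches below are not the thesis code; they are minimal illustrations of those two steps, assuming the sentence-transformers and PyTorch libraries. The model name, dimensions, hyperparameters, and class names are illustrative assumptions.

```python
# Minimal sketch: encode news headlines as fixed-size vectors with Sentence-BERT.
# Assumes the sentence-transformers package; "all-MiniLM-L6-v2" is an illustrative
# model choice, not necessarily the one used in the thesis.
from sentence_transformers import SentenceTransformer

encoder = SentenceTransformer("all-MiniLM-L6-v2")
headlines = [
    "Apple posts record quarterly revenue",
    "Microsoft cloud growth beats expectations",
]
embeddings = encoder.encode(headlines)  # ndarray of shape (2, 384) for this model
print(embeddings.shape)
```

```python
# Minimal sketch of an attention + layer-normalization LSTM classifier in the spirit
# of AL_LSTM: each sample is a sequence of news-event embeddings, attention weights
# the events, and the classifier outputs up/down logits. The exact layer arrangement
# and sizes are assumptions, not the thesis architecture.
import torch
import torch.nn as nn

class ALLSTMSketch(nn.Module):
    def __init__(self, embed_dim=384, hidden_dim=128, num_classes=2):
        super().__init__()
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.norm = nn.LayerNorm(hidden_dim)           # layer normalization of LSTM outputs
        self.score = nn.Linear(hidden_dim, 1)          # attention score per news event
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):                              # x: (batch, num_events, embed_dim)
        h, _ = self.lstm(x)                            # h: (batch, num_events, hidden_dim)
        h = self.norm(h)
        weights = torch.softmax(self.score(h), dim=1)  # attention over the event axis
        context = (weights * h).sum(dim=1)             # weighted sum of hidden states
        return self.classifier(context)                # logits: price up vs. down

model = ALLSTMSketch()
logits = model(torch.randn(4, 10, 384))                # 4 samples, 10 news events each
```

In such a design, the softmax attention weights also indicate which news events the model relied on for a given prediction, which matches the abstract's stated goal of concentrating on the events that contribute most to the price movement.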
