Abstract (English)
Due to the problems caused by global warming, many countries have introduced carbon emission restrictions, giving rise to carbon trading markets and a growing demand for accurate carbon price forecasts. Although many time series forecasting methods exist to improve prediction accuracy, carbon prices remain difficult to predict because they depend on numerous factors: political and economic conditions, technological advancements, evolving climate policies and their effectiveness in reducing carbon emissions, fluctuating fossil fuel costs, and the availability of renewable energy alternatives. Furthermore, little historical data is available for accurate predictions, as carbon pricing mechanisms such as cap-and-trade systems and carbon taxes are relatively new policies in many countries. Our approach enhances a Transformer-based time series model by incorporating news information organized around several influential factors, including government policies, market supply and demand, economic conditions, energy prices, climate change events, and investor sentiment. To achieve this, we employ ChatGPT to transform news data into multiple strength indicators, each capturing a different perspective on upward price pressure, which provides the model with more informative news inputs. Furthermore, we propose NTEformer, which integrates text data into the time series model through a News-Trend Extractor, a novel alternative to the widely used early-concatenation method that is designed to better leverage news information. The News-Trend Extractor augments the decoder of the model and is specifically designed to learn from news data. Through a series of experiments, we compare NTEformer with other methods that combine time series data and text, demonstrating its superior performance.
By effectively integrating news information through the News-Trend Extractor, the NTEformer model offers enhanced capabilities in predicting carbon trading prices, reducing the overall error by a remarkable 28% compared with the same model without news information. The comprehensive analysis and experiments validate the effectiveness of our approach, showing that it outperforms other fusion strategies.
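To make the fusion idea concrete, the following is a minimal, hypothetical sketch (not the thesis implementation): a Transformer decoder layer whose output is fused with ChatGPT-derived per-factor news strength indicators through a cross-attention module standing in for the News-Trend Extractor, instead of concatenating the news features to the input (early concatenation). All names, dimensions, and design details here (NewsTrendExtractor, NTEDecoderLayer, six factors, residual fusion) are illustrative assumptions.

# Hedged sketch in PyTorch: late fusion of news strength indicators
# inside the decoder, as an alternative to early concatenation.
import torch
import torch.nn as nn

class NewsTrendExtractor(nn.Module):
    # Hypothetical module: the decoder state cross-attends to news
    # strength indicators (one channel per factor, e.g. policy,
    # supply/demand, energy prices, investor sentiment).
    def __init__(self, d_model: int, n_factors: int, n_heads: int = 4):
        super().__init__()
        self.embed_news = nn.Linear(n_factors, d_model)   # project indicators
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, dec_state, news_indicators):
        # dec_state:       (batch, pred_len, d_model)
        # news_indicators: (batch, news_len, n_factors)
        news = self.embed_news(news_indicators)
        attn_out, _ = self.cross_attn(dec_state, news, news)
        return self.norm(dec_state + attn_out)            # residual fusion

class NTEDecoderLayer(nn.Module):
    # A standard decoder layer followed by the news-trend fusion step.
    def __init__(self, d_model: int, n_factors: int, n_heads: int = 4):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ff = nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                                nn.Linear(4 * d_model, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.news_extractor = NewsTrendExtractor(d_model, n_factors, n_heads)

    def forward(self, x, news_indicators):
        x = self.norm1(x + self.self_attn(x, x, x)[0])
        x = self.norm2(x + self.ff(x))
        return self.news_extractor(x, news_indicators)

# Toy usage: batch of 8, 32 prediction steps, 6 news factors, model width 64.
layer = NTEDecoderLayer(d_model=64, n_factors=6)
out = layer(torch.randn(8, 32, 64), torch.randn(8, 32, 6))
print(out.shape)  # torch.Size([8, 32, 64])

Under these assumptions, the news signal enters only at the decoder via attention, so the time series branch and the text branch keep separate representations until the fusion step, in contrast to early concatenation where both are mixed at the input embedding.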