References
[1] Z. Wu, S. Pan, G. Long, J. Jiang, and C. Zhang, “Graph WaveNet for deep spatial-temporal graph modeling,” arXiv preprint arXiv:1906.00121, 2019.
[2] B. Yu, H. Yin, and Z. Zhu, “Spatio-temporal graph convolutional networks: A deep
learning framework for traffic forecasting,” arXiv preprint arXiv:1709.04875, 2017.
[3] Office of Transportation, Environment and Resources, Executive Yuan, “Achievements in applying the air pollution sensing IoT to environmental inspection.” https://www.ey.gov.tw/Page/448DE008087A1971/4d1b964c-9294-4505-
8814-6d14d57ae05d.
[4] Environmental Protection Administration, Executive Yuan, “Air quality sensor architecture diagram.” https://img.ltn.com.tw/Upload/
news/600/2017/07/03/118.jpg.
[5] S. Glantz and B. Slinker, Primer of Applied Regression & Analysis of Variance, ed.
McGraw-Hill, Inc., New York, 2001.
[6] L.-J. Chen, Y.-H. Ho, H.-H. Hsieh, S.-T. Huang, H.-C. Lee, and S. Mahajan, “ADF:
An anomaly detection framework for large-scale PM2.5 sensing systems,” IEEE
Internet of Things Journal, vol. 5, no. 2, pp. 559–570, 2017.
[7] Z. Qi, T. Wang, G. Song, W. Hu, X. Li, and Z. Zhang, “Deep air learning: Interpolation, prediction, and feature analysis of fine-grained air quality,” IEEE Transactions
on Knowledge and Data Engineering, vol. 30, no. 12, pp. 2285–2297, 2018.
[8] S. Hochreiter and J. Schmidhuber, “Long short-term memory,” Neural Computation,
vol. 9, no. 8, pp. 1735–1780, 1997.
[9] Y.-S. Chang, H.-T. Chiao, S. Abimannan, Y.-P. Huang, Y.-T. Tsai, and K.-M. Lin,
“An LSTM-based aggregated model for air pollution forecasting,” Atmospheric Pollution Research, vol. 11, no. 8, pp. 1451–1463, 2020.
[10] Y. Li, R. Yu, C. Shahabi, and Y. Liu, “Diffusion convolutional recurrent neural
network: Data-driven traffic forecasting,” arXiv preprint arXiv:1707.01926, 2017.
[11] J. Chung, C. Gulcehre, K. Cho, and Y. Bengio, “Empirical evaluation of gated
recurrent neural networks on sequence modeling,” arXiv preprint arXiv:1412.3555,
2014.
[12] A. van den Oord, S. Dieleman, H. Zen, K. Simonyan, O. Vinyals, A. Graves, N. Kalchbrenner, A. Senior, and K. Kavukcuoglu, “WaveNet: A generative model for raw
audio,” arXiv preprint arXiv:1609.03499, 2016.
[13] F. Yu and V. Koltun, “Multi-scale context aggregation by dilated convolutions,”
arXiv preprint arXiv:1511.07122, 2015.
[14] M. Deza and E. Deza, Encyclopedia of Distances. Springer, 2014.
[15] D. I. Shuman, S. K. Narang, P. Frossard, A. Ortega, and P. Vandergheynst, “The
emerging field of signal processing on graphs: Extending high-dimensional data analysis to networks and other irregular domains,” IEEE Signal Processing Magazine,
vol. 30, no. 3, pp. 83–98, 2013.
[16] W. N. Anderson Jr. and T. D. Morley, “Eigenvalues of the Laplacian of a graph,”
Linear and Multilinear Algebra, vol. 18, no. 2, pp. 141–145, 1985.
[17] M. Defferrard, X. Bresson, and P. Vandergheynst, “Convolutional neural networks
on graphs with fast localized spectral filtering,” arXiv preprint arXiv:1606.09375,
2016.
[18] A. Micheli, “Neural network for graphs: A contextual constructive approach,” IEEE
Transactions on Neural Networks, vol. 20, no. 3, pp. 498–511, 2009.
[19] J. Atwood and D. Towsley, “Diffusion-convolutional neural networks,” in Advances
in neural information processing systems, pp. 1993–2001, 2016.
[20] Y. N. Dauphin, A. Fan, M. Auli, and D. Grangier, “Language modeling with gated
convolutional networks,” in International Conference on Machine Learning, pp. 933–
941, PMLR, 2017.
[21] Environmental Protection Administration, Executive Yuan, “Understanding the air pollution sensing IoT.” https://airtw.epa.gov.tw/CHT/
Encyclopedia/AirSensor/AirSensor_2.aspx.
[22] J. A. Hanley and B. J. McNeil, “A method of comparing the areas under receiver
operating characteristic curves derived from the same cases,” Radiology, vol. 148,
no. 3, pp. 839–843, 1983.
[23] K. H. Zou, A. J. O’Malley, and L. Mauri, “Receiver-operating characteristic analysis
for evaluating diagnostic tests and predictive models,” Circulation, vol. 115, no. 5,
pp. 654–657, 2007.
[24] T. K. Ho, “The random subspace method for constructing decision forests,” IEEE
Transactions on Pattern Analysis and Machine Intelligence, vol. 20, no. 8, pp. 832–844,
1998.
[25] T. K. Ho, “A data complexity analysis of comparative advantages of decision forest
constructors,” Pattern Analysis & Applications, vol. 5, pp. 102–112, 2002.
[26] R. Tibshirani, “Regression shrinkage and selection via the lasso,” Journal of the
Royal Statistical Society: Series B (Methodological), vol. 58, no. 1, pp. 267–288,
1996.
[27] L. Breiman, “Better subset regression using the nonnegative garrote,” Technometrics,
vol. 37, pp. 373–384, 1995.
[28] D. E. Hilt and D. W. Seegrist, Ridge, a computer program for calculating ridge regression estimates, vol. 236. Department of Agriculture, Forest Service, Northeastern
Forest Experiment …, 1977.
[29] M. H. Gruber, Improving efficiency by shrinkage: the James-Stein and ridge regression estimators. Routledge, 2017.
[30] A. E. Hoerl and R. W. Kennard, “Ridge regression: Biased estimation for nonorthogonal problems,” Technometrics, vol. 12, no. 1, pp. 55–67, 1970.
[31] Y. Chauvin and D. E. Rumelhart, Backpropagation: theory, architectures, and applications. Psychology press, 1995.
[32] D. E. Rumelhart, G. E. Hinton, and R. J. Williams, “Learning representations by
back-propagating errors,” Nature, vol. 323, no. 6088, pp. 533–536, 1986.
[33] S. Bai, J. Z. Kolter, and V. Koltun, “An empirical evaluation of generic convolutional
and recurrent networks for sequence modeling,” arXiv preprint arXiv:1803.01271, 2018.