References
[1]. Cho, K., Van Merriënboer, B., Gulcehre, C., Bahdanau, D., Bougares, F., Schwenk, H., & Bengio, Y. (2014). Learning phrase representations using RNN encoder-decoder for statistical machine translation. arXiv preprint arXiv:1406.1078.
[2]. Le, X. H., Ho, H. V., Lee, G., & Jung, S. (2019). Application of long short-term memory (LSTM) neural network for flood forecasting. Water, 11(7), 1387.
[3]. Mitchell, T. M. (1997). Machine learning. New York: McGraw-Hill. ISBN 0-07-042807-7. OCLC 36417892.
[4]. Pearson, K. (1901). On lines and planes of closest fit to systems of points in space. Philosophical Magazine, Series 6, 2(11), 559-572.
[5]. Sutton, R. S., & Barto, A. G. (2018). Reinforcement learning: An introduction. MIT Press.
[6]. Xiang, Z., Yan, J., & Demir, I. (2020). A rainfall-runoff model with LSTM-based sequence-to-sequence learning. Water Resources Research, 56(1), e2019WR025326.
[7]. Shigang Dam (石岡壩). (2019). About Shigang Dam. Retrieved February 21, 2022, from https://www.wracb.gov.tw/47808/47809/47810/49514/
[8]. 張炎銘, 林廷芳, & 高穆賓 (Eds.). (2012). 閱讀水庫行腳台灣:探訪隱身山林的灰色建築 [Reading reservoirs, walking Taiwan: Visiting the gray structures hidden in the mountains]. Taipei: 三聯科技教育基金會 (Sanlien Technology Education Foundation). ISBN 978-986-84878-5-7.
[9]. McCulloch, W. S., & Pitts, W. (1943). A logical calculus of the ideas immanent in nervous activity. The Bulletin of Mathematical Biophysics, 5(4), 115-133.
[10]. Wang, S. C. (2003). Artificial neural network. In Interdisciplinary computing in java programming (pp. 81-100). Springer, Boston, MA.
[11]. Kalchbrenner, N., & Blunsom, P. (2013, October). Recurrent continuous translation models. In Proceedings of the 2013 conference on empirical methods in natural language processing (pp. 1700-1709).
[12]. Mikolov, T. (2012). Statistical language models based on neural networks. Presentation at Google, Mountain View, April 2, 2012.
[13]. Graves, A., Mohamed, A. R., & Hinton, G. (2013, May). Speech recognition with deep recurrent neural networks. In 2013 IEEE International Conference on Acoustics, Speech and Signal Processing (pp. 6645-6649). IEEE.
[14]. Dahl, G. E., Yu, D., Deng, L., & Acero, A. (2011). Context-dependent pre-trained deep neural networks for large-vocabulary speech recognition. IEEE Transactions on Audio, Speech, and Language Processing, 20(1), 30-42.
[15]. Krizhevsky, A., Sutskever, I., & Hinton, G. E. (2012). Imagenet classification with deep convolutional neural networks. Advances in Neural Information Processing Systems, 25.
[16]. Olah, C. (2015). Understanding LSTM Networks. Available online: http://colah.github.io/posts/2015-08-Understanding-LSTMs/ (accessed on 23 February 2022).
[17]. Yu, Y., Si, X., Hu, C., & Zhang, J. (2019). A review of recurrent neural networks: LSTM cells and network architectures. Neural Computation, 31(7), 1235-1270.
[18]. Wu, X., Kumar, V., Ross Quinlan, J., Ghosh, J., Yang, Q., Motoda, H., ... & Steinberg, D. (2008). Top 10 algorithms in data mining. Knowledge and Information Systems, 14(1), 1-37.
[19]. Lin, W. C., & Tsai, C. F. (2020). Missing value imputation: a review and analysis of the literature (2006–2017). Artificial Intelligence Review, 53(2), 1487-1509.
[20]. Zhang, S., Li, X., Zong, M., Zhu, X., & Cheng, D. (2017). Learning k for knn classification. ACM Transactions on Intelligent Systems and Technology (TIST), 8(3), 1-19.
[21]. Zhang, C., Zhu, X., Zhang, J., Qin, Y., & Zhang, S. (2007, May). GBKII: An imputation method for missing values. In Pacific-Asia Conference on Knowledge Discovery and Data Mining (pp. 1080-1087). Springer, Berlin, Heidelberg.
[22]. Cho, K., Van Merriënboer, B., Gulcehre, C., Bahdanau, D., Bougares, F., Schwenk, H., & Bengio, Y. (2014). Learning phrase representations using RNN encoder-decoder for statistical machine translation. arXiv preprint arXiv:1406.1078.
[23]. Fu, R., Zhang, Z., & Li, L. (2016, November). Using LSTM and GRU neural network methods for traffic flow prediction. In 2016 31st Youth Academic Annual Conference of Chinese Association of Automation (YAC) (pp. 324-328). IEEE.
[24]. Breiman, L. (2001). Random forests. Machine Learning, 45(1), 5-32.
[25]. Cortes, C., & Vapnik, V. (1995). Support-vector networks. Machine Learning, 20(3), 273-297.
[26]. Singh, V. P. (1988). Hydrologic systems: Rainfall-runoff modeling (Vol. 1). Englewood Cliffs, NJ: Prentice Hall.
[27]. Shen, J. (1965). Use of analog models in the analysis of flood runoff (No. 506). US Government Printing Office.
[28]. Hu, C., Wu, Q., Li, H., Jian, S., Li, N., & Lou, Z. (2018). Deep learning with a long short-term memory networks approach for rainfall-runoff simulation. Water, 10(11), 1543.
[29]. Mekanik, F., Imteaz, M. A., Gato-Trinidad, S., & Elmahdi, A. (2013). Multiple regression and Artificial Neural Network for long-term rainfall forecasting using large scale climate modes. Journal of Hydrology, 503, 11-21.
[30]. Assem, H., Ghariba, S., Makrai, G., Johnston, P., Gill, L., & Pilla, F. (2017, September). Urban water flow and water level prediction based on deep learning. In Joint European conference on machine learning and knowledge discovery in databases (pp. 317-329). Springer, Cham.
[31]. Pham, B. T., Le, L. M., Le, T. T., Bui, K. T. T., Le, V. M., Ly, H. B., & Prakash, I. (2020). Development of advanced artificial intelligence models for daily rainfall prediction. Atmospheric Research, 237, 104845.
[32]. Liu, M., Huang, Y., Li, Z., Tong, B., Liu, Z., Sun, M., ... & Zhang, H. (2020). The applicability of LSTM-KNN model for real-time flood forecasting in different climate zones in China. Water, 12(2), 440.
[33]. Asmel, N. K., Al-Nima, R. R., Mohammed, F. I., Al Saadi, A. M., & Ganiyu, A. A. (2021). Forecasting effluent turbidity and pH in jar test using radial basis neural network. In Towards a Sustainable Water Future: Proceedings of Oman's International Conference on Water Engineering and Management of Water Resources (pp. 361-370). ICE Publishing.
[34]. Song, C., & Zhang, H. (2020). Study on turbidity prediction method of reservoirs based on long short term memory neural network. Ecological Modelling, 432, 109210.
[35]. Wang, Y., Chen, J., Cai, H., Yu, Q., & Zhou, Z. (2021). Predicting water turbidity in a macro-tidal coastal bay using machine learning approaches. Estuarine, Coastal and Shelf Science, 252, 107276.
[36]. Stevenson, M., & Bravo, C. (2019). Advanced turbidity prediction for operational water supply planning. Decision Support Systems, 119, 72-84.
[37]. Mosavi, A., Ozturk, P., & Chau, K. W. (2018). Flood prediction using machine learning models: Literature review. Water, 10(11), 1536.
[38]. Bergstra, J., & Bengio, Y. (2012). Random search for hyper-parameter optimization. Journal of Machine Learning Research, 13, 281-305.
[39]. Biau, G., & Scornet, E. (2016). A random forest guided tour. Test, 25(2), 197-227.
[40]. Frazier, P. I. (2018). A tutorial on Bayesian optimization. arXiv preprint arXiv:1807.02811.
[41]. Močkus, J. (1975). On Bayesian methods for seeking the extremum. In Optimization techniques IFIP technical conference (pp. 400-404). Springer, Berlin, Heidelberg.
[42]. Zhang, S., Li, X., Zong, M., Zhu, X., & Cheng, D. (2017). Learning k for knn classification. ACM Transactions on Intelligent Systems and Technology (TIST), 8(3), 1-19.
[43]. Lahiri, S. K., & Ghanta, K. C. (2008). The support vector regression with the parameter tuning assisted by a differential evolution technique: Study of the critical velocity of a slurry flow in a pipeline. Chemical Industry and Chemical Engineering Quarterly, 14(3), 191-203.
[44]. Joy, T. T., Rana, S., Gupta, S., & Venkatesh, S. (2019). A flexible transfer learning framework for Bayesian optimization with convergence guarantee. Expert Systems with Applications, 115, 656-672.
[45]. Wikimedia Commons user DenisBoigelot. (2011). Examples of correlations. In the public domain. Retrieved May 25, 2022, from https://commons.wikimedia.org/wiki/File:Correlation_examples2.svg
[46]. Someka. (2021, November 10). How to Normalize Data in Excel? Retrieved May 25, 2022, from https://www.someka.net/blog/how-to-normalize-data-in-excel/
[47]. McCulloch, W. S., & Pitts, W. (1943). A logical calculus of the ideas immanent in nervous activity. The Bulletin of Mathematical Biophysics, 5(4), 115-133.
[48]. Vapnik, V. N. (1995). The nature of statistical learning theory. New York: Springer.
[49]. Huang, Q., Mao, J., & Liu, Y. (2012, November). An improved grid search algorithm of SVR parameters optimization. In 2012 IEEE 14th International Conference on Communication Technology (pp. 1022-1026). IEEE.
[50]. Zhao, R. J. (1992). The Xinanjiang model applied in China. Journal of Hydrology, 135(1-4), 371-381.
[51]. Yao, C., Li, Z. J., Bao, H. J., & Yu, Z. B. (2009). Application of a developed Grid-Xinanjiang model to Chinese watersheds for flood forecasting purpose. Journal of Hydrologic Engineering, 14(9), 923-934.
[52]. Cho, K., Van Merriënboer, B., Gulcehre, C., Bahdanau, D., Bougares, F., Schwenk, H., & Bengio, Y. (2014). Learning phrase representations using RNN encoder-decoder for statistical machine translation. arXiv preprint arXiv:1406.1078.
[53]. Kennedy, J., & Eberhart, R. (1995, November). Particle swarm optimization. In Proceedings of ICNN'95 - International Conference on Neural Networks (Vol. 4, pp. 1942-1948). IEEE.
[54]. Hansen, L. D., Stokholm-Bjerregaard, M., & Durdevic, P. (2022). Modeling phosphorous dynamics in a wastewater treatment process using Bayesian optimized LSTM. Computers & Chemical Engineering, 160, 107738.
[55]. Hinton, G. E., Srivastava, N., Krizhevsky, A., Sutskever, I., & Salakhutdinov, R. R. (2012). Improving neural networks by preventing co-adaptation of feature detectors. arXiv preprint arXiv:1207.0580.
[56]. Srivastava, N., Hinton, G., Krizhevsky, A., Sutskever, I., & Salakhutdinov, R. (2014). Dropout: A simple way to prevent neural networks from overfitting. The Journal of Machine Learning Research, 15(1), 1929-1958.
[57]. Kao, I. F., Zhou, Y., Chang, L. C., & Chang, F. J. (2020). Exploring a Long Short-Term Memory based Encoder-Decoder framework for multi-step-ahead flood forecasting. Journal of Hydrology, 583, 124631.