References
1. Ang, J. C., Mirzal, A., Haron, H., Hamed, H. N. A. (2016). Supervised, unsupervised, and semi-supervised feature selection: a review on gene selection. IEEE/ACM Transactions on Computational Biology and Bioinformatics, Vol. 13, no. 5, pp. 971–989.
2. Cachada, A., Barbosa, J., Leitão, P., Geraldes, C., Deusdado, L., Costa, J., Teixeira, C., Teixeira, J., Moreira, A., Moreira, P., Romero, L. (2018). Maintenance 4.0: Intelligent and Predictive Maintenance System Architecture. IEEE 23rd International Conference on Emerging Technologies and Factory Automation (ETFA), Vol. 1, pp. 139–146.
3. Chandola, V., Banerjee, A., Kumar, V. (2009). Anomaly detection: A survey. ACM Computing Surveys, Vol. 41, no. 3, pp. 15:1–15:58.
4. Cho, K., Van Merriënboer, B., Gulcehre, C., Bahdanau, D., Bougares, F., Schwenk, H., Bengio, Y. (2014). Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation. Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2014).
5. Davis, J., Goadrich, M. (2006). The Relationship Between Precision-Recall and ROC Curves. Proceedings of the 23rd International Conference on Machine Learning (ICML), pp. 233–240.
6. Deng, L., Yu, D. (2014). Deep Learning: Methods and Applications. Foundations and Trends in Signal Processing, Vol. 7, no. 3–4, pp. 197–387.
7. Di Persio, L., Honchar, O. (2017). Recurrent neural networks approach to the financial forecast of google assets. International Journal of Mathematics and Computers in Simulation, Vol. 11, pp. 7–13.
8. Ergen, T., Mirza, A. H., Kozat, S. S. (2017). Unsupervised and Semi-supervised Anomaly Detection with LSTM Neural Networks. Vol. 1, arXiv:1710.09207.
9. Fu, R., Zhang, Z., Li, L. (2016). Using LSTM and GRU Neural Network Methods for Traffic Flow Prediction. IEEE Youth Academic Annual Conference of Chinese Association of Automation (YAC), pp. 324–328.
10. Goodfellow, I., Bengio, Y., Courville, A. (2016). Deep Learning. MIT Press.
11. Graves, A. (2014). Generating sequences with recurrent neural networks. arXiv:1308.0850v5.
12. Guo, Y., Liao, W., Wang, Q., Yu, L., Ji, T., Li, P. (2018). Multidimensional Time Series Anomaly Detection: A GRU-based Gaussian Mixture Variational Autoencoder Approach. Asian Conference on Machine Learning, pp. 97–112.
13. Hochreiter, S., Schmidhuber, J. (1997). Long Short-Term Memory. Neural Computation, Vol. 9, pp. 1735–1780.
14. Jiao, R., Zhang, T., Jiang, Y., He, H. (2018). Short-Term Non-Residential Load Forecasting Based on Multiple Sequences LSTM Recurrent Neural Network. IEEE Access, Vol. 6, pp. 59438–59448.
15. Kingma, D. P., Ba, J. (2014). Adam: A Method for Stochastic Optimization. International Conference on Learning Representations.
16. Lasi, H., Fettke, P., Kemper, H. G., Feld, T., Hoffmann, M. (2014). Industry 4.0. Business & Information Systems Engineering, Vol. 6, pp. 239–242.
17. LeCun, Y., Bottou, L., Bengio, Y., Haffner, P. (1998). Gradient-Based Learning Applied to Document Recognition. Proceedings of the IEEE, Vol. 86, no. 11, pp. 2278–2324.
18. Li, S., Xie, Y., Farajtabar, M., Song, L. (2016). Detecting weak changes in dynamic events over networks. arXiv:1603.08981v2.
19. Malhotra, P., Vig, L., Shroff, G., Agarwal, P. (2015). Long Short Term Memory Networks for Anomaly Detection in Time Series. European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, Vol. 1, pp. 89–94.
20. Nielsen, M. (2015). Neural Networks and Deep Learning. Determination Press.
21. Nwankpa, C. E., Ijomah, W., Gachagan, A., Marshall, S. (2018). Activation Functions: Comparison of Trends in Practice and Research for Deep Learning. arXiv:1811.03378v1.
22. Olah, C. (2015). Understanding LSTM networks.
Available from <http://colah.github.io/posts/2015-08-Understanding-LSTMs/>
23. Pascanu, R., Mikolov, T., Bengio, Y. (2013). On the difficulty of training recurrent neural networks. Proceedings of the International Conference on Machine Learning (ICML), pp. 1310–1318.
24. Gómez, R. (2018). Understanding Categorical Cross-Entropy Loss, Binary Cross-Entropy Loss, Softmax Loss, Logistic Loss, Focal Loss and all those confusing names. Raúl Gómez blog.
Available from <https://gombru.github.io/2018/05/23/cross_entropy_loss/>
25. Stojanovic, L., Dinic, M., Stojanovic, N., Stojadinovic, A. (2016). Big-data-driven anomaly detection in industry (4.0): an approach and a case study. IEEE International Conference on Big Data, pp. 1647–1652.
26. Sutskever, I., Martens, J., Dahl, G., Hinton, G. (2013). On the importance of initialization and momentum in deep learning. Proceedings of the 30th International Conference on Machine Learning (ICML-13), Vol. 28, pp. 1139–1147.
27. Tieleman, T., Hinton, G. (2012). Lecture 6.5 - RMSProp: Divide the gradient by a running average of its recent magnitude. COURSERA: Neural Networks for Machine Learning, Vol. 4, no. 2, pp. 26–31.
28. Zhang, A., Lipton, Z. C., Li, M., Smola, A. J. (2020). Dive into Deep Learning.
Available from <https://d2l.ai/>
29. Zhao, H., Sun, S., Jin, B. (2018). Sequential Fault Diagnosis based on LSTM Neural Network. IEEE Access, Vol. 6, pp. 12929–12939.