References
[1] https://zh.wikipedia.org/wiki/语义角色标注
[2] https://zh.wikipedia.org/wiki/謂語
[3] https://zhuanlan.zhihu.com/p/48508221
[4] https://blog.csdn.net/mingzai624/article/details/78061506
[5] http://treebank.sinica.edu.tw/
[6] http://ltp.ai/
[7] https://github.com/fxsjy/jieba
[8] https://propbank.github.io/
[9] Bahdanau, D., Cho, K., Bengio, Y.: Neural machine translation by jointly learning to align and translate. The Third International Conference on Learning Representations (2015)
[10] Chang, C.H., Chang, C.H.: Multi-Stack Convolution with Gating Mechanism for Chinese Named Entity Recognition (2018)
[11] Chou, C.L., Chang, C.H.: Named entity extraction via automatic labeling and tri-training: comparison of selection methods. In: Information Retrieval Technology, AIRS 2014. Lecture Notes in Computer Science, vol. 8870. Springer, Cham (2014)
[12] Chung, J., Gulcehre, C., Cho, K., Bengio, Y.: Empirical evaluation of gated recurrent neural networks on sequence modeling. arXiv preprint arXiv:1412.3555 (2014)
[13] Collobert, R., Weston, J., Bottou, L., Karlen, M., Kavukcuoglu, K., Kuksa, P.: Natural Language Processing (Almost) from Scratch. Journal of Machine Learning Research 12, pp. 2493-2537 (2011)
[14] CRF++: Yet Another CRF toolkit: http://crfpp.sourceforge.net/
[15] Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. arXiv preprint arXiv:1810.04805 (2018)
[16] Dauphin, Y.N., Fan, A., Auli, M., Grangier, D.: Language modeling with gated convolutional networks. arXiv preprint arXiv:1612.08083 (2016)
[17] Gal, Y., Ghahramani, Z.: A theoretically grounded application of dropout in recurrent neural networks. Thirtieth Conference on Neural Information Processing Systems (2016)
[18] He, K., Zhang, X., Ren, S., Sun, J.: Deep Residual Learning for Image Recognition. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (2016)
[19] He, L., Lee, K., Levy, O., Zettlemoyer, L.: Jointly predicting predicates and arguments in neural semantic role labeling. In The 56th Annual Meeting of the Association for Computational Linguistics (2018)
[20] Huang, Z., Xu, W., Yu, K.: Bidirectional LSTM-CRF Models for Sequence Tagging. arXiv preprint arXiv:1508.01991 (2015)
[21] He, L., Lee, K., Lewis, M., Zettlemoyer, L.: Deep semantic role labeling: What works and what's next. In The 55th Annual Meeting of the Association for Computational Linguistics (2017)
[22] Hochreiter, S., Schmidhuber, J.: Long Short-Term Memory. Neural Computation 9(8), pp. 1735-1780 (1997)
[23] Kim, S.M., Hovy, E.: Extracting opinions, opinion holders, and topics expressed in online news media text. Proceedings of the Workshop on Sentiment and Subjectivity in Text (2006)
[24] Lafferty, J., McCallum, A., Pereira, F.C.N.: Conditional random fields: Probabilistic models for segmenting and labeling sequence data. Proceedings of the 18th International Conference on Machine Learning (2001)
[25] Lee, K., He, L., Lewis, M., Zettlemoyer, L.: End-to-end neural coreference resolution. In The 2017 Conference on Empirical Methods in Natural Language Processing (2017)
[26] Ma, X., Hovy, E.: End-to-End Sequence Labeling via Bi-directional LSTM-CNNs-CRF. arXiv preprint arXiv:1603.01354 (2016)
[27] McCallum, A., Freitag, D., Pereira, F.: Maximum Entropy Markov Models for Information Extraction and Segmentation. The Seventeenth International Conference on Machine Learning (2000)
[28] Mikolov, T., Chen, K., Corrado, G., Dean, J.: Efficient estimation of word representations in vector space. arXiv preprint arXiv:1301.3781 (2013)
[29] Mnih, V., Heess, N., Graves, A., Kavukcuoglu, K.: Recurrent Models of Visual Attention. Advances in Neural Information Processing Systems 27 (2014)
[30] Punyakanok, V., Roth, D., Yih, W.T.: The importance of syntactic parsing and inference in semantic role labeling. Computational Linguistics (2008)
[31] Rabiner, L.R.: A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition. Proceedings of the IEEE, vol. 77, no. 2 (1989)
[32] Srivastava, R.K., Greff, K., Schmidhuber, J.: Highway Networks. International Conference on Machine Learning, Deep Learning Workshop (2015)
[33] Srivastava, R.K., Greff, K., Schmidhuber, J.: Training very deep networks. In Advances in Neural Information Processing Systems (2015)
[34] Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A.N., Kaiser, Ł., Polosukhin, I.: Attention Is All You Need. arXiv preprint arXiv:1706.03762 (2017)
[35] Wang, J.H., Ye, T.W.: Microblog sentiment analysis based on opinion target modifying (2013)
[36] Yao, H., Li, M., Cheng, J.: Extraction of Chinese "Opinion target - Opinion word" Pairs Based on Part-of-speech Rules and Semantic Dependency Parsing. Proceedings of the 2nd International Conference on Business and Information Management, pp. 11-14 (2018)
[37] Zhang, Y., Chen, G., Yu, D., Yao, K., Khudanpur, S., Glass, J.: Highway long short-term memory RNNs for distant speech recognition. 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (2016)
[38] Zhou, J., Xu, W.: End-to-end learning of semantic role labeling using recurrent neural networks. In The 53rd Annual Meeting of the Association for Computational Linguistics (2015)
[39] Ku, L.W. (古倫維), Chen, H.H. (陳信希): Overview, Techniques, and Applications of Chinese Opinion Analysis (in Chinese). Newsletter of the Association for Computational Linguistics and Chinese Language Processing 20(5) (2009)