References
[1] Jonathan Baxter. A Bayesian/information theoretic model of learning to learn via multiple task sampling. In Machine Learning, pages 7–39, 1997.
[2] Pengfei Cao, Yubo Chen, Kang Liu, Jun Zhao, and Shengping Liu. Adversarial transfer learning for Chinese named entity recognition with self-attention mechanism. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 182–192, Brussels, Belgium, October–November 2018. Association for Computational Linguistics.
[3] Richard Caruana. Multitask learning: A knowledge-based source of inductive bias. In Proceedings of the Tenth International Conference on Machine Learning, pages 41–48. Morgan Kaufmann, 1993.
[4] Jason P.C. Chiu and Eric Nichols. Named entity recognition with bidirectional LSTM-CNNs. Transactions of the Association for Computational Linguistics, 4:357–370, 2016.
[5] Ronan Collobert, Jason Weston, Léon Bottou, Michael Karlen, Koray Kavukcuoglu, and Pavel P. Kuksa. Natural language processing (almost) from scratch. CoRR, abs/1103.0398, 2011.
[6] Yann N. Dauphin, Angela Fan, Michael Auli, and David Grangier. Language modeling with gated convolutional networks. CoRR, abs/1612.08083, 2016.
[7] Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. BERT: Pre-training of deep bidirectional transformers for language understanding. CoRR, abs/1810.04805, 2018.
[8] Long Duong, Trevor Cohn, Steven Bird, and Paul Cook. Low resource dependency parsing: Cross-lingual parameter sharing in a neural network parser. In Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pages 845–850, Beijing, China, July 2015. Association for Computational Linguistics.
[9] Dzmitry Bahdanau, Kyunghyun Cho, and Yoshua Bengio. Neural machine translation by jointly learning to align and translate. CoRR, 2016.
[10] Ruidan He, Wee Sun Lee, Hwee Tou Ng, and Daniel Dahlmeier. An interactive multi-task learning network for end-to-end aspect-based sentiment analysis. CoRR, abs/1906.06906, 2019.
[11] Sepp Hochreiter and Jürgen Schmidhuber. Long short-term memory. Neural Computation, 9(8):1735–1780, 1997.
[13] Binxuan Huang, Yanglan Ou, and Kathleen M. Carley. Aspect level sentiment classification with attention-over-attention neural networks. CoRR, abs/1804.06536, 2018.
[14] Lun-Wei Ku and Hsin-Hsi Chen. Mining opinions from the web: Beyond relevance retrieval. Journal of the American Society for Information Science and Technology, 58(12):1838–1850, 2007.
[15] Guillaume Lample, Miguel Ballesteros, Sandeep Subramanian, Kazuya Kawakami, and Chris Dyer. Neural architectures for named entity recognition. CoRR, abs/1603.01360, 2016.
[16] Peng-Hsuan Li, Tsu-Jui Fu, and Wei-Yun Ma. Why attention? Analyze BiLSTM deficiency and its remedies in the case of NER, 2019.
[17] Mei-Juen Liu and Hui-Li Xu. (The processing of Chinese verbs: a comparison of the CKIP classification and the Chinese verb dictionary) [in Chinese]. In Proceedings of Rocling VII Computational Linguistics Conference VII, pages 91–110, Hsinchu, Taiwan, August 1994. The Association for Computational Linguistics and Chinese Language Processing (ACLCLP).
[18] Volodymyr Mnih, Nicolas Heess, Alex Graves, and Koray Kavukcuoglu. Recurrent models of visual attention. In Z. Ghahramani, M. Welling, C. Cortes, N. D. Lawrence, and K. Q. Weinberger, editors, Advances in Neural Information Processing Systems 27, pages 2204–2212. Curran Associates, Inc., 2014.
[19] Jingbo Shang, Jialu Liu, Meng Jiang, Xiang Ren, Clare R. Voss, and Jiawei Han. Automated phrase mining from massive text corpora. CoRR, abs/1702.04457, 2017.
[20] Jingbo Shang, Liyuan Liu, Xiang Ren, Xiaotao Gu, Teng Ren, and Jiawei Han. Learning named entity tagger using domain-specific dictionary. CoRR, abs/1809.03599, 2018.
[21] Rupesh Kumar Srivastava, Klaus Greff, and Jürgen Schmidhuber. Highway networks. CoRR, abs/1505.00387, 2015.
[22] Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, and Illia Polosukhin. Attention is all you need. CoRR, abs/1706.03762, 2017.
[23] Fangzhao Wu, Junxin Liu, Chuhan Wu, Yongfeng Huang, and Xing Xie. Neural Chinese named entity recognition via CNN-LSTM-CRF and joint training with word segmentation. CoRR, abs/1905.01964, 2019.
[24] Wei Xue and Tao Li. Aspect based sentiment analysis with gated convolutional networks. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2514–2523, Melbourne, Australia, July 2018. Association for Computational Linguistics.
[25] Yongxin Yang and Timothy M. Hospedales. Trace norm regularised deep multi-task learning. CoRR, abs/1606.04038, 2016.
[27] Da Yin, Xiao Liu, and Xiaojun Wan. Interactive multi-grained joint model for targeted sentiment analysis. In Proceedings of the 28th ACM International Conference on Information and Knowledge Management, CIKM '19, pages 1031–1040, New York, NY, USA, 2019. Association for Computing Machinery.
[28] Yue Zhang and Jie Yang. Chinese NER using lattice LSTM, 2018.
[29] Gui-Ru Li. (A study on applying singer identification and role annotation to opinion target analysis of public opinion) [in Chinese]. 2019.