References
[1] Zhao Chen, Vijay Badrinarayanan, Chen-Yu Lee, and Andrew Rabinovich. GradNorm: Gradient normalization for adaptive loss balancing in deep multitask networks. In ICML, 2018.
[2] Wei-Cheng Chiu. Joint learning of aspect-level sentiment analysis and singer named recognition from social networks. In International Conference on Technologies and Applications of Artificial Intelligence (TAAI), 2020.
[3] Wei-Cheng Chiu. Joint learning of aspect-level sentiment analysis and singer named recognition from social networks. In International Conference on Technologies and Applications of Artificial Intelligence (TAAI), 2020.
[4] Chien-Lung Chou, Chia-Hui Chang, and Ya-Yun Huang. Boosted web named entity recognition via tri-training. ACM Trans. Asian Low-Resour. Lang. Inf. Process., 16(2), October 2016.
[5] Ronan Collobert, Jason Weston, Léon Bottou, Michael Karlen, Koray Kavukcuoglu, and Pavel Kuksa. Natural language processing (almost) from scratch. J. Mach. Learn. Res., 12:2493–2537, November 2011.
[6] Wenyuan Dai, Qiang Yang, Gui-Rong Xue, and Yong Yu. Boosting for transfer learning. In Proceedings of the 24th International Conference on Machine Learning, ICML '07, pages 193–200, New York, NY, USA, 2007. Association for Computing Machinery.
[7] Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. BERT: Pre-training of deep bidirectional transformers for language understanding. In NAACL-HLT, 2019.
[8] Yaroslav Ganin and Victor Lempitsky. Unsupervised domain adaptation by backpropagation. In Proceedings of the 32nd International Conference on International Conference on Machine Learning - Volume 37, ICML'15, pages 1180–1189. JMLR.org, 2015.
[9] Ruidan He, Wee Sun Lee, Hwee Tou Ng, and Daniel Dahlmeier. An interactive multi-task learning network for end-to-end aspect-based sentiment analysis. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 504–515, 2019.
[10] Mengting Hu, Yike Wu, Shiwan Zhao, Honglei Guo, Renhong Cheng, and Zhong Su. Domain-invariant feature distillation for cross-domain sentiment classification. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing, EMNLP-IJCNLP 2019, Hong Kong, China, November 3-7, 2019, pages 5558–5567, 2019.
[11] Zhiheng Huang, Wei Xu, and Kai Yu. Bidirectional LSTM-CRF models for sequence tagging. CoRR, abs/1508.01991, 2015.
[12] Chen Jia, Xiaobo Liang, and Yue Zhang. Cross-domain NER using cross-domain language modeling. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 2464–2474. Association for Computational Linguistics, July 2019.
[13] John D. Lafferty, Andrew McCallum, and Fernando C. N. Pereira. Conditional random fields: Probabilistic models for segmenting and labeling sequence data. In Proceedings of the Eighteenth International Conference on Machine Learning, ICML '01, pages 282–289. Morgan Kaufmann Publishers Inc., 2001.
[14] Peng-Hsuan Li, Tsu-Jui Fu, and Wei-Yun Ma. Why attention? Analyze BiLSTM deficiency and its remedies in the case of NER. In AAAI, 2020.
[15] Zheng Li, Xin Li, Ying Wei, Lidong Bing, Y. Zhang, and Qiang Yang. Transferable end-to-end aspect-based sentiment analysis with selective adversarial learning, 2019.
[16] Shikun Liu, Edward Johns, and A. Davison. End-to-end multi-task learning with attention. In 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), pages 1871–1880, 2019.
[17] Andrew McCallum, Dayne Freitag, and Fernando C. N. Pereira. Maximum entropy Markov models for information extraction and segmentation. In Proceedings of the Seventeenth International Conference on Machine Learning, ICML '00, pages 591–598. Morgan Kaufmann Publishers Inc., 2000.
[18] Lilyana Mihalkova, Tuyen Huynh, and Raymond J. Mooney. Mapping and revising Markov logic networks for transfer learning. In Proceedings of the 22nd National Conference on Artificial Intelligence - Volume 1, AAAI'07, pages 608–614. AAAI Press, 2007.
[19] Tomas Mikolov, Ilya Sutskever, Kai Chen, Greg Corrado, and Jeffrey Dean. Distributed representations of words and phrases and their compositionality. In Proceedings of the 26th International Conference on Neural Information Processing Systems - Volume 2, NIPS'13, pages 3111–3119. Curran Associates Inc., 2013.
[20] Sinno Jialin Pan, Ivor W. Tsang, James T. Kwok, and Qiang Yang. Domain adaptation via transfer component analysis. IEEE Transactions on Neural Networks, pages 199–210, 2011.
[21] Sinno Jialin Pan and Qiang Yang. A survey on transfer learning. IEEE Transactions on Knowledge and Data Engineering, 22(10):1345–1359, 2010.
[22] Jeffrey Pennington, Richard Socher, and Christopher D. Manning. GloVe: Global vectors for word representation. In EMNLP, volume 14, pages 1532–1543, 2014.
[23] Matthew E. Peters, Mark Neumann, Mohit Iyyer, Matt Gardner, Christopher Clark, Kenton Lee, and Luke Zettlemoyer. Deep contextualized word representations. In NAACL-HLT, 2018. arXiv:1802.05365.
[24] Lawrence R. Rabiner. A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition, pages 267–296. Morgan Kaufmann Publishers Inc., 1990.
[25] Alec Radford and Ilya Sutskever. Improving language understanding by generative pre-training. Preprint, 2018.
[26] Kumar Ravi and V. Ravi. A survey on opinion mining and sentiment analysis: Tasks, approaches and applications. Knowl. Based Syst., 89:14–46, 2015.
[27] K. Schouten and F. Frasincar. Survey on aspect-level sentiment analysis. IEEE Transactions on Knowledge and Data Engineering, 28:813–830, 2016.
[28] Jingbo Shang, Liyuan Liu, Xiang Ren, X. Gu, Teng Ren, and Jiawei Han. Learning named entity tagger using domain-specific dictionary. In EMNLP, 2018.
[29] Duyu Tang, Bing Qin, Xiaocheng Feng, and Ting Liu. Effective LSTMs for target-dependent sentiment classification. In Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers, pages 3298–3307, 2016.
[30] Yequan Wang, Minlie Huang, Xiaoyan Zhu, and Li Zhao. Attention-based LSTM for aspect-level sentiment classification. In Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, pages 606–615. Association for Computational Linguistics, November 2016.
[31] Hu Xu, Bing Liu, Lei Shu, and Philip S. Yu. BERT post-training for review reading comprehension and aspect-based sentiment analysis. In NAACL, 2019.
[32] Wei Xue and Tao Li. Aspect based sentiment analysis with gated convolutional networks. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2514–2523. Association for Computational Linguistics, July 2018.
[33] Kai Zhang, Hefu Zhang, Qi Liu, Hongke Zhao, Hengshu Zhu, and Enhong Chen. Interactive attention transfer network for cross-domain sentiment classification. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01):5773–5780, July 2019.
[34] Fuzhen Zhuang, Zhiyuan Qi, Keyu Duan, Dongbo Xi, Yongchun Zhu, Hengshu Zhu, Hui Xiong, and Qing He. A comprehensive survey on transfer learning. Proceedings of the IEEE, 109(1):43–76, January 2021.