References
Bahdanau, D., Cho, K., Bengio, Y., 2014. Neural Machine Translation by Jointly Learning to Align and Translate. arXiv:1409.0473 [cs, stat].
Bottou, L., 2010. Large-Scale Machine Learning with Stochastic Gradient Descent, in: Proceedings of COMPSTAT'2010. pp. 177–186.
Cheng, J., Dong, L., Lapata, M., 2016. Long Short-Term Memory-Networks for Machine Reading. arXiv:1601.06733 [cs].
Cho, K., van Merrienboer, B., Gulcehre, C., Bahdanau, D., Bougares, F., Schwenk, H., Bengio, Y., 2014. Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation. arXiv:1406.1078 [cs, stat].
Cornell Movie-Dialogs Corpus [WWW Document], 2019. URL https://www.cs.cornell.edu/~cristian/Cornell_Movie-Dialogs_Corpus.html (accessed 3.29.19).
Dai, Z., Yang, Z., Yang, Y., Carbonell, J., Le, Q.V., Salakhutdinov, R., 2019. Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context. arXiv:1901.02860 [cs, stat].
Danescu-Niculescu-Mizil, C., Lee, L., 2010. Chameleons in Imagined Conversations: A New Approach to Understanding Coordination of Linguistic Style in Dialogs. Association for Computational Linguistics.
Devlin, J., Chang, M.-W., Lee, K., Toutanova, K., 2018. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. arXiv:1810.04805 [cs].
Dong, J., Huang, J., 2018. Enhance Word Representation for Out-of-Vocabulary on Ubuntu Dialogue Corpus. arXiv:1802.02614 [cs].
Edunov, S., Ott, M., Auli, M., Grangier, D., 2018. Understanding Back-Translation at Scale. arXiv:1808.09381 [cs].
Gehring, J., Auli, M., Grangier, D., Yarats, D., Dauphin, Y.N., 2017. Convolutional Sequence to Sequence Learning, in: International Conference on Machine Learning.
Ghazvininejad, M., Brockett, C., Chang, M.-W., Dolan, B., Gao, J., Yih, W., Galley, M., 2018. A Knowledge-Grounded Neural Conversation Model, in: Thirty-Second AAAI Conference on Artificial Intelligence.
Graves, A., Wayne, G., Danihelka, I., 2014. Neural Turing Machines. arXiv:1410.5401 [cs].
Hahnloser, R.H.R., Sarpeshkar, R., Mahowald, M.A., Douglas, R.J., Seung, H.S., 2000. Digital selection and analogue amplification coexist in a cortex-inspired silicon circuit. Nature 405, 947–951.
He, K., Zhang, X., Ren, S., Sun, J., 2016. Deep Residual Learning for Image Recognition, in: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. pp. 770–778.
Hochreiter, S., 1998. The Vanishing Gradient Problem During Learning Recurrent Neural Nets and Problem Solutions. Int. J. Uncertain. Fuzziness Knowl.-Based Syst. 06, 107–116.
Hochreiter, S., Schmidhuber, J., 1997. Long Short-Term Memory. Neural Comput. 9, 1735–1780.
Lempinen, K., 2017. What are Chatbots and how they impact Service Management [WWW Document]. URL http://www.lempinenpartners.com/what-are-chatbots-and-how-they-impact-service-management/ (accessed 3.28.19).
Konstas, I., Iyer, S., Yatskar, M., Choi, Y., Zettlemoyer, L., 2017. Neural AMR: Sequence-to-Sequence Models for Parsing and Generation. arXiv:1704.08381 [cs].
Li, J., Galley, M., Brockett, C., Spithourakis, G., Gao, J., Dolan, W.B., 2016a. A Persona-Based Neural Conversation Model, in: Proceedings of the Annual Meeting of the Association for Computational Linguistics. pp. 994–1003.
Li, J., Monroe, W., Ritter, A., Jurafsky, D., Galley, M., Gao, J., 2016b. Deep Reinforcement Learning for Dialogue Generation, in: Proceedings of the Conference on Empirical Methods in Natural Language Processing. pp. 1192–1202.
Lison, P., Tiedemann, J., 2016. OpenSubtitles2016: Extracting Large Parallel Corpora from Movie and TV Subtitles, in: Proceedings of the 10th International Conference on Language Resources and Evaluation (LREC 2016).
Liu, C.-W., Lowe, R., Serban, I., Noseworthy, M., Charlin, L., Pineau, J., 2016. How NOT To Evaluate Your Dialogue System: An Empirical Study of Unsupervised Evaluation Metrics for Dialogue Response Generation, in: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing.
Liu, Y., Bi, W., Liu, X., Shi, S., Zhang, H., 2018. Rethinking Neural Dialogue Generation: A Practical Guide. ResearchGate.
Lowe, R., Noseworthy, M., Serban, I.V., Angelard-Gontier, N., Bengio, Y., Pineau, J., 2017. Towards an Automatic Turing Test: Learning to Evaluate Dialogue Responses, in: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. pp. 1116–1126.
Lowe, R., Pow, N., Serban, I., Pineau, J., 2015. The Ubuntu Dialogue Corpus: A Large Dataset for Research in Unstructured Multi-Turn Dialogue Systems, in: Proceedings of the 16th Annual Meeting of the Special Interest Group on Discourse and Dialogue. Association for Computational Linguistics.
Luong, T., Pham, H., Manning, C.D., 2015a. Effective Approaches to Attention-based Neural Machine Translation, in: Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. Association for Computational Linguistics, Lisbon, Portugal.
Luong, T., Sutskever, I., Le, Q., Vinyals, O., Zaremba, W., 2015b. Addressing the Rare Word Problem in Neural Machine Translation. Association for Computational Linguistics.
Manning, C.D., Schütze, H., 1999. Foundations of Statistical Natural Language Processing. MIT Press, Cambridge, MA.
Mikolov, T., Sutskever, I., Chen, K., Corrado, G.S., Dean, J., 2013. Distributed Representations of Words and Phrases and their Compositionality, in: Burges, C.J.C., Bottou, L., Welling, M., Ghahramani, Z., Weinberger, K.Q. (Eds.), Advances in Neural Information Processing Systems 26. Curran Associates, Inc.
Nallapati, R., Zhou, B., dos Santos, C., Gulcehre, C., Xiang, B., 2016. Abstractive Text Summarization using Sequence-to-sequence RNNs and Beyond, in: Proceedings of The 20th SIGNLL Conference on Computational Natural Language Learning. Association for Computational Linguistics.
OpenSubtitles [WWW Document], 2019. URL http://opus.nlpl.eu/OpenSubtitles.php (accessed 3.29.19).
Ott, M., Edunov, S., Grangier, D., Auli, M., 2018. Scaling Neural Machine Translation. Association for Computational Linguistics.
Papineni, K., Roukos, S., Ward, T., Zhu, W.-J., 2002. BLEU: a Method for Automatic Evaluation of Machine Translation. Association for Computational Linguistics.
Rumelhart, D.E., Hinton, G.E., Williams, R.J., 1988. Learning Representations by Back-Propagating Errors, in: Anderson, J.A., Rosenfeld, E. (Eds.), Neurocomputing: Foundations of Research. MIT Press, pp. 696–699.
Schuster, M., Paliwal, K.K., 1997. Bidirectional Recurrent Neural Networks. IEEE Trans. Signal Process. 45, 2673–2681.
Serban, I.V., Sordoni, A., Bengio, Y., Courville, A., Pineau, J., 2016. Building End-To-End Dialogue Systems Using Generative Hierarchical Neural Network Models, in: Thirtieth AAAI Conference on Artificial Intelligence.
Serban, I.V., Sordoni, A., Lowe, R., Charlin, L., Pineau, J., Courville, A., Bengio, Y., 2017. A Hierarchical Latent Variable Encoder-Decoder Model for Generating Dialogues, in: Thirty-First AAAI Conference on Artificial Intelligence.
Song, Y., Yan, R., Li, X., Zhao, D., Zhang, M., 2016. Two are Better than One: An Ensemble of Retrieval- and Generation-Based Dialog Systems. arXiv:1610.07149 [cs].
Srivastava, N., Hinton, G., Krizhevsky, A., Sutskever, I., Salakhutdinov, R., 2014. Dropout: A Simple Way to Prevent Neural Networks from Overfitting. J. Mach. Learn. Res. 15, 1929–1958.
Sutskever, I., Vinyals, O., Le, Q.V., 2014. Sequence to Sequence Learning with Neural Networks, in: Ghahramani, Z., Welling, M., Cortes, C., Lawrence, N.D., Weinberger, K.Q. (Eds.), Advances in Neural Information Processing Systems. Curran Associates, Inc., pp. 3104–3112.
Tang, G., Müller, M., Rios, A., Sennrich, R., 2018. Why Self-Attention? A Targeted Evaluation of Neural Machine Translation Architectures, in: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Association for Computational Linguistics, Brussels, Belgium, pp. 4263–4272.
Tao, C., Gao, S., Shang, M., Wu, W., Zhao, D., Yan, R., 2018a. Get The Point of My Utterance! Learning Towards Effective Responses with Multi-Head Attention Mechanism, in: IJCAI 2018.
Tao, C., Mou, L., Zhao, D., Yan, R., 2018b. RUBER: An Unsupervised Method for Automatic Evaluation of Open-Domain Dialog Systems, in: AAAI 2018. pp. 722–729.
Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A.N., Kaiser, Ł., Polosukhin, I., 2017. Attention Is All You Need, in: Advances in Neural Information Processing Systems 30. Curran Associates, Inc., pp. 5998–6008.
Vinyals, O., Le, Q., 2015. A Neural Conversational Model. arXiv:1506.05869 [cs].
Weng, L., 2018. Attention? Attention! [WWW Document]. URL https://lilianweng.github.io/lil-log/2018/06/24/attention-attention.html#summary (accessed 3.19.19).
Xing, C., Wu, Y., Wu, W., Huang, Y., Zhou, M., 2018. Hierarchical Recurrent Attention Network for Response Generation, in: Thirty-Second AAAI Conference on Artificial Intelligence.
Xu, K., Ba, J., Kiros, R., Cho, K., Courville, A., Salakhutdinov, R., Zemel, R.S., Bengio, Y., 2015. Show, Attend and Tell: Neural Image Caption Generation with Visual Attention, in: International Conference on Machine Learning.
Yao, K., Zweig, G., Peng, B., 2015. Attention with Intention for a Neural Network Conversation Model. arXiv:1510.08565 [cs].
Zhao, T., Lu, A., Lee, K., Eskenazi, M., 2017. Generative Encoder-Decoder Models for Task-Oriented Spoken Dialog Systems with Chatting Capability. arXiv:1706.08476 [cs].