Sequence to Backward and Forward Sequences: A Content-Introducing Approach to Generative Short-Text Conversation

Full Text: C16-1316.pdf

Using neural networks to generate replies in human-computer dialogue systems has attracted increasing attention over the past few years. However, the performance is often unsatisfactory: the neural network tends to generate safe, universally relevant replies that carry little meaning. In this paper, we propose a content-introducing approach to neural network-based generative dialogue systems. We first use pointwise mutual information (PMI) to predict a noun as a keyword, reflecting the main gist of the reply. We then propose seq2BF, a "sequence to backward and forward sequences" model, which generates a reply containing the given keyword. Experimental results show that our approach significantly outperforms traditional sequence-to-sequence models in terms of human evaluation and the entropy measure, and that the predicted keyword can appear at an appropriate position in the reply.
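The sketch below illustrates the two steps the abstract describes; it is not the authors' code. Step 1 scores candidate reply nouns by their summed PMI with the query words and picks the highest-scoring one as the keyword. Step 2 (seq2BF) is outlined in comments only. The training interface, the candidate-noun list, and all helper names here are illustrative assumptions.

```python
import math
from collections import Counter
from itertools import product

def train_pmi(pairs):
    """Estimate PMI from (query_words, reply_words) training pairs.

    PMI(wq, wr) = log p(wq, wr) / (p(wq) * p(wr)), with probabilities
    estimated as per-pair occurrence frequencies (an assumed estimator).
    """
    q_counts, r_counts, joint = Counter(), Counter(), Counter()
    for q_words, r_words in pairs:
        q_counts.update(set(q_words))
        r_counts.update(set(r_words))
        joint.update(product(set(q_words), set(r_words)))
    n = len(pairs)

    def pmi(wq, wr):
        p_joint = joint[(wq, wr)] / n
        if p_joint == 0:
            return float("-inf")  # never co-occurred: worst possible score
        return math.log(p_joint / ((q_counts[wq] / n) * (r_counts[wr] / n)))

    return pmi

def predict_keyword(query_words, candidate_nouns, pmi):
    """Pick the reply noun with the highest total PMI to the query words."""
    return max(candidate_nouns,
               key=lambda w: sum(pmi(q, w) for q in query_words))

# Step 2, in outline: given the predicted keyword k, a backward decoder
# generates the words preceding k (the left half of the reply, emitted
# right-to-left); that sequence is reversed, and a forward decoder
# conditioned on it generates the rest of the reply. Because decoding
# starts from k rather than from the sentence start, k can end up at
# any position in the final reply.
```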

Citation

L. Mou, Y. Song, R. Yan, G. Li, L. Zhang, and Z. Jin. "Sequence to Backward and Forward Sequences: A Content-Introducing Approach to Generative Short-Text Conversation". Conference on Computational Linguistics (COLING), pp. 3349–3358, December 2016.

Category: In Conference
Web Links: ACL

BibTeX

@inproceedings{Mou+al:COLING16,
  author    = {Lili Mou and Yiping Song and Rui Yan and Ge Li and Lu Zhang and Zhi Jin},
  title     = {Sequence to Backward and Forward Sequences: A Content-Introducing
    Approach to Generative Short-Text Conversation},
  booktitle = {Conference on Computational Linguistics (COLING)},
  pages     = {3349--3358},
  year      = {2016},
}
