
Formality Style Transfer with Shared Latent Space

Full Text: 2020.coling-main.203.pdf

Conventional approaches to formality style transfer borrow models from neural machine translation, which typically require massive parallel data for training. However, the datasets available for formality style transfer are considerably smaller than translation corpora. Moreover, we observe that informal and formal sentences closely resemble each other, unlike the translation task, where the two languages have different vocabularies and grammars. In this paper, we present a new approach, Sequence-to-Sequence with Shared Latent Space (S2S-SLS), for formality style transfer, in which we propose two auxiliary losses and adopt joint training of bi-directional transfer and auto-encoding. Experimental results show that S2S-SLS (with either RNN or Transformer architectures) consistently outperforms baselines in various settings, especially when data is limited.
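The joint training described above combines several objectives: transfer losses in both directions (informal→formal and formal→informal), auto-encoding losses for each style, and the two auxiliary losses. The exact formulation and weights are given in the paper; the sketch below is only a hypothetical illustration of how such per-objective losses might be combined into a single training loss, with all names and weights assumed for illustration.

```python
def joint_loss(l_inf2for, l_for2inf, l_ae_inf, l_ae_for, l_aux,
               w_ae=1.0, w_aux=1.0):
    """Hypothetical combination of the joint-training objectives.

    l_inf2for, l_for2inf : bi-directional transfer losses
    l_ae_inf, l_ae_for   : auto-encoding losses for each style
    l_aux                : sum of the auxiliary losses
    w_ae, w_aux          : assumed weighting hyperparameters (not from the paper)
    """
    return (l_inf2for + l_for2inf
            + w_ae * (l_ae_inf + l_ae_for)
            + w_aux * l_aux)


# Example: combine scalar losses from one training step.
total = joint_loss(1.0, 1.0, 0.5, 0.5, 0.2)
```

In practice each term would be a differentiable loss (e.g. cross-entropy over decoder outputs) computed on the shared latent space, and the weighted sum would be backpropagated through all components jointly.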

Citation

Y. Wang, Y. Wu, L. Mou, Z. Li, W. Chao. "Formality Style Transfer with Shared Latent Space". Conference on Computational Linguistics (COLING), pp. 2236-2249, December 2020.

Category: In Conference
Web Links: doi, ACL

BibTeX

@inproceedings{Wang+al:COLING20,
  author = {Yunli Wang and Yu Wu and Lili Mou and Zhoujun Li and Wenhan Chao},
  title = {Formality Style Transfer with Shared Latent Space},
  booktitle = {Conference on Computational Linguistics (COLING)},
  pages = {2236--2249},
  year = {2020},
}

Last Updated: February 01, 2021
Submitted by Sabina P
