
Harnessing Pre-Trained Neural Networks with Rules for Formality Style Transfer

Formality text style transfer plays an important role in various NLP applications, such as non-native speaker assistants and child education. Early studies normalized informal sentences with rules, before statistical and neural models became the prevailing methods in the field. While rule-based systems remain a common preprocessing step for formality style transfer in the neural era, applying rules naively, such as in data preprocessing, can introduce noise. To mitigate this problem, we study how to harness rules in a state-of-the-art neural network that is typically pretrained on massive corpora. We propose three fine-tuning methods in this paper and achieve a new state of the art on benchmark datasets.

Citation

Y. Wang, Y. Wu, L. Mou, Z. Li, W. Chao. "Harnessing Pre-Trained Neural Networks with Rules for Formality Style Transfer". EMNLP - Conference on Empirical Methods in Natural Language Processing, pp. 3573–3578, November 2019.

Keywords:  
Category: In Conference
Web Links: doi, ACL

BibTeX

@inproceedings{Wang+al:(EMNLP)19,
  author = {Yunli Wang and Yu Wu and Lili Mou and Zhoujun Li and Wenhan Chao},
  title = {Harnessing Pre-Trained Neural Networks with Rules for Formality
    Style Transfer},
  booktitle = {EMNLP - Conference on Empirical Methods in Natural Language
    Processing},
  pages = {3573--3578},
  year = 2019,
}

Last Updated: February 02, 2021
