
A Grammar-Based Structural CNN Decoder for Code Generation

Code generation maps a program description to executable source code in a programming language. Existing approaches mainly rely on a recurrent neural network (RNN) as the decoder. However, we find that a program contains significantly more tokens than a natural language sentence, and thus it may be inappropriate for an RNN to capture such a long sequence. In this paper, we propose a grammar-based structural convolutional neural network (CNN) for code generation. Our model generates a program by predicting the grammar rules of the programming language; we design several CNN modules, including the tree-based convolution and pre-order convolution, whose information is further aggregated by dedicated attentive pooling layers. Experimental results on the HearthStone benchmark dataset show that our CNN code generator significantly outperforms the previous state-of-the-art method by 5 percentage points; additional experiments on several semantic parsing tasks demonstrate the robustness of our model. We also conduct in-depth ablation tests to better understand each component of our model.
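
To illustrate the general idea of a tree-based convolution with attentive pooling feeding a grammar-rule classifier, the following is a minimal PyTorch sketch only, not the authors' implementation; the class names, the parent/grandparent convolution window, the bilinear attention scoring, and the rule-inventory size are all assumptions made for illustration.

import torch
import torch.nn as nn
import torch.nn.functional as F


class TreeConv(nn.Module):
    # Tree-based convolution: each AST node is combined with its parent and
    # grandparent along the path to the root (the window size is an assumption).
    def __init__(self, dim):
        super().__init__()
        self.w_self = nn.Linear(dim, dim)
        self.w_parent = nn.Linear(dim, dim)
        self.w_grand = nn.Linear(dim, dim)

    def forward(self, nodes, parent_idx):
        # nodes: (num_nodes, dim); parent_idx[i] = index of node i's parent
        parents = nodes[parent_idx]
        grands = nodes[parent_idx[parent_idx]]
        return torch.tanh(self.w_self(nodes) + self.w_parent(parents) + self.w_grand(grands))


class AttentivePooling(nn.Module):
    # Pools a variable number of node features into one vector, with attention
    # weights scored against a controlling vector (e.g., the encoded description).
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Bilinear(dim, dim, 1)

    def forward(self, feats, control):
        # feats: (num_nodes, dim); control: (dim,)
        weights = F.softmax(self.score(feats, control.expand_as(feats)), dim=0)
        return (weights * feats).sum(dim=0)


class RulePredictor(nn.Module):
    # Scores the next grammar rule from the pooled structural features;
    # num_rules is a hypothetical size of the grammar-rule inventory.
    def __init__(self, dim, num_rules):
        super().__init__()
        self.conv = TreeConv(dim)
        self.pool = AttentivePooling(dim)
        self.out = nn.Linear(dim, num_rules)

    def forward(self, nodes, parent_idx, control):
        pooled = self.pool(self.conv(nodes, parent_idx), control)
        return F.log_softmax(self.out(pooled), dim=-1)


# Toy usage: 5 AST nodes with 64-dimensional embeddings; the root is its own parent.
model = RulePredictor(dim=64, num_rules=100)
nodes = torch.randn(5, 64)
parent_idx = torch.tensor([0, 0, 0, 1, 1])
log_probs = model(nodes, parent_idx, control=torch.randn(64))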

Citation

Z. Sun, Q. Zhu, L. Mou, Y. Xiong, G. Li, L. Zhang. "A Grammar-Based Structural CNN Decoder for Code Generation". National Conference on Artificial Intelligence (AAAI), pp. 7055-7062, January 2019.

Keywords:  
Category: In Conference
Web Links: doi
  AAAI

BibTeX

@inproceedings{Sun+al:AAAI19,
  author = {Zeyu Sun and Qihao Zhu and Lili Mou and Yingfei Xiong and Ge Li and
    Lu Zhang},
  title = {A Grammar-Based Structural CNN Decoder for Code Generation},
  booktitle = {National Conference on Artificial Intelligence (AAAI)},
  pages = {7055--7062},
  year = {2019},
}

Last Updated: February 02, 2021
Submitted by Sabina P
