
Improved Large Margin Dependency Parsing Via Local Constraints and Laplacian Regularization

Full Text: large margin.pdf

We present an improved approach for learning dependency parsers from treebank data. Our technique is based on two ideas for improving large margin training in the context of dependency parsing. First, we incorporate local constraints that enforce the correctness of each individual link, rather than just scoring the global parse tree. Second, to cope with sparse data, we smooth the lexical parameters according to their underlying word similarities using Laplacian regularization. To demonstrate the benefits of our approach, we consider the problem of parsing Chinese treebank data using only lexical features, that is, without part-of-speech tags or grammatical categories. We achieve state-of-the-art performance, improving upon current large margin approaches.
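As a rough illustration of the kind of objective the abstract describes, the following LaTeX sketch combines per-link hinge losses with a Laplacian regularizer; the notation (β, f, S, and the exact constraint form) is illustrative and not drawn from the paper itself:

% Sketch: large margin training with per-link (local) constraints and a
% Laplacian regularizer on the lexical weights. Notation is illustrative.
% Link score for candidate head x_j and modifier x_k:
%   s(x_j, x_k) = w^T f(x_j, x_k)
\min_{w}\;\; \beta\, w^{\top} L\, w
  \;+\; \sum_{i} \sum_{\substack{(j,k) \in y_i \\ (l,k) \notin y_i}}
        \max\bigl(0,\; 1 - s(x_j, x_k) + s(x_l, x_k)\bigr)

% Here L = D - S is the graph Laplacian of a word-similarity matrix S
% (D is its diagonal degree matrix), so the regularizer expands to
%   w^{\top} L\, w = \tfrac{1}{2} \sum_{u,v} S_{uv}\, (w_u - w_v)^2,
% which pulls the lexical parameters of similar words toward each other.

Under this reading, the hinge terms enforce a unit margin on each individual link rather than on a single global tree score, and the quadratic term smooths sparse lexical parameters through the word-similarity graph.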

Citation

Q. Wang, C. Cherry, D. Lizotte, D. Schuurmans. "Improved Large Margin Dependency Parsing Via Local Constraints and Laplacian Regularization". Computational Natural Language Learning (CoNLL), June 2006.

Keywords: margin, dependency, Laplacian, regularization, machine learning
Category: In Conference

BibTeX

@inproceedings{Wang+al:CONLL06,
  author = {Qin Iris Wang and Colin Cherry and Dan Lizotte and Dale Schuurmans},
  title = {Improved Large Margin Dependency Parsing Via Local Constraints and
    Laplacian Regularization},
  booktitle = {Computational Natural Language Learning (CoNLL)},
  year = 2006,
}

