
Combining Statistical Language Models Via the Latent Maximum Entropy Principle

In this paper, we present a unified probabilistic framework for statistical language modeling that can simultaneously incorporate various aspects of natural language, such as local word interaction, syntactic structure, and semantic document information. Our approach is based on a statistical inference principle we have recently proposed, the latent maximum entropy principle, which allows relationships over hidden features to be e…

Citation

S. Wang, D. Schuurmans, F. Peng, Y. Zhao. "Combining Statistical Language Models Via the Latent Maximum Entropy Principle". Machine Learning Journal (MLJ), 60(1-3), pp 229-250, September 2005.

Keywords: language modeling, N-gram models, latent semantic analysis, machine learning
Category: In Journal

BibTeX

@article{Wang+al:MLJ05,
  author  = {Shaojun Wang and Dale Schuurmans and Fuchun Peng and Yunxin Zhao},
  title   = {Combining Statistical Language Models Via the Latent Maximum Entropy
    Principle},
  journal = {Machine Learning Journal (MLJ)},
  volume  = {60},
  number  = {1-3},
  pages   = {229--250},
  year    = {2005}
}

Last Updated: June 06, 2007
Submitted by Nelson Loyola
