Dan Goldwasser, Vivek Srikumar and Dan Roth
NAACL HLT 2012 Tutorial Abstracts, June 2012.

Abstract

Making decisions in natural language processing often involves assigning values to sets of interdependent variables, where the expressive dependency structure can influence, or even dictate, which assignments are possible. This setting covers a broad range of structured prediction problems, such as semantic role labeling, named entity and relation recognition, co-reference resolution, dependency parsing and semantic parsing. It is also appropriate for cases that require making global decisions involving multiple components, possibly pre-designed or pre-learned, as in summarization, paraphrasing, textual entailment and question answering. In all these cases, it is natural to formulate the decision problem as a constrained optimization problem, with an objective function composed of learned models, subject to domain- or problem-specific constraints.
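One common way to write this constrained-optimization view (following the usual CCM notation; the symbols below are the standard ones from the literature, not taken from this abstract) is

```latex
y^{*} \;=\; \arg\max_{y \in \mathcal{Y}} \;\sum_{i} w_i\, \phi_i(x, y) \;-\; \sum_{k} \rho_k\, d_{C_k}(x, y)
```

where the $\phi_i$ are feature functions whose weights $w_i$ are learned, each $C_k$ is a declarative constraint, $d_{C_k}(x, y)$ measures how far the assignment $y$ is from satisfying $C_k$, and $\rho_k$ is the penalty for violating it ($\rho_k = \infty$ turns $C_k$ into a hard constraint that simply removes assignments from the feasible output space).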

The Constrained Conditional Models (CCM) formulation of NLP problems (also known as Integer Linear Programming for NLP) is a learning and inference framework that augments the learning of conditional (probabilistic or discriminative) models with declarative constraints (written, for example, in a first-order representation). The key advantage of the CCM formulation is its support for making decisions in an expressive output space while maintaining the modularity and tractability of training and inference. In most applications of this framework in NLP, following [Roth & Yih, CoNLL'04], integer linear programming (ILP) has been used as the inference framework, although other algorithms can be used as well.
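To make the idea concrete, here is a minimal sketch of constrained inference in the spirit of the entity/relation example from [Roth & Yih, CoNLL'04]: local classifiers score each variable independently, and a declarative constraint prunes incoherent joint assignments before the argmax is taken. All labels and scores below are made up for illustration; for a toy output space we can simply enumerate rather than call an ILP solver.

```python
from itertools import product

# Hypothetical scores from independently trained local classifiers
# (the numbers are illustrative, not from the tutorial).
ENT_LABELS = ["PER", "LOC", "ORG"]
REL_LABELS = ["lives_in", "works_for", "none"]

ent_scores = {
    "m1": {"PER": 0.6, "LOC": 0.3, "ORG": 0.1},
    "m2": {"PER": 0.5, "LOC": 0.4, "ORG": 0.1},
}
rel_scores = {"lives_in": 0.7, "works_for": 0.2, "none": 0.1}

def satisfies_constraints(e1, e2, rel):
    """Declarative hard constraints: lives_in(x, y) => PER(x) & LOC(y),
    works_for(x, y) => PER(x) & ORG(y)."""
    if rel == "lives_in":
        return e1 == "PER" and e2 == "LOC"
    if rel == "works_for":
        return e1 == "PER" and e2 == "ORG"
    return True  # "none" imposes no typing constraint

def constrained_inference():
    best, best_score = None, float("-inf")
    for e1, e2, rel in product(ENT_LABELS, ENT_LABELS, REL_LABELS):
        if not satisfies_constraints(e1, e2, rel):
            continue  # hard constraints prune the output space
        score = ent_scores["m1"][e1] + ent_scores["m2"][e2] + rel_scores[rel]
        if score > best_score:
            best, best_score = (e1, e2, rel), score
    return best, best_score

print(constrained_inference())
```

Note that the unconstrained argmax here would label both mentions PER (violating the typing of lives_in); the constraint forces the globally coherent assignment (PER, LOC, lives_in) even though LOC is not m2's locally top-scoring label. This is exactly the modularity the abstract describes: the local models are trained independently, and the constraints are applied only at decision time.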

This framework has attracted much attention within the NLP community over the last few years, with multiple papers in all the recent major conferences. Formulating structured prediction as a constrained optimization problem over the output of learned models has several advantages. It allows the incorporation of problem-specific global constraints using a first-order language, thus freeing the developer from (much of the) low-level feature engineering, and it guarantees exact inference. Importantly, it also provides the freedom to decouple model generation (learning) from the constrained inference stage, often simplifying the learning stage as well as the engineering aspects of building an NLP system, while improving the quality of the solutions. These advantages, together with the availability of off-the-shelf solvers, have led to a large variety of NLP tasks being formulated within this framework, including semantic role labeling, syntactic parsing, co-reference resolution, summarization, transliteration and joint information extraction.
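For larger output spaces, enumeration is infeasible and this is where off-the-shelf ILP solvers come in: each candidate label becomes a 0/1 indicator variable, the learned scores become the linear objective, and the declarative constraints become linear inequalities. As a hedged sketch (the scores are made up, and this assumes SciPy >= 1.9 for `scipy.optimize.milp`; real CCM systems typically use dedicated solvers such as Gurobi or GLPK), a toy entity/relation problem could be encoded like this:

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Indicator variables:
#   x[0:3] = m1 in {PER, LOC, ORG}, x[3:6] = m2 in {PER, LOC, ORG},
#   x[6:9] = relation in {lives_in, works_for, none}
scores = np.array([0.6, 0.3, 0.1, 0.5, 0.4, 0.1, 0.7, 0.2, 0.1])
c = -scores  # milp minimizes, so negate to maximize the total score

A = np.zeros((7, 9))
A[0, 0:3] = 1           # exactly one label for m1
A[1, 3:6] = 1           # exactly one label for m2
A[2, 6:9] = 1           # exactly one relation label
A[3, 6], A[3, 0] = 1, -1  # lives_in  => PER(m1):  x6 - x0 <= 0
A[4, 6], A[4, 4] = 1, -1  # lives_in  => LOC(m2):  x6 - x4 <= 0
A[5, 7], A[5, 0] = 1, -1  # works_for => PER(m1):  x7 - x0 <= 0
A[6, 7], A[6, 5] = 1, -1  # works_for => ORG(m2):  x7 - x5 <= 0

lb = np.array([1, 1, 1, -np.inf, -np.inf, -np.inf, -np.inf])
ub = np.array([1, 1, 1, 0, 0, 0, 0])

res = milp(c, constraints=LinearConstraint(A, lb, ub),
           integrality=np.ones(9), bounds=Bounds(0, 1))
print(res.x, -res.fun)  # optimal assignment and its score
```

The solver recovers the globally coherent assignment (m1=PER, m2=LOC, rel=lives_in) directly from the linear encoding; swapping in a different set of declarative constraints only changes the rows of `A`, not the learned scores, which is the decoupling of learning from inference described above.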

The goal of this tutorial is to introduce the framework of Constrained Conditional Models to the broader ACL community, motivate it as a generic framework for structured inference in global NLP decision problems, present some of the key theoretical and practical issues involved in using CCMs, and survey some of its existing applications as a way to promote further development of the framework and additional applications. The tutorial will be useful for senior and junior researchers interested in structured prediction and global decision problems in NLP, providing a concise overview of recent perspectives and research results.

Bib Entry

 
@InProceedings{naaclhlt2012t08,
  title     = {Predicting Structures in NLP: Constrained Conditional Models and Integer Linear Programming in NLP},
  author    = {Goldwasser, Dan and Srikumar, Vivek and Roth, Dan},
  booktitle = {NAACL HLT 2012 Tutorial Abstracts},
  month     = {June},
  year      = {2012},
  address   = {Montr{\'e}al, Canada},
  publisher = {Association for Computational Linguistics},
}