
Flexible Machine Learning for Hard Language Problems

Date: 

Thursday, September 14, 2006 - 11:00am

Speaker: 

Hal Daume, University of Utah

Solving computationally hard problems, such as those commonly encountered in natural language processing and computational biology, often requires that approximate search methods be used to produce a structured output (e.g., machine translation, speech recognition, protein folding). Unfortunately, this fact is rarely taken into account when machine learning methods are conceived and employed. This leads to complex algorithms with few theoretical guarantees about performance on unseen test data. I present a machine learning approach that directly solves "structured prediction" problems by considering formal techniques that reduce structured prediction to simple binary classification, within the context of search. This reduction is error-limiting: it provides theoretical guarantees about the performance of the structured prediction model on unseen test data. It also lends itself to novel training methods for structured prediction models, yielding efficient learning algorithms that perform well in practice. I empirically evaluate this approach in the context of two tasks: entity detection and tracking, and automatic document summarization.
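As a rough illustration of the reduction the abstract describes, the sketch below (Python, using scikit-learn) trains an ordinary classifier to choose the next action at each step of a greedy search over a toy sequence-labeling task. The task, feature map, and data are hypothetical stand-ins, not the speaker's actual system, and the sketch omits the error-limiting analysis the talk addresses.

    # A minimal sketch of reducing structured prediction to classification
    # inside greedy search. All names and data here are illustrative
    # assumptions, not the speaker's implementation.
    from sklearn.feature_extraction import DictVectorizer
    from sklearn.linear_model import LogisticRegression

    def features(tokens, i, prev_tag):
        # Hypothetical feature map over the search state (position, previous tag).
        return {"tok=" + tokens[i].lower(): 1.0,
                "prev=" + prev_tag: 1.0,
                ("first" if i == 0 else "not_first"): 1.0}

    def make_examples(sentences, tag_seqs):
        # The reduction: each step along the oracle (gold) search path
        # becomes one ordinary classification example.
        X, y = [], []
        for tokens, tags in zip(sentences, tag_seqs):
            prev = "<s>"
            for i, tag in enumerate(tags):
                X.append(features(tokens, i, prev))
                y.append(tag)
                prev = tag
        return X, y

    # Toy training data, purely for illustration.
    sentences = [["Alice", "visited", "Utah"], ["Bob", "likes", "Alice"]]
    tag_seqs  = [["PER", "O", "LOC"],          ["PER", "O", "PER"]]

    X, y = make_examples(sentences, tag_seqs)
    vec = DictVectorizer()
    clf = LogisticRegression(max_iter=1000).fit(vec.fit_transform(X), y)

    def predict(tokens):
        # Greedy search: the learned classifier picks the next action
        # at each step, conditioning on its own previous decision.
        tags, prev = [], "<s>"
        for i in range(len(tokens)):
            tag = clf.predict(vec.transform([features(tokens, i, prev)]))[0]
            tags.append(tag)
            prev = tag
        return tags

    print(predict(["Alice", "visited", "Utah"]))

Because each search step is just a classification decision, the classifier's error rate bounds how often the search path goes wrong, which is the intuition behind the theoretical guarantees mentioned above.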

Biography: 

Hal Daume III is an Assistant Professor of Computer Science at the University of Utah. He received his Ph.D. in Computer Science from the University of Southern California in 2006 and his Bachelor's degree from the Department of Mathematical Sciences at Carnegie Mellon University in 2001. His interests are in developing and applying advanced machine learning techniques to problems that exhibit complex structure, such as those found in natural language. He has successfully applied variational inference and expectation propagation techniques to unsupervised learning problems, and has applied nonparametric infinite Bayesian models to problems in supervised clustering.



