Deep Semantics from Shallow Supervision

What is the total population of the ten largest capitals in the US? Building a system to answer free-form questions such as this requires modeling the deep semantics of language. But to develop practical, scalable systems, we want to avoid the costly manual annotation of these deep semantic structures and instead learn from just surface-level supervision, e.g., question/answer pairs. To this end, we develop a new tree-based semantic representation which has favorable linguistic and computational properties, along with an algorithm that induces this hidden representation. Using our approach, we obtain significantly higher accuracy on the task of question answering compared to existing state-of-the-art methods, despite using less supervision.
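
To make the flavor of such a tree-based representation concrete, here is a minimal sketch in Python that evaluates a question tree against a toy database. Everything in it (the operators lookup, topk, and sum_pop, and the CAPITALS table with illustrative figures) is invented for exposition and is not the representation presented in the talk:

# Hypothetical sketch: evaluate a tree-shaped semantic representation
# against a toy database. Operator names and figures are invented.

# Toy database: a few US state capitals with illustrative populations.
CAPITALS = {
    "Phoenix": 1608000,
    "Austin": 961000,
    "Columbus": 905000,
    "Indianapolis": 887000,
    "Denver": 715000,
}

def evaluate(node):
    """Recursively evaluate a semantic tree of (operator, *args) tuples."""
    op, *args = node
    if op == "lookup":     # lookup("capitals") -> list of (city, population)
        return list(CAPITALS.items())
    if op == "topk":       # topk(k, child) -> the k largest by population
        k, child = args
        return sorted(evaluate(child), key=lambda kv: kv[1], reverse=True)[:k]
    if op == "sum_pop":    # sum_pop(child) -> total population of child's cities
        return sum(pop for _, pop in evaluate(child))
    raise ValueError(f"unknown operator: {op}")

# "Total population of the ten largest capitals" as an operator tree:
question_tree = ("sum_pop", ("topk", 10, ("lookup", "capitals")))
print(evaluate(question_tree))  # sums the (up to) ten largest entries

The point of the sketch is the shape of the problem: the tree itself is never annotated. A learner that sees only the question and the final number must induce a structure like question_tree on its own.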

Percy Liang obtained a B.S. (2004) and an M.S. (2005) from MIT and is now completing his Ph.D. at UC Berkeley with Michael Jordan and Dan Klein. The general theme of his research, which spans machine learning and natural language processing, is learning richly-structured statistical models from limited supervision. He won a best student paper award at the International Conference on Machine Learning in 2008, has received the NSF, GAANN, and NDSEG fellowships, and is a 2010 Siebel Scholar.

Date and Time
Wednesday, April 6, 2011, 4:30pm - 5:30pm
Location
Computer Science Small Auditorium (Room 105)
Speaker
Percy Liang
Host
David Blei