Learning compact models from large, high-dimensional datasets

[[{"fid":"524","view_mode":"embedded_left","fields":{"format":"embedded_left","field_file_image_alt_text[und][0][value]":"Yoram Singer","field_file_image_title_text[und][0][value]":"","field_file_caption_credit[und][0][value]":"%3Cp%3EYoram%20Singer%3C%2Fp%3E%0A","field_file_caption_credit[und][0][format]":"full_html"},"type":"media","attributes":{"alt":"Yoram Singer","height":331,"width":250,"class":"media-element file-embedded-left"},"link_text":null}]]I review the design, analysis, and implementation of stochastic optimization techniques, online algorithms, and modeling approaches for learning in high dimensional spaces using large amounts of data. The focus is on algorithms and models that are efficient, accurate, and yield compact models. Concretely, the forward-backward shrinkage algorithm (Fobos), mirror descent for learning composite objectives (COMID), and the adaptive gradient (AdaGrad) algorithm. We also discuss simple yet effective modeling approaches based on locality for learning from high dimensional data.

Yoram Singer is a senior research scientist at Google. From 1999 through 2007 he was an associate professor at the Hebrew University of Jerusalem, and from 1995 through 1999 he was a member of the technical staff at AT&T Research. He was co-chair of the Conference on Computational Learning Theory (COLT) in 2004 and of Neural Information Processing Systems (NIPS) in 2007. He serves as an editor of the Journal of Machine Learning Research, IEEE Signal Processing Magazine, and IEEE Transactions on Pattern Analysis and Machine Intelligence.

Date and Time
Monday, February 9, 2015, 4:30pm - 5:30pm
Location
Computer Science Small Auditorium (Room 105)
Host
Sebastian Seung

