Boosting is a general method for producing a very accurate classification rule by combining rough and moderately inaccurate "rules of thumb". While rooted in a theoretical framework of machine learning, boosting has been found to perform quite well empirically. In this talk, I will introduce the boosting algorithm AdaBoost, and explain the underlying theory of boosting, including our explanation of why boosting often does not suffer from overfitting. I will also describe some recent applications and extensions of boosting.
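For readers unfamiliar with the algorithm, here is a minimal sketch of the idea: maintain a distribution over training examples, repeatedly fit a weak "rule of thumb" (decision stumps are used here for illustration), and upweight the examples it misclassifies. This is an illustrative NumPy sketch, not material from the talk; the helper names fit_stump, adaboost, and predict are assumptions.

```python
import numpy as np

def fit_stump(X, y, w):
    """Find the single-feature threshold rule ("rule of thumb") with the
    lowest weighted error under the current distribution w."""
    best = (0, 0.0, 1, np.inf)  # (feature, threshold, sign, weighted error)
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (+1, -1):
                pred = np.where(X[:, j] <= thr, sign, -sign)
                err = np.sum(w[pred != y])
                if err < best[3]:
                    best = (j, thr, sign, err)
    return best

def adaboost(X, y, n_rounds=50):
    """AdaBoost for labels in {-1, +1}: reweight examples each round so
    later weak rules focus on the mistakes of earlier ones."""
    n = len(y)
    w = np.full(n, 1.0 / n)                     # start with a uniform distribution
    ensemble = []
    for _ in range(n_rounds):
        j, thr, sign, err = fit_stump(X, y, w)
        err = max(err, 1e-12)                   # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)   # vote weight of this weak rule
        pred = np.where(X[:, j] <= thr, sign, -sign)
        w *= np.exp(-alpha * y * pred)          # upweight misclassified examples
        w /= w.sum()
        ensemble.append((alpha, j, thr, sign))
    return ensemble

def predict(ensemble, X):
    """Final classifier: the sign of the weighted vote of all weak rules."""
    score = np.zeros(len(X))
    for alpha, j, thr, sign in ensemble:
        score += alpha * np.where(X[:, j] <= thr, sign, -sign)
    return np.sign(score)
```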
Date and Time
Wednesday, February 13, 2002, 4:00pm - 5:30pm
Location
Computer Science Small Auditorium (Room 105)
Event Type
Speaker
Robert Schapire, AT&T Labs
Host
Bernard Chazelle