Exponential weighting techniques have had a major impact on machine learning, both in theory and in practice. First, they provide a general theoretical framework for online (adaptive) learning, in which one must adapt to an unknown future. Unfortunately, these weighting techniques are inefficient and impractical for large problems. We present a single approach that is simple, efficient, and near-optimal for a wide range of online problems.
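For concreteness, the classical exponential-weighting scheme for learning from expert advice (the multiplicative-weights / Hedge update) can be sketched as follows; the function name, the learning rate `eta`, and the loss format are illustrative choices, not part of the talk:

```python
import math

def hedge(losses, eta=0.5):
    """Multiplicative-weights (Hedge) prediction over n experts.

    losses: a list of rounds, each a list of per-expert losses in [0, 1].
    Returns the algorithm's total expected loss over all rounds.
    """
    n = len(losses[0])
    w = [1.0] * n  # one weight per expert, initially uniform
    total = 0.0
    for round_losses in losses:
        W = sum(w)
        p = [wi / W for wi in w]  # probability of following each expert
        total += sum(pi * li for pi, li in zip(p, round_losses))
        # exponentially down-weight experts in proportion to their loss
        w = [wi * math.exp(-eta * li) for wi, li in zip(w, round_losses)]
    return total
```

With two experts where the first never errs, the algorithm's cumulative loss stays bounded by a constant (independent of the horizon), which is the "no-regret" guarantee the abstract alludes to.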
Next, we discuss the problem of boosting, in which one tries to improve the accuracy of an arbitrary learning procedure. Here exponential weighting techniques have consistently been the most popular method and are an excellent example of a theoretical idea having a large practical impact. The widely recognized downside is that these boosting algorithms perform poorly on noisy data. We show that lesser-known boosting techniques, namely those based on decision graphs, are noise tolerant and can be made to achieve the same optimal efficiency rates.
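The popular exponential-weighting boosting method referred to above is AdaBoost; a minimal sketch with one-dimensional threshold stumps as the weak learner is below. The stump weak learner and the parameter `T` are illustrative assumptions, and this shows the standard algorithm rather than the decision-graph variants discussed in the talk:

```python
import math

def adaboost(X, y, T=10):
    """AdaBoost with threshold stumps on 1-D inputs.

    X: list of floats; y: list of labels in {-1, +1}.
    Returns a list of weighted weak hypotheses (alpha, threshold, sign).
    """
    n = len(X)
    D = [1.0 / n] * n  # distribution over training examples
    ensemble = []
    for _ in range(T):
        # weak learner: the stump (threshold, sign) of lowest weighted error
        best = None
        for thr in sorted(set(X)):
            for s in (+1, -1):
                err = sum(d for d, x, yi in zip(D, X, y)
                          if (s if x >= thr else -s) != yi)
                if best is None or err < best[0]:
                    best = (err, thr, s)
        err, thr, s = best
        err = min(max(err, 1e-10), 1 - 1e-10)  # clamp to avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, thr, s))
        # exponentially re-weight: misclassified examples gain weight
        D = [d * math.exp(-alpha * yi * (s if x >= thr else -s))
             for d, x, yi in zip(D, X, y)]
        Z = sum(D)
        D = [d / Z for d in D]
    return ensemble

def predict(ensemble, x):
    """Weighted majority vote of the weak hypotheses."""
    score = sum(a * (s if x >= thr else -s) for a, thr, s in ensemble)
    return 1 if score >= 0 else -1
```

The exponential re-weighting step is exactly what makes the method sensitive to noise: mislabeled examples accumulate weight rapidly, which is the failure mode the decision-graph approach avoids.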
Date and Time
Wednesday, March 26, 2003, 4:00pm - 5:30pm
Location
Computer Science Small Auditorium (Room 105)
Event Type
Speaker
Adam Kalai, from MIT
Host
Amit Sahai
Website