Definition

AdaBoost (Adaptive Boosting) builds a classifier as a weighted sum of weak learners, fit sequentially so that each new learner concentrates on the examples its predecessors misclassified. The combined classifier can achieve low error even when each weak learner is only slightly better than random guessing; however, because misclassified points receive exponentially growing weights, AdaBoost is sensitive to outliers and label noise.

Algorithm

Consider a 2-class problem with training data $(x_1, y_1), \ldots, (x_N, y_N)$, where $y_i \in \{-1, +1\}$.

  1. Initialize the weights $w_i = 1/N$, $i = 1, \ldots, N$
  2. Repeat for $m = 1, \ldots, M$:
    1. Fit a weak classifier that minimizes the weighted error rate; call the fitted classifier $G_m(x)$, and its corresponding error rate $\mathrm{err}_m = \sum_{i=1}^N w_i \, \mathbb{1}(y_i \neq G_m(x_i)) \big/ \sum_{i=1}^N w_i$
    2. Compute $\alpha_m = \log\bigl((1 - \mathrm{err}_m)/\mathrm{err}_m\bigr)$
    3. Update the weights by $w_i \leftarrow w_i \exp\bigl(\alpha_m \, \mathbb{1}(y_i \neq G_m(x_i))\bigr)$, $i = 1, \ldots, N$
  3. Output the classifier $G(x) = \operatorname{sign}\bigl(\sum_{m=1}^M \alpha_m G_m(x)\bigr)$
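The steps above can be sketched in plain numpy, using exhaustive decision stumps as the weak learners (the stump form and the helper names `fit_stump`/`stump_predict` are illustrative choices, not part of the algorithm itself):

```python
import numpy as np

def stump_predict(X, stump):
    """Predict with a one-split stump: pol * sign(x[j] - thr)."""
    j, thr, pol = stump
    pred = pol * np.sign(X[:, j] - thr)
    pred[pred == 0] = pol          # break ties at the threshold
    return pred

def fit_stump(X, y, w):
    """Exhaustively pick the stump minimizing the weighted error (step 2.1)."""
    best, best_err = (0, 0.0, 1), np.inf
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                err = np.sum(w * (stump_predict(X, (j, thr, pol)) != y))
                if err < best_err:
                    best_err, best = err, (j, thr, pol)
    return best

def adaboost(X, y, M=10):
    """Discrete AdaBoost for labels y in {-1, +1}, following the steps above."""
    n = len(y)
    w = np.full(n, 1.0 / n)                    # step 1: uniform weights
    stumps, alphas = [], []
    for _ in range(M):
        stump = fit_stump(X, y, w)             # step 2.1: weighted weak learner
        miss = (stump_predict(X, stump) != y).astype(float)
        err = np.clip(np.sum(w * miss) / np.sum(w), 1e-10, 1 - 1e-10)
        alpha = np.log((1 - err) / err)        # step 2.2
        w = w * np.exp(alpha * miss)           # step 2.3: upweight mistakes
        w /= w.sum()                           # normalization (for stability)
        stumps.append(stump)
        alphas.append(alpha)
    def predict(Xq):                           # step 3: sign of weighted vote
        return np.sign(sum(a * stump_predict(Xq, s)
                           for a, s in zip(alphas, stumps)))
    return predict

# Toy usage: the pattern + + - - + + is not separable by any single stump,
# but a weighted vote of a few stumps classifies it perfectly.
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0], [6.0]])
y = np.array([1, 1, -1, -1, 1, 1])
predict = adaboost(X, y, M=5)
print((predict(X) == y).mean())   # → 1.0
```

The `np.clip` on the error rate guards against $\mathrm{err}_m \in \{0, 1\}$, where $\alpha_m$ would be infinite; renormalizing the weights each round does not change the algorithm but keeps the exponentials numerically stable.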

Facts

AdaBoost is equivalent to forward stagewise additive modeling (and hence to gradient boosting) using the exponential loss $L(y, f(x)) = \exp(-y f(x))$.
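A sketch of why, following the standard forward-stagewise derivation ($\mathrm{err}_m$ and $w_i$ are as in the algorithm above):

```latex
% At stage m, add the pair (\alpha, G) minimizing the exponential loss:
(\alpha_m, G_m)
  = \arg\min_{\alpha, G} \sum_{i=1}^N
      \exp\!\bigl(-y_i \,(f_{m-1}(x_i) + \alpha\, G(x_i))\bigr)
  = \arg\min_{\alpha, G} \sum_{i=1}^N
      w_i^{(m)} \exp\!\bigl(-\alpha\, y_i\, G(x_i)\bigr),
\qquad w_i^{(m)} = \exp\!\bigl(-y_i f_{m-1}(x_i)\bigr).
% For fixed \alpha > 0 the minimizing G is exactly the weighted-error
% minimizer of step 2.1, and setting \partial/\partial\alpha = 0 gives
\alpha_m = \tfrac{1}{2} \log \frac{1 - \mathrm{err}_m}{\mathrm{err}_m},
% i.e. AdaBoost's coefficient up to a factor of 2, while the implied update
% w_i^{(m+1)} = w_i^{(m)} \exp\!\bigl(-\alpha_m\, y_i\, G_m(x_i)\bigr)
% reproduces, up to normalization, the reweighting rule of step 2.3.
```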