Boosting
********

.. module:: ailib.fitting

.. contents::

.. _boosting-ada:

AdaBoost
========

Given :math:`N` training data points and :math:`M` classifiers, the AdaBoost
algorithm does the following:

1. Initialize the weights :math:`w_i = \frac{1}{N}`

2. For :math:`m = 1 \dots M`:

   a) Fit the classifier :math:`G_m` to the training data using the weights
      :math:`w_i`

   b) Compute the classifier error

      .. math::
         \text{err}_m = \frac{ \sum\limits_{i=1}^N w_i I\{ y_i \neq G_m(x_i) \} }{ \sum\limits_{i=1}^N w_i }

   c) Compute the classifier weight

      .. math::
         \alpha_m = \log \left( \frac{ 1 - \text{err}_m }{ \text{err}_m } \right)

   d) Recompute all data weights :math:`w_i`

      .. math::
         w_i = w_i \exp \left( \alpha_m I\{ y_i \neq G_m(x_i) \} \right)

3. The combined classifier is

   .. _boosting-ada-eval:

   .. math::
      G(x) = \text{sign} \left( \sum\limits_{m=1}^M \alpha_m G_m(x) \right)

Interfaces
==========

.. autoclass:: AdaBoost
   :members:
   :show-inheritance:

Examples
========
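The steps above can be sketched in plain NumPy. This is a minimal illustration of
the algorithm, not the ``ailib.fitting.AdaBoost`` API: the helper names
(``fit_stump``, ``adaboost``, ``predict``) and the use of decision stumps as the
base classifiers :math:`G_m` are assumptions made for the example. The error is
clipped away from 0 and 1 so that :math:`\alpha_m` stays finite.

.. code-block:: python

    import numpy as np

    def fit_stump(X, y, w):
        """Step 2a: fit a threshold classifier (decision stump) that
        minimizes the weighted error. Returns (feature, threshold, sign)."""
        best, best_err = None, np.inf
        for j in range(X.shape[1]):
            for t in np.unique(X[:, j]):
                for s in (1, -1):
                    pred = s * np.where(X[:, j] < t, -1, 1)
                    err = np.sum(w * (pred != y)) / np.sum(w)
                    if err < best_err:
                        best_err, best = err, (j, t, s)
        return best

    def stump_predict(stump, X):
        j, t, s = stump
        return s * np.where(X[:, j] < t, -1, 1)

    def adaboost(X, y, M):
        """Steps 1-2: returns the ensemble as a list of (alpha_m, G_m)."""
        w = np.full(len(y), 1.0 / len(y))              # step 1: w_i = 1/N
        ensemble = []
        for m in range(M):                             # step 2
            stump = fit_stump(X, y, w)                 # 2a: fit G_m
            miss = stump_predict(stump, X) != y
            err = np.sum(w * miss) / np.sum(w)         # 2b: err_m
            err = np.clip(err, 1e-10, 1 - 1e-10)       # keep log() finite
            alpha = np.log((1 - err) / err)            # 2c: alpha_m
            w = w * np.exp(alpha * miss)               # 2d: reweight
            ensemble.append((alpha, stump))
        return ensemble

    def predict(ensemble, X):
        """Step 3: G(x) = sign( sum_m alpha_m G_m(x) )."""
        return np.sign(sum(a * stump_predict(s, X) for a, s in ensemble))

On a toy separable data set, a few boosting rounds suffice::

    X = np.array([[0.0], [1.0], [2.0], [3.0]])
    y = np.array([-1, -1, 1, 1])
    ensemble = adaboost(X, y, M=3)
    predict(ensemble, X)   # matches y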