Bagging
*******

.. module:: ailib.fitting

.. contents::

Introduction
============

Bagging is mainly used to reduce prediction variance. Given several
classifiers, the bagging approach can be formulated as follows:

1. For each classifier:

   1. Draw a bootstrap sample set from the training samples.
      Bootstrapping means drawing :math:`N` samples with replacement
      from the :math:`N` original training samples. All classifiers
      therefore see the same number of samples, but individual samples
      may appear multiple times and different classifiers are presented
      with different sample sets.

   2. Train the classifier on the bootstrapped sample set.

2. Evaluate a new data point by evaluating all models and forming a
   consensus answer (majority vote for classification, average for
   regression problems).

Bagging is independent of the type of the trainable models (i.e. it can
handle both classification and regression). See the
:class:`implementation <Bagging>` for details.

Interfaces
==========

.. autoclass:: Bagging
    :members:
    :show-inheritance:

Examples
========
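The procedure from the introduction can be sketched as follows. This is a
minimal, self-contained illustration of the bagging idea, not the
``ailib.fitting`` API: the ``NearestCentroid`` stand-in classifier, the
constructor signature, and the sample data are all hypothetical.

```python
import random
from collections import Counter


def bootstrap(samples):
    """Draw N samples with replacement from the N training samples."""
    return [random.choice(samples) for _ in samples]


class NearestCentroid:
    """Hypothetical stand-in classifier: predicts the label of the
    nearest class mean for 1-D inputs."""

    def fit(self, samples):
        sums, counts = {}, {}
        for x, y in samples:
            sums[y] = sums.get(y, 0.0) + x
            counts[y] = counts.get(y, 0) + 1
        self.centroids = {y: sums[y] / counts[y] for y in sums}
        return self

    def predict(self, x):
        return min(self.centroids, key=lambda y: abs(x - self.centroids[y]))


class BaggingSketch:
    """Train each model on its own bootstrap set (step 1), then predict
    by majority vote over all models (step 2)."""

    def __init__(self, models, samples):
        self.models = [m.fit(bootstrap(samples)) for m in models]

    def predict(self, x):
        votes = Counter(m.predict(x) for m in self.models)
        return votes.most_common(1)[0][0]


random.seed(0)  # make the bootstrap draws reproducible
train = [(0.1, 'a'), (0.2, 'a'), (0.3, 'a'),
         (0.9, 'b'), (1.0, 'b'), (1.1, 'b')]
bag = BaggingSketch([NearestCentroid() for _ in range(11)], train)
print(bag.predict(0.15))  # 'a'
print(bag.predict(1.05))  # 'b'
```

Because each model sees a different bootstrap set, their individual errors
are partly decorrelated, and the majority vote is less variable than any
single model; for regression, averaging the model outputs plays the same
role as the vote here.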