Bagging is mainly used to reduce prediction variance. Given several classifiers, the bagging approach can be formulated as follows:
For each classifier:
- Draw a bootstrap sample set from the training samples. Bootstrapping means drawing samples with replacement from the original training set. As a result, every classifier sees the same number of samples, but individual samples may appear more than once, and different classifiers are presented with different sample sets.
- Train the classifier on the bootstrapped sample set.
Evaluate a new data point by evaluating all models and forming a consensus answer: a majority vote for classification, an average for regression.
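The procedure above can be sketched in plain Python. This is a minimal illustration, not the ailib implementation; the toy `ThresholdClassifier` and the function names are stand-ins chosen for the example:

```python
import random
from collections import Counter

class ThresholdClassifier:
    """Toy model: predicts 1 if x exceeds the mean of its training inputs."""
    def fit(self, data):
        xs = [x for x, _ in data]
        self.threshold = sum(xs) / len(xs)
    def predict(self, x):
        return 1 if x > self.threshold else 0

def bag_fit(samples, n_models):
    """Train n_models classifiers, each on its own bootstrap sample set."""
    models = []
    for _ in range(n_models):
        # Bootstrap: draw len(samples) points with replacement.
        boot = random.choices(samples, k=len(samples))
        m = ThresholdClassifier()
        m.fit(boot)
        models.append(m)
    return models

def bag_predict(models, x):
    """Consensus by majority vote (classification case)."""
    votes = Counter(m.predict(x) for m in models)
    return votes.most_common(1)[0][0]

random.seed(0)
data = [(x, int(x > 5)) for x in range(10)]
models = bag_fit(data, n_models=25)
print(bag_predict(models, 8))  # point well above the mean: majority votes 1
print(bag_predict(models, 1))  # point well below the mean: majority votes 0
```

For regression, `bag_predict` would instead return the mean of the individual model outputs.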
Bagging is independent of the type of the trainable models (i.e. it can handle both classification and regression). See the implementation for details.
Bases: ailib.fitting.model.Committee
The Bagging algorithm is implemented as an extension of a committee of models; specifically, the committee is extended by a simple fitting method.
Since the evaluation (and error computation) depends on the specific problem, use the Committee mixins like so:
>>> class foo(Bagging, Committee.Classification): pass
The models have to be assigned to the instance before training. For this, the method Committee.addModel() can be used.
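The mixin-based design can be illustrated with plain Python stand-ins. The class bodies below are hypothetical sketches, not the actual ailib code; only the names Bagging, Committee, and addModel come from the documentation above:

```python
from collections import Counter

class Committee:
    """Holds a list of member models; a mixin supplies the consensus rule."""
    def __init__(self):
        self.models = []
    def addModel(self, model):
        self.models.append(model)

class ClassificationMixin:
    """Consensus by majority vote over the member models."""
    def eval(self, x):
        votes = Counter(m(x) for m in self.models)
        return votes.most_common(1)[0][0]

class Bagging(Committee):
    def fit(self, samples):
        ...  # train each member on its own bootstrap sample, as described above

class Foo(Bagging, ClassificationMixin):
    """Combines the bagging fit with the classification consensus."""
    pass

c = Foo()
# Members must be assigned before training, mirroring Committee.addModel().
c.addModel(lambda x: 1)
c.addModel(lambda x: 0)
c.addModel(lambda x: 1)
print(c.eval(None))  # majority of the three stub models -> 1
```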