It often takes a long time to find well-performing machine learning algorithms for your dataset, because applied machine learning is by nature a process of trial and error.
Once we have a shortlist of accurate models, we can use algorithm tuning to get the most from each of them. Another way to increase accuracy on our dataset is to combine the predictions of multiple different models.
Combine Model Predictions Into Ensemble Predictions
The three most popular methods for combining the predictions from different models are listed below; a minimal code sketch for each follows the list.
· Bagging. Building multiple models (typically of the same type) from different subsamples of the training dataset.
· Boosting. Building multiple models (typically of the same type), each of which learns to fix the prediction errors of a prior model in the chain.
· Stacking. Building multiple models (typically of differing types) and a supervisor model that learns how to best combine the predictions of the primary models.
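A minimal bagging sketch, assuming scikit-learn and a synthetic dataset that stands in for your own data: many decision trees are each fit on a bootstrap subsample of the training data and their predictions are averaged.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import KFold, cross_val_score

# Synthetic data as a placeholder for your dataset (assumption for illustration).
X, y = make_classification(n_samples=500, n_features=10, random_state=7)

# Bagged decision trees: each tree sees a different bootstrap subsample.
model = BaggingClassifier(DecisionTreeClassifier(), n_estimators=100, random_state=7)
scores = cross_val_score(model, X, y, cv=KFold(n_splits=10, shuffle=True, random_state=7))
print("Bagging accuracy: %.3f" % scores.mean())
```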
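A boosting sketch under the same assumptions, here using AdaBoost as one common boosting algorithm: each successive weak learner concentrates on the examples the earlier learners got wrong.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import KFold, cross_val_score

# Synthetic data as a placeholder for your dataset (assumption for illustration).
X, y = make_classification(n_samples=500, n_features=10, random_state=7)

# AdaBoost: each new model is weighted toward the previously misclassified examples.
model = AdaBoostClassifier(n_estimators=50, random_state=7)
scores = cross_val_score(model, X, y, cv=KFold(n_splits=10, shuffle=True, random_state=7))
print("Boosting accuracy: %.3f" % scores.mean())
```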
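And a stacking sketch, again assuming scikit-learn: primary models of differing types feed their predictions to a logistic regression supervisor that learns how to combine them. The choice of base models here is arbitrary.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import KFold, cross_val_score

# Synthetic data as a placeholder for your dataset (assumption for illustration).
X, y = make_classification(n_samples=500, n_features=10, random_state=7)

# Differing primary models plus a supervisor (final_estimator) that combines them.
base_models = [
    ("cart", DecisionTreeClassifier()),
    ("svm", SVC()),
]
model = StackingClassifier(estimators=base_models, final_estimator=LogisticRegression())
scores = cross_val_score(model, X, y, cv=KFold(n_splits=10, shuffle=True, random_state=7))
print("Stacking accuracy: %.3f" % scores.mean())
```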