AddGBoost: A gradient boosting-style algorithm based on strong learners

https://doi.org/10.1016/j.mlwa.2021.100243
Open access under a Creative Commons license

Abstract

We present AddGBoost, a gradient boosting-style algorithm wherein the decision tree is replaced by a succession of (possibly) stronger learners, which are optimized via a state-of-the-art hyperparameter optimizer. Through experiments over 90 regression datasets, we show that AddGBoost emerges as the top performer on between 33% (with 2 stages) and 42% (with 5 stages) of the datasets, when compared with seven well-known machine-learning algorithms: KernelRidge, LassoLars, SGDRegressor, LinearSVR, DecisionTreeRegressor, HistGradientBoostingRegressor, and LGBMRegressor.
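The staged, residual-fitting idea the abstract describes can be sketched in a few lines of scikit-learn-style Python. The sketch below is an illustration only, not the authors' implementation: the class name AdditiveStagedRegressor, the two example stage learners, and the plain least-squares residual update are assumptions, and the per-stage hyperparameter optimization mentioned in the abstract is omitted.

```python
# A minimal sketch of a gradient boosting-style additive model whose stages
# are arbitrary (possibly strong) regressors rather than decision trees.
# Assumptions for illustration only; this is not the paper's implementation,
# and the hyperparameter optimization of each stage is left out.
import numpy as np
from sklearn.base import clone
from sklearn.kernel_ridge import KernelRidge
from sklearn.tree import DecisionTreeRegressor


class AdditiveStagedRegressor:
    """Fit each stage's learner to the residuals left by the previous stages."""

    def __init__(self, stage_learners):
        self.stage_learners = stage_learners  # one learner per stage

    def fit(self, X, y):
        self.stages_ = []
        residual = np.asarray(y, dtype=float)
        for learner in self.stage_learners:
            model = clone(learner).fit(X, residual)
            residual = residual - model.predict(X)  # what is left to explain
            self.stages_.append(model)
        return self

    def predict(self, X):
        # The ensemble prediction is the sum of all stage predictions.
        return np.sum([m.predict(X) for m in self.stages_], axis=0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
    # Two stages: a kernel method first, then a shallow tree on its residuals.
    model = AdditiveStagedRegressor(
        [KernelRidge(alpha=1.0), DecisionTreeRegressor(max_depth=3)]
    ).fit(X, y)
    print(model.predict(X[:5]))
```

With more stages, each new learner corrects the residual error of the sum of its predecessors, mirroring the 2-to-5-stage setups evaluated in the abstract.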

Keywords

Gradient boosting
Regression
