abstract = "Gradient boosting represents an effective approach for
constructing ensembles. We demonstrate how genetic
programming can take advantage of the method for a wide
range of classification tasks. The resulting Gradient
Boosted Programming approach comprises two phases. Phase
1 develops a diverse set of base learners (programs).
Phase 2 applies a gradient boosting approach specific
to the program representation. The resulting ensemble
is constructed additively, and a class probability
distribution is learnt for each program. An extensive
benchmarking study is conducted across 21
classification datasets that include requirements for
operation under class imbalance, tens of classes, and
feature identification. The proposed approach is
significantly better on the 11 low-cardinality
classification tasks and generally identifies simpler
models than other ensemble methods such as Random
Forests and XGBoost.",