Abstract
This article investigates an improvement to genetic programming based on boosting, a technique originating from the machine learning field. In the first part of the paper, we assess the gains that boosting offers on binary classification problems. We then turn to regression problems and propose an algorithm, called GPboost, that stays closer to AdaBoost's original notion of a distribution over training examples than previous implementations of boosting for genetic programming.
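The "distribution" idea the abstract refers to can be sketched as an AdaBoost.R2-style boosting loop for regression (after Drucker, 1997): each round trains a weak learner under a distribution D over training examples, then down-weights the examples that were fitted well. In the sketch below, `train_weak_learner` is a hypothetical stand-in for one GP run whose fitness function would be weighted by D; this is a generic illustration of the technique, not the authors' exact GPboost algorithm.

```python
import numpy as np

def weighted_median(values, weights):
    """Return the weighted median of `values`."""
    order = np.argsort(values)
    cum = np.cumsum(weights[order])
    return values[order][np.searchsorted(cum, 0.5 * weights.sum())]

def boost_regression(X, y, train_weak_learner, n_rounds=10):
    """AdaBoost.R2-style boosting loop for regression (after Drucker, 1997).

    `train_weak_learner(X, y, D)` is a hypothetical callback standing in
    for one GP run; the distribution D over examples would weight the GP
    fitness function, which is the idea carried over from AdaBoost.
    """
    n = len(y)
    D = np.full(n, 1.0 / n)            # uniform initial distribution
    learners, betas = [], []
    for _ in range(n_rounds):
        h = train_weak_learner(X, y, D)
        err = np.abs(h(X) - y)
        if err.max() == 0:             # perfect fit: keep it and stop
            learners.append(h)
            betas.append(1e-10)
            break
        L = err / err.max()            # per-example loss, scaled to [0, 1]
        eps = np.dot(D, L)             # distribution-weighted average loss
        if eps >= 0.5:                 # no longer a "weak" learner: stop
            break
        beta = eps / (1.0 - eps)
        learners.append(h)
        betas.append(beta)
        D = D * beta ** (1.0 - L)      # down-weight well-fitted examples
        D /= D.sum()                   # renormalise to a distribution

    w = np.log(1.0 / np.array(betas))  # confidence of each round
    def predict(Xq):
        preds = np.array([h(Xq) for h in learners])
        return np.array([weighted_median(preds[:, j], w)
                         for j in range(preds.shape[1])])
    return predict
```

A trivial weak learner (a constant predictor at the D-weighted mean) already exercises the loop: the boosted distribution shifts mass toward the examples the constant fits worst until the weak-learning condition `eps < 0.5` fails.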
Copyright information
© 2002 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Paris, G., Robilliard, D., Fonlupt, C. (2002). Applying Boosting Techniques to Genetic Programming. In: Collet, P., Fonlupt, C., Hao, JK., Lutton, E., Schoenauer, M. (eds) Artificial Evolution. EA 2001. Lecture Notes in Computer Science, vol 2310. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-46033-0_22
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-43544-0
Online ISBN: 978-3-540-46033-6