Abstract
In the standard process of evolving classification decision trees with genetic programming, evaluation is the most time-consuming part of the evolution loop. Here we introduce a lazy evaluation approach for classification decision trees that does not evaluate the whole population but only the individuals chosen to participate in tournament selection. Furthermore, we use dynamic weights for the classification instances: each instance's weight determines its chance of being picked for the evaluation process and is adjusted according to that instance's misclassification rate. We thoroughly describe the lazy evaluation approach and experiment with it on standard classification benchmark datasets, showing that it not only takes less time to evolve a good solution but can even produce a statistically better solution, because the changing instance weights help prevent overfitting.
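The two ideas from the abstract, lazy evaluation restricted to tournament contestants and misclassification-driven instance weights, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names (`lazy_tournament_select`, `update_weights`), the sample size, and the additive weight-update step are assumptions made for the example.

```python
import random

def lazy_tournament_select(population, instances, weights, fitness_cache,
                           evaluate, k=3):
    """Lazy evaluation: fitness is computed only for the k tournament
    contestants that lack a cached value, on a weighted instance sample."""
    contestants = random.sample(population, k)
    for ind in contestants:
        if ind not in fitness_cache:
            # Sample training instances in proportion to their current
            # weights, so frequently misclassified instances are picked
            # more often (sample size 50 is an arbitrary choice here).
            sample = random.choices(instances, weights=weights,
                                    k=min(len(instances), 50))
            fitness_cache[ind] = evaluate(ind, sample)
    return max(contestants, key=lambda ind: fitness_cache[ind])

def update_weights(weights, misclassified_idx, step=0.1):
    """Dynamic weighting: raise the weight of misclassified instances and
    decay the rest (the additive step and 0.01 floor are assumptions)."""
    for i in range(len(weights)):
        if i in misclassified_idx:
            weights[i] += step
        else:
            weights[i] = max(0.01, weights[i] - step)
    return weights
```

Because unselected individuals are never evaluated, the cost per generation scales with the tournament size rather than the population size, which is the source of the reported time savings.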
Copyright information
© 2019 Springer Nature Switzerland AG
About this paper
Cite this paper
Karakatič, S., Heričko, M., Podgorelec, V. (2019). Improving Genetic Programming for Classification with Lazy Evaluation and Dynamic Weighting. In: Sabourin, C., Merelo, J.J., Madani, K., Warwick, K. (eds) Computational Intelligence. IJCCI 2017. Studies in Computational Intelligence, vol 829. Springer, Cham. https://doi.org/10.1007/978-3-030-16469-0_4
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-16468-3
Online ISBN: 978-3-030-16469-0
eBook Packages: Intelligent Technologies and Robotics (R0)