Comparing Optimistic and Pessimistic Constraint Evaluation in Shape-constrained Symbolic Regression
Created by W. Langdon from gp-bibliography.bib Revision: 1.8129
@InProceedings{haider:2022:GECCO,
author = "Christian Haider and Fabricio {De Franca} and
Gabriel Kronberger and Bogdan Burlacu",
title = "Comparing Optimistic and Pessimistic Constraint
Evaluation in Shape-constrained Symbolic Regression",
booktitle = "Proceedings of the 2022 Genetic and Evolutionary
Computation Conference",
year = "2022",
editor = "Alma Rahat and Jonathan Fieldsend and
Markus Wagner and Sara Tari and Nelishia Pillay and Irene Moser and
Aldeida Aleti and Ales Zamuda and Ahmed Kheiri and
Erik Hemberg and Christopher Cleghorn and Chao-li Sun and
Georgios Yannakakis and Nicolas Bredeche and
Gabriela Ochoa and Bilel Derbel and Gisele L. Pappa and
Sebastian Risi and Laetitia Jourdan and
Hiroyuki Sato and Petr Posik and Ofer Shir and Renato Tinos and
John Woodward and Malcolm Heywood and Elizabeth Wanner and
Leonardo Trujillo and Domagoj Jakobovic and
Risto Miikkulainen and Bing Xue and Aneta Neumann and
Richard Allmendinger and Inmaculada Medina-Bulo and
Slim Bechikh and Andrew M. Sutton and
Pietro Simone Oliveto",
pages = "938--945",
address = "Boston, USA",
series = "GECCO '22",
month = "9-13 " # jul,
organisation = "SIGEVO",
publisher = "Association for Computing Machinery",
publisher_address = "New York, NY, USA",
keywords = "genetic algorithms, genetic programming, prior
knowledge, symbolic regression, shape constraints",
isbn13 = "978-1-4503-9237-2",
DOI = "doi:10.1145/3512290.3528714",
abstract = "Shape-constrained Symbolic Regression integrates prior
knowledge about the function shape into the symbolic
regression model. This can be used to enforce that the
model has desired properties such as monotonicity, or
convexity, among others. Shape-constrained Symbolic
Regression can also help to create models with better
extrapolation behavior and reduced sensitivity to
noise. The constraint evaluation can be challenging
because exact evaluation of constraints may require a
search for the extrema of non-convex functions.
Approximations via interval arithmetic allow to
efficiently find bounds for the extrema of functions.
However, interval arithmetic can lead to overly wide
bounds and therefore produces a pessimistic estimation.
Another possibility is to use sampling which
underestimates the true range. Sampling therefore
produces an optimistic estimation. In this paper we
evaluate both methods and compare them on different
problem instances. In particular we evaluate the
sensitivity to noise and the extrapolation capabilities
in combination with noise data. The results indicate
that the optimistic approach works better for
predicting out-of-domain points (extrapolation) and the
pessimistic approach works better for high noise
levels.",
notes = "GECCO-2022 A Recombination of the 31st International
Conference on Genetic Algorithms (ICGA) and the 27th
Annual Genetic Programming Conference (GP)",
}
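
The trade-off described in the abstract can be made concrete with a small sketch. The Python below is illustrative only, not the authors' implementation: the constraint expression g(x) = x*x - x + 0.5 and the domain [0, 1] are hypothetical stand-ins for, say, a model derivative that must stay non-negative under a monotonicity constraint. The true range of g on [0, 1] is [0.25, 0.5], so the constraint holds everywhere; but because x occurs twice, interval arithmetic suffers from the dependency problem and returns the overly wide bound [-0.5, 1.5] (a pessimistic false rejection), while sampling stays inside the true range and accepts (optimistic).

    # Minimal sketch (not the paper's code) of the two range estimators.
    import random

    class Interval:
        """Closed interval [lo, hi] with just the arithmetic used below."""
        def __init__(self, lo, hi):
            self.lo, self.hi = lo, hi

        def __add__(self, other):          # interval + interval, or + scalar
            if isinstance(other, Interval):
                return Interval(self.lo + other.lo, self.hi + other.hi)
            return Interval(self.lo + other, self.hi + other)

        def __sub__(self, other):          # interval - interval
            return Interval(self.lo - other.hi, self.hi - other.lo)

        def __mul__(self, other):          # interval * interval
            p = (self.lo * other.lo, self.lo * other.hi,
                 self.hi * other.lo, self.hi * other.hi)
            return Interval(min(p), max(p))

    def g(x):
        # Hypothetical constraint expression; works on floats and Intervals.
        # True range on [0, 1] is [0.25, 0.5], so g(x) >= 0 holds everywhere.
        return x * x - x + 0.5

    lo, hi = 0.0, 1.0

    # Pessimistic: one interval evaluation over the whole domain. The repeated
    # occurrence of x widens the bound to [-0.5, 1.5], so the constraint
    # appears violated even though it is not (a false rejection).
    box = g(Interval(lo, hi))
    print(f"interval bound: [{box.lo}, {box.hi}]  feasible: {box.lo >= 0}")

    # Optimistic: sampling never overestimates the range, so it cannot reject
    # a feasible model, but it may miss violations between sampled points.
    ys = [g(random.uniform(lo, hi)) for _ in range(1000)]
    print(f"sampled range: [{min(ys):.3f}, {max(ys):.3f}]  feasible: {min(ys) >= 0}")

Sampling's failure mode is the mirror image: if a constraint is violated only on a narrow sub-interval, every sample can miss it and the optimistic check accepts an infeasible model. This asymmetry is what the abstract's conclusion reflects: neither estimator dominates, with the optimistic one favoured for extrapolation and the pessimistic one for high noise levels.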