Pretraining Reduces Runtime in Denoising Autoencoder Genetic Programming by an Order of Magnitude
Created by W. Langdon from gp-bibliography.bib Revision: 1.7970
@InProceedings{reiter:2023:GECCOcomp,
  author =       "Johannes Reiter and Dirk Schweim and David Wittenberg",
  title =        "Pretraining Reduces Runtime in Denoising Autoencoder
                 Genetic Programming by an Order of Magnitude",
  booktitle =    "GECCO 2023 Student Workshop",
  year =         "2023",
  editor =       "Sara Silva and Luis Paquete and Leonardo Vanneschi and
                 Nuno Lourenco and Ales Zamuda and Ahmed Kheiri and
                 Arnaud Liefooghe and Bing Xue and Ying Bi and
                 Nelishia Pillay and Irene Moser and Arthur Guijt and
                 Jessica Catarino and Pablo Garcia-Sanchez and
                 Leonardo Trujillo and Carla Silva and Nadarajen Veerapen",
  pages =        "2382--2385",
  address =      "Lisbon, Portugal",
  series =       "GECCO '23",
  month =        "15-19 " # jul,
  organisation = "SIGEVO",
  publisher =    "Association for Computing Machinery",
  publisher_address = "New York, NY, USA",
  keywords =     "genetic algorithms, genetic programming, evolutionary
                 computation, pretraining, estimation of distribution
                 genetic programming",
  isbn13 =       "9798400701191",
  DOI =          "doi:10.1145/3583133.3596332",
  size =         "4 pages",
  abstract =     "Denoising autoencoder genetic programming (DAE-GP) is
                 an estimation of distribution genetic programming
                 (EDA-GP) algorithm. It uses denoising autoencoder long
                 short-term memory networks as probabilistic model to
                 replace the standard mutation and recombination
                 operators of genetic programming (GP). Recent work has
                 shown several advantages regarding solution length and
                 overall performance of DAE-GP when compared to GP.
                 However, training a neural network at each generation
                 is computationally expensive, where model training is
                 the most time consuming process of DAE-GP. In this
                 work, we propose pretraining to reduce the runtime of
                 the DAE-GP. In pretraining, the neural network is
                 trained preceding the evolutionary search. In
                 experiments on 8 real-world symbolic regression tasks
                 we find that DAE-GP with pretraining has a reduced
                 overall runtime of an order of magnitude while
                 generating individuals with similar or better
                 fitness.",
  notes =        "GECCO-2023 A Recombination of the 32nd International
                 Conference on Genetic Algorithms (ICGA) and the 28th
                 Annual Genetic Programming Conference (GP)",
}