BERT Mutation: Deep Transformer Model for Masked Uniform Mutation in Genetic Programming
@Article{shem-tov:2025:Mathematics,
  author   = "Eliad Shem-Tov and Moshe Sipper and Achiya Elyasaf",
  title    = "{BERT} Mutation: Deep Transformer Model for Masked
              Uniform Mutation in Genetic Programming",
  journal  = "Mathematics",
  year     = "2025",
  volume   = "13",
  number   = "5",
  pages    = "Article No. 779",
  keywords = "genetic algorithms, genetic programming",
  ISSN     = "2227-7390",
  URL      = "https://www.mdpi.com/2227-7390/13/5/779",
  DOI      = "10.3390/math13050779",
  abstract = "We introduce BERT mutation, a novel, domain-independent
              mutation operator for Genetic Programming (GP) that leverages
              advanced Natural Language Processing (NLP) techniques to
              improve convergence, particularly using the Masked Language
              Modeling approach. By combining the capabilities of deep
              reinforcement learning and the BERT transformer architecture,
              BERT mutation intelligently suggests node replacements within
              GP trees to enhance their fitness. Unlike traditional
              stochastic mutation methods, BERT mutation adapts dynamically
              by using historical fitness data to optimise mutation
              decisions, resulting in more effective evolutionary
              improvements. Through comprehensive evaluations across three
              benchmark domains, we demonstrate that BERT mutation
              significantly outperforms conventional and state-of-the-art
              mutation operators in terms of convergence speed and solution
              quality. This work represents a pivotal step toward
              integrating state-of-the-art deep learning into evolutionary
              algorithms, pushing the boundaries of adaptive optimisation
              in GP.",
  notes    = "also known as \cite{math13050779}",
}
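The abstract describes masked-language-model mutation: a node in a linearized GP tree is masked, and a BERT-style model proposes replacements. The sketch below illustrates only those mechanics; it is not the authors' method, which fine-tunes BERT with deep reinforcement learning on historical fitness data. Here an off-the-shelf fill-mask pipeline stands in for the trained model, and the primitive set, function names, and prefix linearization are all hypothetical choices for the example.

# Illustrative sketch of masked mutation on a prefix-linearized GP tree.
# Assumes the Hugging Face "transformers" package; the primitive set and
# helper names below are hypothetical, not from the paper.
from transformers import pipeline

PRIMITIVES = {"add", "sub", "mul", "x", "y"}  # hypothetical GP primitive set

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

def masked_mutation(tokens, position):
    """Mask one node of a prefix-linearized tree and rank replacements."""
    masked = tokens.copy()
    masked[position] = fill_mask.tokenizer.mask_token  # "[MASK]"
    suggestions = fill_mask(" ".join(masked))
    # Keep only suggestions that are valid primitives in our language.
    candidates = [s["token_str"].strip() for s in suggestions
                  if s["token_str"].strip() in PRIMITIVES]
    return candidates or [tokens[position]]  # fall back to the original node

# Example: propose replacements for the root operator of (add x y).
print(masked_mutation(["add", "x", "y"], position=0))

A real operator would additionally restrict candidates to primitives of matching arity so the mutated tree stays syntactically valid, and, per the abstract, would bias the model's suggestions using fitness feedback rather than generic pretrained token probabilities.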