LLM-Guided Evolution: An Autonomous Model Optimization for Object Detection
Created by W.Langdon from
gp-bibliography.bib Revision:1.8519
@InProceedings{yu:2025:GECCOcomp2,
  author =       "YiMing Yu and Jason Zutty",
  title =        "{LLM-Guided} Evolution: An Autonomous Model
                 Optimization for Object Detection",
  booktitle =    "Large Language Models for and with Evolutionary
                 Computation Workshop",
  year =         "2025",
  editor =       "Erik Hemberg and Roman Senkerik and Joel Lehman and
                 Una-May O'Reilly and Michal Pluhacek and
                 Niki {van Stein} and Pier Luca Lanzi and Tome Eftimov",
  pages =        "2363--2370",
  address =      "Malaga, Spain",
  series =       "GECCO '25 Companion",
  month =        "14-18 " # jul,
  organisation = "SIGEVO",
  publisher =    "Association for Computing Machinery",
  publisher_address = "New York, NY, USA",
  keywords =     "genetic algorithms, genetic programming, grammatical
                 evolution, computer aided/automated design, automated
                 machine learning, large language models,
                 neuroevolution, ANN",
  isbn13 =       "979-8-4007-1464-1",
  URL =          "https://doi.org/10.1145/3712255.3734340",
  DOI =          "doi:10.1145/3712255.3734340",
  size =         "8 pages",
  abstract =     "In machine learning, Neural Architecture Search (NAS)
                 requires domain knowledge of model design and a large
                 amount of trial-and-error to achieve promising
                 performance. Meanwhile, evolutionary algorithms have
                 traditionally relied on fixed rules and pre-defined
                 building blocks. The Large Language Model (LLM)-Guided
                 Evolution (GE) framework transformed this approach by
                 incorporating LLMs to directly modify model source code
                 for image classification algorithms on CIFAR data and
                 intelligently guide mutations and crossovers. A key
                 element of LLM-GE is the {"}Evolution of Thought{"}
                 (EoT) technique, which establishes feedback loops,
                 allowing LLMs to refine their decisions iteratively
                 based on how previous operations performed. In this
                 study, we perform NAS for object detection by improving
                 LLM-GE to modify the architecture of You Only Look Once
                 (YOLO) models to enhance performance on the KITTI
                 dataset. Our approach intelligently adjusts the design
                 and settings of YOLO to find the optimal algorithms
                 against objectives such as detection accuracy and
                 speed. We show that LLM-GE produced variants with
                 significant performance improvements, such as an
                 increase in Mean Average Precision from 92.5\% to
                 94.5\%. This result highlights the flexibility and
                 effectiveness of LLM-GE on real-world challenges,
                 offering a novel paradigm for automated machine
                 learning that combines LLM-driven reasoning with
                 evolutionary strategies.",
  notes =        "GECCO-2025 LLMfwEC workshop. A Recombination of the
                 34th International Conference on Genetic Algorithms
                 (ICGA) and the 30th Annual Genetic Programming
                 Conference (GP)",
}