%% A Logarithmic Distance-Based Multi-Objective Genetic Programming
%% Approach for Classification of Imbalanced Data
%% Created by W.Langdon from gp-bibliography.bib Revision:1.8010
@InProceedings{kumar:2022:AC,
  author =    "Arvind Kumar and Shivani Goel and Nishant Sinha and
               Arpit Bhardwaj",
  title =     "A Logarithmic Distance-Based Multi-Objective Genetic
               Programming Approach for Classification of Imbalanced
               Data",
  booktitle = "Advanced Computing",
  year =      "2021",
  editor =    "Deepak Garg and Sarangapani Jagannathan and
               Ankur Gupta and Lalit Garg and Suneet Gupta",
  volume =    "1528",
  series =    "Communications in Computer and Information Science",
  pages =     "294--304",
  address =   "Malta",
  month =     dec # " 18-19",
  publisher = "Springer",
  note =      "Revised Selected Papers",
  keywords =  "genetic algorithms, genetic programming, Imbalanced
               data classification, Fitness function, Multi-objective
               optimization, Pareto front",
  isbn13 =    "978-3-030-95502-1",
  URL =       "http://link.springer.com/chapter/10.1007/978-3-030-95502-1_23",
  DOI =       "10.1007/978-3-030-95502-1_23",
  abstract =  "Standard classification algorithms give biased results
               when data sets are imbalanced. Genetic Programming
               (GP), a machine learning algorithm inspired by the
               evolution of species in nature, suffers from the same
               issue. In this work we introduce a logarithmic
               distance-based multi-objective genetic programming
               (MOGP) approach for classifying imbalanced data. The
               proposed approach uses the logarithmic value of the
               distance between predicted and expected values,
               treating this value for the minority class and for the
               majority class as two separate objectives during
               learning. In the final generation, the proposed
               approach produces a Pareto front of classifiers that
               balances the majority- and minority-class accuracies
               for binary classification. The primary advantage of
               the MOGP technique is that it can produce a set of
               well-performing classifiers in a single experimental
               run, whereas the canonical GP method requires multiple
               runs and an a priori objective-based fitness function.
               Another benefit of MOGP is that it makes the learning
               bias explicit in the algorithm. To evaluate the
               proposed approach, we performed extensive experiments
               on five imbalanced problems. The results demonstrate
               its superiority over the traditional method, in which
               the minority and majority class accuracies are taken
               as two separate objectives.",
  notes =     "Also known as \cite{kumar2022log}",
}
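%% The abstract describes using the logarithm of the distance between
%% predicted and expected values, computed separately for the minority
%% and majority classes, as two objectives. A minimal Python sketch of
%% that idea follows; the function name, signature, and the exact
%% mean-of-log-distances formula are illustrative assumptions, not the
%% authors' published definition.

```python
import math

def log_distance_objectives(predicted, expected, minority_label=1):
    """Per-class logarithmic distance between predicted and expected
    values, returned as two separate objectives (minority, majority).

    NOTE: illustrative assumption only -- the paper's exact formula
    may differ.
    """
    # accumulate [sum of log-distances, sample count] per class
    per_class = {True: [0.0, 0], False: [0.0, 0]}
    for p, e in zip(predicted, expected):
        d = math.log(1.0 + abs(e - p))  # log-scaled distance, >= 0
        key = (e == minority_label)     # True -> minority sample
        per_class[key][0] += d
        per_class[key][1] += 1
    minority = per_class[True][0] / max(per_class[True][1], 1)
    majority = per_class[False][0] / max(per_class[False][1], 1)
    return minority, majority  # both objectives are to be minimised
```

%% In an MOGP run, selection (e.g. Pareto dominance) would operate on
%% this pair of objectives directly, yielding a front of classifiers
%% that trade off minority- against majority-class error, rather than
%% collapsing them into a single fitness value.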