Genetic Algorithms
2020 Mathematics Subject Classification: Primary: 90C99, 60H25, 68Q87 [MSN][ZBL]
$\newcommand{\argmin}[1]{\operatorname{argmin}_{#1}}$ Genetic algorithms (GAs), also known as evolutionary algorithms (EAs), assume a discrete search space $H$ and a function $f:H\to \R$. The general problem is to find $\argmin{x\in H}f$, where $x$ is a vector of the decision variables and $f$ is the objective function. With GAs it is customary to distinguish the genotype (the encoded representation of the variables) from the phenotype (the set of variables themselves). The vector $x$ is represented by a string (or chromosome) $s$ of length $l$ made up of symbols drawn from an alphabet $A$, and decoded by a mapping $c:A^l\to H$. In practice we may need to restrict attention to a search space $\Xi\subseteq A^l$, because some strings in $A^l$ may correspond, under $c$, to invalid solutions of the problem. The string length $l$ depends on the dimension of $H$ and on the cardinality of $A$; the positions of the string correspond to genes and the values at those positions to alleles. The mapping $c$ is often referred to as the genotype-phenotype mapping. With these conventions, the optimization problem becomes one of finding $\argmin{s\in \Xi}g$, where $g=f\circ c$, i.e. $g(s)=f(c(s))$. With GAs it is helpful if $c$ is a bijection, since a bijection has an inverse: there is a unique vector $x$ for every string and a unique string for each $x$.
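As a concrete illustration (not part of the formal development), the following Python sketch shows one possible genotype-phenotype mapping $c$ for the binary alphabet $A=\{0,1\}$, decoding a string of length $l=2k$ (with $k$ bits per variable) into a vector $x\in\R^2$, together with the induced function $g=f\circ c$. The bit length, the bounds, and the sphere objective are assumptions made only for this example.

```python
# Illustrative sketch: a binary genotype-phenotype mapping c and g = f o c.
# Alphabet A = {0, 1}; string length l = 2 * BITS; phenotype space H = [LO, HI]^2.
# BITS, LO, HI and the sphere objective are assumptions for this example only.

BITS = 10           # bits per decision variable
LO, HI = -5.0, 5.0  # bounds of each decision variable

def c(s):
    """Decode a string s (tuple of 0/1 of length 2*BITS) into a vector x in R^2."""
    assert len(s) == 2 * BITS
    x = []
    for i in range(2):
        chunk = s[i * BITS:(i + 1) * BITS]
        value = int("".join(map(str, chunk)), 2)           # integer in [0, 2^BITS - 1]
        x.append(LO + (HI - LO) * value / (2 ** BITS - 1))
    return tuple(x)

def f(x):
    """Objective function on the phenotype space H (here: the sphere function)."""
    return sum(v * v for v in x)

def g(s):
    """Induced objective on strings: g(s) = f(c(s))."""
    return f(c(s))
```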
The execution of a GA typically begins by randomly sampling with replacement from $A^l$. The resulting collection is the initial population, denoted by $P(0)$. In general, a population is a collection $P=(a_1,a_2,\ldots,a_\mu)$ of individuals, where $a_i\in A^l$, and populations are treated as $\mu$-tuples of individuals. The number of individuals $\mu$ is called the population size. Following initialization, execution proceeds iteratively. Each iteration consists of an application of one or more of the genetic operators (GOs): recombination, mutation and selection. The combined effect of the GOs applied in a particular generation $t\in\N$ is to transform the current population $P(t)$ into a new population $P(t+1)$. To formalize this, let $\mu,\mu'\in\Z^+$ be the parent and offspring population sizes, respectively, and let $T(X,Y)$ denote the set of all mappings from a set $X$ to a set $Y$. A mapping $T:H^\mu\to H^{\mu'}$ is called a population transformation (PT). If $T(P)=P'$ then $P$ is the parent population and $P'$ is the offspring population. If $\mu=\mu'$, this common value is simply the population size. The PT resulting from a GO often depends on the outcome of a random experiment; in Merkle and Lamont [5] this is referred to as a random population transformation (RPT, or random PT). To define an RPT, let $\mu\in\Z^+$ and let $\Omega$ be a set (the sample space). A random function $R:\Omega \to T\left(H^\mu,\bigcup\limits_{\mu'\in\Z^+} H^{\mu'}\right)$ is called an RPT. The distribution of PTs resulting from the application of a GO depends on the operator parameters; in other words, a GO maps its parameters to an RPT. Since $H$ is a nonempty set with $c:A^l\to H$ and $f:H\to\R$, a fitness scaling function $T_s:\R\to\R$ can be introduced, and the related fitness function is $\Phi \triangleq T_s\circ f\circ c$. In this definition it is understood that the objective function $f$ is determined by the application.
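Continuing the sketch above, the following Python fragment illustrates the initialization step (sampling $P(0)$ with replacement from $A^l$) and the fitness function $\Phi = T_s\circ f\circ c$. The population size and the particular scaling function $T_s$ are assumptions chosen only for illustration; the functions c, f and g are reused from the previous sketch.

```python
import random

MU = 20  # population size (assumed for this example)

def initial_population(mu=MU, l=2 * BITS, alphabet=(0, 1)):
    """P(0): sample mu strings of length l with replacement from A^l."""
    return [tuple(random.choice(alphabet) for _ in range(l)) for _ in range(mu)]

def T_s(y):
    """Fitness scaling function T_s: R -> R (an assumed sign flip, so that
    minimizing the objective f corresponds to maximizing fitness)."""
    return -y

def fitness(s):
    """Phi = T_s o f o c applied to an individual string s."""
    return T_s(f(c(s)))

P0 = initial_population()
print(max(fitness(a) for a in P0))  # best (scaled) fitness in the initial population
```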
Now that both the fitness function and the RPT have been defined, the GOs can be defined in general: let $\mu\in\Z^+$, let $\aleph$ be a set (the parameter space), and let $\Omega$ be a set (the sample space). A mapping $X:\aleph \to T\left( \Omega ,T\left[ H^\mu,\bigcup\limits_{\mu'\in \Z^+} H^{\mu'} \right] \right)$ is a GO; the set of all GOs is denoted by $GOP(H,\mu,\aleph,\Omega)$. There are three common GOs: recombination, mutation, and selection. These operators are roughly analogous to their similarly named counterparts in genetics, and their application in GAs is Darwinian in spirit, i.e., “survival of the fittest.” We start with recombination. Let $r\in GOP(H,\mu,\aleph,\Omega)$. If there exist $P\in H^\mu$, $\Theta\in\aleph$ and $\omega\in\Omega$ such that at least one individual in the offspring population $r_\Theta(\omega)(P)$ depends on more than one individual of $P$, then $r$ is called a recombination operator. Mutation is defined as follows: let $m\in GOP(H,\mu,\aleph,\Omega)$. If for every $P\in H^\mu$, every $\Theta\in\aleph$ and every $\omega\in\Omega$, each individual in the offspring population $m_\Theta(\omega)(P)$ depends on at most one individual of $P$, then $m$ is called a mutation operator. Finally, for selection let $s\in GOP(H,\mu,\aleph\times T(H,\R),\Omega)$. If for every $P\in H^\mu$, every $\Theta\in\aleph$, every fitness function $\Phi\in T(H,\R)$ and every $\omega\in\Omega$, $s$ satisfies $a\in s_{(\Theta,\Phi)}(\omega)(P) \Rightarrow a\in P$, then $s$ is called a selection operator.
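To make the three operators concrete, the sketch below (building on the previous fragments) uses one-point crossover, bit-flip mutation and tournament selection as representative, but by no means unique, instances of recombination, mutation and selection. The crossover probability, mutation rate and tournament size play the role of the parameters $\Theta$, and Python's random module supplies the sample point $\omega$.

```python
def recombine(P, p_cross=0.9):
    """Recombination: an offspring may depend on more than one parent
    (one-point crossover between randomly chosen pairs of parents)."""
    offspring = []
    for _ in range(len(P)):
        a, b = random.sample(P, 2)
        if random.random() < p_cross:
            cut = random.randrange(1, len(a))
            offspring.append(a[:cut] + b[cut:])
        else:
            offspring.append(a)
    return offspring

def mutate(P, p_mut=0.01):
    """Mutation: each offspring depends on at most one parent (bit flips)."""
    return [tuple(1 - v if random.random() < p_mut else v for v in a) for a in P]

def select(P, phi, k=2):
    """Selection: every selected individual already belongs to P
    (tournament selection of size k under the fitness function phi)."""
    return [max(random.sample(P, k), key=phi) for _ in range(len(P))]

def generation(P):
    """One iteration P(t) -> P(t+1): recombination, mutation, then selection."""
    return select(mutate(recombine(P)), fitness)

P = initial_population()
for t in range(50):
    P = generation(P)
print(min(g(a) for a in P))  # best objective value in the final population
```

Note that select never creates new strings, so the selected individuals always satisfy the membership condition above; recombine and mutate, by contrast, may produce strings not present in the parent population.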
References
[1] Bäck, T., Evolutionary Algorithms in Theory and Practice, Oxford University Press, 1996
[2] Coello Coello, C.A., Van Veldhuizen, D.A. and Lamont, G.B., Evolutionary Algorithms for Solving Multi-Objective Problems, Kluwer Academic Publishers, 2002
[3] Goldberg, D.E., Genetic Algorithms in Search, Optimization, and Machine Learning, Addison-Wesley Professional, 1989
[4] Holland, J.H., Adaptation in Natural and Artificial Systems, A Bradford Book, 1992
[5] Merkle, L.D. and Lamont, G.B., "A Random Based Framework for Evolutionary Algorithms", in: T. Bäck (ed.), Proceedings of the Seventh International Conference on Genetic Algorithms, 1997
[6] Surry, P.D. and Radcliffe, N.J., "Formal Algorithms + Formal Representations = Search Strategies", in: Parallel Problem Solving from Nature IV, Springer-Verlag, LNCS 1141, pp. 366-375, 1996