Genetic Algorithms

Genetic algorithms (GAs) (also known as evolutionary algorithms [EAs]) assume a discrete search space $H$ and a function $f:H\to \Re$, where $H$ is a subset of a Euclidean space. The general problem is to find $\arg\underset{x\in H}{\min}\,f$, where $x$ is a vector of the decision variables and $f$ is the objective function.
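
For concreteness, a minimal Python sketch of this general problem; the particular search space $H$, the objective $f$, and the exhaustive search are illustrative assumptions, not part of the definition:

    # Minimal sketch of the general problem, assuming a small finite search
    # space H (here a set of integer points) and an objective f to minimize.
    H = [-3, -1, 0, 2, 5]          # a discrete search space (illustrative)

    def f(x):
        return (x - 1) ** 2        # an arbitrary objective function

    x_best = min(H, key=f)         # arg min over H by exhaustive search
    print(x_best)                  # -> 0, the minimizer of f over H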

With GAs it is customary to distinguish the genotype, the encoded representation of the variables, from the phenotype, the set of variables themselves. The vector $x$ is represented by a string (or chromosome) $s$ of length $l$, made up of symbols drawn from an alphabet $A$, using the mapping $c:{{A}^{l}}\to H$. In practice a search space $\Xi \subseteq {{A}^{l}}$ may be needed to reflect the fact that some strings in ${{A}^{l}}$ may represent invalid solutions to the problem under $c$. The string length $l$ depends on the dimension of $H$ and the size of $A$, with the elements of the string corresponding to genes and their values to alleles. The mapping $c$ from genes and alleles to decision variables is often referred to as the genotype-phenotype mapping.

Given the statements above, the optimization becomes one of finding $\arg\underset{s\in \Xi}{\min}\,g$, where $g(s)=f(c(s))$. With GAs it is helpful if $c$ is a bijection. The important property of bijections, as they apply to GAs, is that bijections have an inverse; i.e., there is a unique vector $x$ for every string and a unique string for each $x$.
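
As an illustration, a sketch of a bijective genotype-phenotype mapping $c$ and the composed objective $g=f\circ c$; the binary encoding (8 bits decoded to an integer) and the objective are assumptions chosen only for the example:

    # Illustrative genotype-phenotype mapping c : A^l -> H for A = {0, 1},
    # chosen so that c is a bijection between 8-bit strings and {0, ..., 255}.
    def c(s):                       # decode a bit string into a decision variable
        return int(s, 2)

    def c_inv(x, l=8):              # the inverse mapping (exists since c is a bijection)
        return format(x, "0{}b".format(l))

    def f(x):                       # objective function on the phenotype
        return (x - 100) ** 2

    def g(s):                       # g = f o c, the objective on the genotype
        return f(c(s))

    s = "01100100"                  # a chromosome of length l = 8
    print(c(s), g(s))               # -> 100 0
    print(c_inv(c(s)) == s)         # -> True: a unique string for each x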

The execution of a GA typically begins by randomly sampling with replacement from ${{A}^{l}}$. The resulting collection is the initial population, denoted by $P(0)$. In general, a population is a collection $P=({{a}_{1}},{{a}_{2}},...,{{a}_{\mu }})$ of individuals, where ${{a}_{i}}\in {{A}^{l}}$, and populations are treated as $\mu$-tuples of individuals. The number of individuals ($\mu$) is defined as the population size.
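
A minimal sketch of this initialization step; the alphabet, string length, and population size are illustrative choices:

    import random

    # Initialization: sample mu individuals with replacement from A^l.
    A = "01"                        # binary alphabet
    l = 8                           # string (chromosome) length
    mu = 6                          # population size

    def random_individual():
        return "".join(random.choice(A) for _ in range(l))

    P0 = tuple(random_individual() for _ in range(mu))   # initial population P(0)
    print(P0)                       # e.g. ('10110001', '00101110', ...)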

Following initialization, execution proceeds iteratively. Each iteration consists of an application of one or more of the genetic operators (GOs): recombination, mutation, and selection. The combined effect of the GOs applied in a particular generation $t\in \mathbb{N}$ is to transform the current population $P(t)$ into a new population $P(t+1)$.
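
The iteration can be pictured with the schematic loop below. It is only a sketch of the control flow: the truncation-style selection and bit-flip mutation are stand-in operators, recombination is omitted for brevity, and the encoding, rates, and loop bounds are assumptions:

    import random

    # Schematic GA iteration: transform P(t) into P(t+1) by applying GOs.
    l, mu, generations = 8, 6, 20

    def g(s):                                   # objective on the genotype (assumed)
        return (int(s, 2) - 100) ** 2

    def mutate(s, pm=0.1):                      # bit-flip mutation (stand-in GO)
        return "".join("10"[int(b)] if random.random() < pm else b for b in s)

    def select(P):                              # keep the better half (stand-in GO)
        return sorted(P, key=g)[: mu // 2]

    P = ["".join(random.choice("01") for _ in range(l)) for _ in range(mu)]  # P(0)
    for t in range(generations):
        parents = select(P)
        P = [mutate(random.choice(parents)) for _ in range(mu)]             # P(t+1)
    print(min(P, key=g))                        # best individual found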

For a population transformation (PT), let $\mu ,{\mu}'\in {{\mathbb{Z}}^{+}}$ be the parent and offspring population sizes, respectively. A mapping $T:{{H}^{\mu }}\to {{H}^{{{\mu }'}}}$ is called a PT. If $T(P)={P}'$, then $P$ is the parent population and ${P}'$ is the offspring population. If $\mu ={\mu }'$, this common value is simply called the population size.
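
To make the definition concrete, a toy deterministic PT with $\mu =4$ and ${\mu }'=2$; the particular mapping is an arbitrary assumption:

    # A concrete (deterministic) population transformation T : H^mu -> H^mu'.
    def T(P):
        return (P[0], P[-1])               # offspring keeps the first and last parent

    P = ("0001", "0110", "1010", "1111")   # parent population, mu = 4
    print(T(P))                            # -> ('0001', '1111'), mu' = 2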

The PT resulting from a GO often depends on the outcome of a random experiment. In [LM] this result is referred to as a random population transformation (RPT or random PT). To define an RPT, let $\mu \in {{\mathbb{Z}}^{+}}$ and let $\Omega$ be a set (the sample space). A random function $R:\Omega \to T\left({{H}^{\mu }},\bigcup\nolimits_{{\mu }'\in {{\mathbb{Z}}^{+}}}{{{H}^{{{\mu }'}}}}\right)$ is called an RPT. The distribution of PTs resulting from the application of a GO depends on the operator parameters; in other words, a GO maps its parameters to an RPT.

Since $H$ is a nonempty set with $c:{{A}^{l}}\to H$ and $f:H\to \Re$, a fitness scaling function can be defined as ${{T}_{s}}:\Re \to \Re$ and a related fitness function as $\Phi \triangleq {{T}_{s}}\circ f\circ c$. In this definition it is understood that the objective function $f$ is determined by the application.
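
A sketch of this composition; the decoding $c$, the objective $f$, and the particular (reciprocal) scaling ${{T}_{s}}$ are illustrative assumptions:

    # Fitness function Phi = T_s o f o c built from a decoding, an objective,
    # and a scaling that turns minimization into maximization of a positive fitness.
    def c(s):                       # genotype -> phenotype
        return int(s, 2)

    def f(x):                       # objective determined by the application
        return (x - 100) ** 2

    def T_s(y):                     # fitness scaling (illustrative choice)
        return 1.0 / (1.0 + y)

    def Phi(s):                     # Phi = T_s o f o c
        return T_s(f(c(s)))

    print(Phi("01100100"))          # -> 1.0, the optimum under this scaling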

Now that both the fitness function and the RPT have been defined, the GOs can be defined in general: let $\mu \in {{\mathbb{Z}}^{+}}$, let $\aleph$ be a set (the parameter space), and let $\Omega$ be a set (the sample space). A mapping $X:\aleph \to T\left( \Omega ,T\left( {{H}^{\mu }},\bigcup\nolimits_{{\mu }'\in {{\mathbb{Z}}^{+}}}{{{H}^{{{\mu }'}}}} \right) \right)$ is a GO. The set of GOs is denoted $GOP\left( H,\mu ,\aleph ,\Omega \right)$.
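
The layered structure of this definition (parameters $\to$ random outcome $\to$ PT) can be mirrored with nested functions. The sketch below uses an illustrative bit-flip mutation as the underlying operator; the parameter (a mutation rate) and the use of a seed as the outcome $\omega$ are assumptions:

    import random

    # A GO as a mapping from its parameters to an RPT: given a parameter pm,
    # it returns a function of the random outcome omega, which is itself a PT.
    def mutation_GO(pm):                        # parameters -> RPT
        def rpt(omega):                         # random outcome -> PT
            rng = random.Random(omega)
            def pt(P):                          # PT : H^mu -> H^mu
                flip = lambda b: "10"[int(b)]
                return tuple("".join(flip(b) if rng.random() < pm else b for b in s)
                             for s in P)
            return pt
        return rpt

    P = ("0000", "1111")
    print(mutation_GO(0.5)(omega=42)(P))        # e.g. ('0100', '1011')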

There are three common GOs: recombination, mutation, and selection. These three operators are roughly analogous to their similarly named counterparts in genetics. Their application in GAs is strictly Darwinian in nature, i.e., “survival of the fittest.”

To define the GOs, we start with recombination. Let $r\in GOP\left( H,\mu ,\aleph ,\Omega \right)$. If there exist $P\in {{H}^{\mu }}$, $\Theta \in \aleph$, and $\omega \in \Omega$ such that at least one individual in the offspring population ${{r}_{\Theta }}\left( P \right)$ depends on more than one individual of $P$, then $r$ is referred to as a recombination operator.
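
A standard example satisfying this definition is one-point crossover, sketched below; the pairing of parents and the choice of crossover point are illustrative assumptions:

    import random

    # One-point crossover: each offspring depends on more than one parent.
    def crossover(p1, p2, rng=random):
        k = rng.randrange(1, len(p1))           # crossover point, 1 <= k < l
        return p1[:k] + p2[k:], p2[:k] + p1[k:]

    o1, o2 = crossover("00000000", "11111111")
    print(o1, o2)                               # e.g. 00011111 11100000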

A mutation operator is defined in the following manner: let $m\in GOP\left( H,\mu ,\aleph ,\Omega \right)$. If for every $P\in {{H}^{\mu }}$, every $\Theta \in \aleph$, and every $\omega \in \Omega$, each individual in the offspring population ${{m}_{\Theta }}\left( P \right)$ depends on at most one individual of $P$, then $m$ is called a mutation operator.
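
A standard example is independent bit-flip mutation, sketched below; the mutation rate is an assumption:

    import random

    # Bit-flip mutation: each offspring depends on at most one parent.
    def mutate(s, pm=0.05, rng=random):
        return "".join("10"[int(b)] if rng.random() < pm else b for b in s)

    P = ("00000000", "11111111")                # parent population
    offspring = tuple(mutate(s) for s in P)     # each child derives from one parent
    print(offspring)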

Finally, for a selection operator let $s\in GOP\left( H,\mu ,\aleph \times T\left( H,\Re \right),\Omega \right)$. If $P\in {{H}^{\mu }}$, $\Theta \in \aleph$, $f:H\to \Re$ in all cases, and $s$ satisfies $a\in {{s}_{\left( \Theta ,\Phi \right)}}(P)\Rightarrow a\in P$, then $s$ is a selection operator.
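
A common example satisfying this condition is binary tournament selection, sketched below: every selected individual is drawn from the parent population $P$. The fitness $\Phi$ and the tournament size are illustrative assumptions:

    import random

    # Binary tournament selection on a fitness Phi; all winners are members of P.
    def Phi(s):                                  # fitness of a genotype (illustrative)
        return 1.0 / (1.0 + (int(s, 2) - 100) ** 2)

    def tournament_select(P, mu_prime, rng=random):
        winners = []
        for _ in range(mu_prime):
            a, b = rng.choice(P), rng.choice(P)  # draw two candidates from P
            winners.append(a if Phi(a) >= Phi(b) else b)
        return tuple(winners)                    # offspring population, taken from P

    P = ("01100100", "00000000", "11111111", "01010101")
    print(tournament_select(P, 4))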

References

[LM] Lamont, L. and Merkle, J.D., "A Random Based Framework for Evolutionary Algorithms", in: T. Bäck (ed.), Proceedings of the Seventh International Conference on Genetic Algorithms (1997).