Differential evolution

{{Short description|Method of mathematical optimization}}

{{Evolutionary algorithms}}

[[File:Ackley.gif|thumb|Differential evolution optimizing the 2D Ackley function.]]

Differential evolution (DE) is an evolutionary algorithm that optimizes a problem by iteratively trying to improve a candidate solution with regard to a given measure of quality. Such methods are commonly known as metaheuristics, as they make few or no assumptions about the problem being optimized and can search very large spaces of candidate solutions. However, metaheuristics such as DE do not guarantee that an optimal solution is ever found.

DE is used for multidimensional real-valued functions but does not use the gradient of the problem being optimized, which means DE does not require the optimization problem to be differentiable, as is required by classic optimization methods such as gradient descent and quasi-Newton methods. DE can therefore also be used on optimization problems that are not even continuous, are noisy, change over time, etc.

DE optimizes a problem by maintaining a population of candidate solutions and creating new candidate solutions by combining existing ones according to its simple formulae, and then keeping whichever candidate solution has the best score or fitness on the optimization problem at hand. In this way, the optimization problem is treated as a black box that merely provides a measure of quality given a candidate solution and the gradient is therefore not needed.

History

Storn and Price introduced Differential Evolution in 1995.{{cite journal |author1=Storn, Rainer |author2=Price, Kenneth |title=Differential evolution—a simple and efficient scheme for global optimization over continuous spaces |date=1995 |journal=International Computer Science Institute|volume=TR | issue=95 |page=TR-95-012 |url=https://cse.engineering.nyu.edu/~mleung/CS909/s04/Storn95-012.pdf |access-date=3 April 2024 |location=Berkeley}} Books have been published on theoretical and practical aspects of using DE in parallel computing, multiobjective optimization, constrained optimization, and the books also contain surveys of application areas. Surveys on the multi-faceted research aspects of DE can be found in journal articles.{{cite journal |author1=S. Das |author2=P. N. Suganthan |url=https://www.researchgate.net/publication/220380793 |title=Differential Evolution: A Survey of the State-of-the-art |journal= IEEE Transactions on Evolutionary Computation|volume=15 |issue=1 |pages=4–31 |date=Feb 2011 |doi=10.1109/TEVC.2010.2059031}}{{cite journal |author1=S. Das |author2=S. S. Mullick |author3=P. N. Suganthan |url=http://web.mysites.ntu.edu.sg/epnsugan/PublicSite/Shared%20Documents/PDFs/DE-Survey-2016.pdf |title=Recent Advances in Differential Evolution - An Updated Survey |journal=Swarm and Evolutionary Computation |doi=10.1016/j.swevo.2016.01.004 |date=2016|volume=27 |pages=1–30 }}

Algorithm {{Anchor|algo}}

A basic variant of the DE algorithm works by having a population of candidate solutions (called agents). These agents are moved around in the search-space by using simple mathematical formulae to combine the positions of existing agents from the population. If the new position of an agent is an improvement then it is accepted and forms part of the population, otherwise the new position is simply discarded. The process is repeated and by doing so it is hoped, but not guaranteed, that a satisfactory solution will eventually be discovered.

Formally, let f: \mathbb{R}^n \to \mathbb{R} be the fitness function which must be minimized (note that maximization can be performed by considering the function h := -f instead). The function takes a candidate solution as argument in the form of a vector of real numbers. It produces a real number as output which indicates the fitness of the given candidate solution. The gradient of f is not known. The goal is to find a solution \mathbf{m} for which f(\mathbf{m}) \leq f(\mathbf{p}) for all \mathbf{p} in the search-space, which means that \mathbf{m} is the global minimum.

Let \mathbf{x} \in \mathbb{R}^n designate a candidate solution (agent) in the population. The basic DE algorithm can then be described as follows:

  • Choose the parameters \text{NP} \geq 4, \text{CR} \in [0,1], and F \in [0,2]:
      • \text{NP} is the population size, i.e. the number of candidate agents or "parents".
      • \text{CR} \in [0,1] is called the crossover probability.
      • F \in [0,2] is called the differential weight.
      • Typical settings are \text{NP} = 10n, \text{CR} = 0.9 and F = 0.8.
      • Optimization performance may be greatly impacted by these choices; see below.
  • Initialize all agents \mathbf{x} with random positions in the search-space.
  • Until a termination criterion is met (e.g. number of iterations performed, or adequate fitness reached), repeat the following:
      • For each agent \mathbf{x} in the population do:
          • Pick three agents \mathbf{a}, \mathbf{b}, and \mathbf{c} from the population at random; they must be distinct from each other as well as from agent \mathbf{x}. (\mathbf{a} is called the "base" vector.)
          • Pick a random index R \in \{1, \ldots, n\}, where n is the dimensionality of the problem being optimized.
          • Compute the agent's potentially new position \mathbf{y} = [y_1, \ldots, y_n] as follows:
              • For each i \in \{1,\ldots,n\}, pick a uniformly distributed random number r_i \sim U(0,1).
              • If r_i < \text{CR} or i = R, set y_i = a_i + F \times (b_i - c_i); otherwise set y_i = x_i. (Index position R is replaced for certain.)
          • If f(\mathbf{y}) \leq f(\mathbf{x}), replace agent \mathbf{x} in the population with the improved or equal candidate solution \mathbf{y}.
  • Pick the agent from the population that has the best fitness and return it as the best found candidate solution.
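The loop above translates almost directly into code. The following is a minimal Python sketch of this basic scheme (commonly labelled DE/rand/1/bin), intended purely as an illustration: the function signature, the box-bounds initialization, and the fixed iteration budget are choices made for this example rather than part of any canonical implementation.

<syntaxhighlight lang="python">
import numpy as np

def differential_evolution(f, bounds, NP=None, CR=0.9, F=0.8, max_iter=1000, seed=None):
    """Minimize f over a box via the basic DE scheme described above.

    f      -- objective mapping a length-n array to a float
    bounds -- sequence of (lower, upper) pairs, one per dimension
    """
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)
    n = len(bounds)
    NP = 10 * n if NP is None else NP  # typical setting NP = 10n

    # Initialize all agents with random positions in the search space.
    pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(NP, n))
    fitness = np.array([f(x) for x in pop])

    for _ in range(max_iter):
        for k in range(NP):
            # Pick three agents a, b, c, distinct from each other and from agent k.
            idx = rng.choice([i for i in range(NP) if i != k], size=3, replace=False)
            a, b, c = pop[idx]

            # Binomial crossover: index R is replaced for certain.
            R = rng.integers(n)
            cross = rng.random(n) < CR
            cross[R] = True
            y = np.where(cross, a + F * (b - c), pop[k])

            # Greedy selection: keep the trial vector if it is no worse.
            fy = f(y)
            if fy <= fitness[k]:
                pop[k], fitness[k] = y, fy

    best = np.argmin(fitness)
    return pop[best], fitness[best]
</syntaxhighlight>

For example, minimizing the two-dimensional sphere function f(\mathbf{x}) = \sum_i x_i^2 over [-5, 5]^2 would be invoked as differential_evolution(lambda x: np.sum(x**2), [(-5.0, 5.0)] * 2).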

Parameter selection

[[Image:DE Meta-Fitness Landscape (Sphere and Rosenbrock).JPG|thumb|Meta-fitness landscape of DE parameter performance on the Sphere and Rosenbrock benchmark problems.]]

The choice of DE parameters \text{NP}, \text{CR} and F can have a large impact on optimization performance. Selecting the DE parameters that yield good performance has therefore been the subject of much research. Rules of thumb for parameter selection were devised by Storn et al. and Liu and Lampinen. Mathematical convergence analysis regarding parameter selection was done by Zaharie.

Constraint handling

Differential evolution can be utilized for constrained optimization as well. A common method involves modifying the target function to include a penalty for any violation of constraints, expressed as \tilde{f}(x) = f(x) + \rho \times \mathrm{CV}(x). Here, \mathrm{CV}(x) represents either a constraint violation (an L1 penalty) or the square of a constraint violation (an L2 penalty).

This method, however, has certain drawbacks. One significant challenge is the appropriate selection of the penalty coefficient \rho. If \rho is set too low, it may not effectively enforce constraints; conversely, if it is set too high, it can greatly slow down or even halt the convergence process. Despite these challenges, this approach remains widely used due to its simplicity and because it does not require altering the differential evolution algorithm itself.

There are alternative strategies, such as projecting onto a feasible set or reducing dimensionality, which can be used for box-constrained or linearly constrained cases. However, in the context of general nonlinear constraints, the most reliable methods typically involve penalty functions.
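As an illustration of the penalty approach, the objective can be wrapped without altering the DE algorithm itself. This sketch assumes inequality constraints written in the common g(x) \leq 0 convention and applies an L1 penalty; the helper name and the default value of \rho are arbitrary choices for this example.

<syntaxhighlight lang="python">
def penalized(f, constraints, rho=1000.0):
    # Build f~(x) = f(x) + rho * CV(x), where CV(x) sums the
    # violations of inequality constraints of the form g(x) <= 0.
    def f_tilde(x):
        cv = sum(max(0.0, g(x)) for g in constraints)
        return f(x) + rho * cv
    return f_tilde
</syntaxhighlight>

The wrapped function can then be handed to an unconstrained DE routine such as the sketch above, with the caveat already noted: the result can be sensitive to the choice of \rho.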

Variants

Variants of the DE algorithm are continually being developed in an effort to improve optimization performance.{{cite book |author1=Swagatam Das |author2=Sankha Subhra Mullick |author3=P.N. Suganthan |title=Recent advances in differential evolution |date=2016}} The following directions of development can be outlined:

  • New schemes for performing crossover and mutation of agents
  • Various strategies for handling constraints
  • Adaptive strategies that dynamically adjust population size, F and CR parameters
  • Specialized algorithms for large-scale optimization
  • Multi-objective and many-objective algorithms
  • Techniques for handling binary/integer variables

References

{{Reflist|refs=
{{cite journal |last=Storn |first=R. |author2=Price, K. |title=Differential evolution - a simple and efficient heuristic for global optimization over continuous spaces |journal=Journal of Global Optimization |year=1997 |volume=11 |issue=4 |pages=341–359 |doi=10.1023/A:1008202821328 |bibcode=1997JGOpt..11..341S |s2cid=5297867}}
{{cite conference |last=Storn |first=R. |s2cid=16576915 |title=On the usage of differential evolution for function optimization |book-title=Biennial Conference of the North American Fuzzy Information Processing Society (NAFIPS) |year=1996 |pages=519–523 |doi=10.1109/NAFIPS.1996.534789}}
{{cite book |title=Differential Evolution: A Practical Approach to Global Optimization |url=https://www.springer.com/computer/theoretical+computer+science/foundations+of+computations/book/978-3-540-20950-8 |last1=Price |first1=K. |last2=Storn |first2=R.M. |last3=Lampinen |first3=J.A. |year=2005 |publisher=Springer |isbn=978-3-540-20950-8}}
{{cite book |title=Differential Evolution: In Search of Solutions |url=https://www.springer.com/mathematics/book/978-0-387-36895-5 |last=Feoktistov |first=V. |year=2006 |publisher=Springer |isbn=978-0-387-36895-5}}
{{cite book |last1=Onwubolu |first1=G. C. |last2=Babu |first2=B. V. |year=2004 |title=New Optimization Techniques in Engineering |series=Studies in Fuzziness and Soft Computing |volume=141 |isbn=978-3-642-05767-0 |doi=10.1007/978-3-540-39930-8}}
{{citation |title=Advances in Differential Evolution |url=https://www.springer.com/engineering/book/978-3-540-68827-3 |editor-last=Chakraborty |editor-first=U.K. |year=2008 |publisher=Springer |isbn=978-3-540-68827-3}}
{{cite conference |title=On setting the control parameter of the differential evolution method |book-title=Proceedings of the 8th International Conference on Soft Computing (MENDEL) |last1=Liu |first1=J. |last2=Lampinen |first2=J. |year=2002 |pages=11–18 |location=Brno, Czech Republic}}
{{cite conference |title=Critical values for the control parameters of differential evolution algorithms |book-title=Proceedings of the 8th International Conference on Soft Computing (MENDEL) |last=Zaharie |first=D. |year=2002 |pages=62–67 |location=Brno, Czech Republic}}
{{Cite journal |last1=Rocca |first1=P. |last2=Oliveri |first2=G. |last3=Massa |first3=A. |title=Differential Evolution as Applied to Electromagnetics |journal=IEEE Antennas and Propagation Magazine |year=2011 |volume=53 |issue=1 |pages=38–49 |doi=10.1109/MAP.2011.5773566 |bibcode=2011IAPM...53...38R |s2cid=27555808}}
}}

{{Evolutionary computation}}

{{Major subfields of optimization}}

{{DEFAULTSORT:Differential Evolution}}

Category:Evolutionary algorithms