Test functions for optimization

{{Short description|Functions used to evaluate optimization algorithms}}

In applied mathematics, test functions, also known as artificial landscapes, are used to evaluate characteristics of optimization algorithms, such as convergence rate, precision, robustness and general performance.

Some test functions are presented here to give an idea of the different situations that optimization algorithms face when coping with these kinds of problems. The first part presents objective functions for single-objective optimization; the second part gives test functions, with their respective Pareto fronts, for multi-objective optimization problems (MOPs).

The artificial landscapes presented herein for single-objective optimization problems are taken from Bäck,<ref>{{cite book|last=Bäck|first=Thomas|title=Evolutionary algorithms in theory and practice : evolution strategies, evolutionary programming, genetic algorithms|year=1995|publisher=Oxford University Press|location=Oxford|isbn=978-0-19-509971-3|page=328}}</ref> Haupt et al.<ref>{{cite book|last1=Haupt|first1=Randy L.|last2=Haupt|first2=Sue Ellen|title=Practical genetic algorithms with CD-Rom|year=2004|publisher=J. Wiley|location=New York|isbn=978-0-471-45565-3|edition=2nd}}</ref> and from Rody Oldenhuis's software.<ref>{{cite web|last=Oldenhuis|first=Rody|title=Many test functions for global optimizers|url=http://www.mathworks.com/matlabcentral/fileexchange/23147-many-testfunctions-for-global-optimizers|publisher=Mathworks|access-date=1 November 2012}}</ref> Given the number of problems (55 in total), only a few are presented here.

The test functions used to evaluate the algorithms for MOPs were taken from Deb,<ref>Deb, Kalyanmoy (2002) Multiobjective optimization using evolutionary algorithms (Repr. ed.). Chichester [u.a.]: Wiley. {{isbn|0-471-87339-X}}.</ref> Binh et al.<ref>Binh T. and Korn U. (1997) [https://web.archive.org/web/20190801183649/https://pdfs.semanticscholar.org/cf68/41a6848ca2023342519b0e0e536b88bdea1d.pdf MOBES: A Multiobjective Evolution Strategy for Constrained Optimization Problems]. In: Proceedings of the Third International Conference on Genetic Algorithms. Czech Republic. pp. 176–182</ref> and Binh.<ref>Binh T. (1999) [https://www.researchgate.net/profile/Thanh_Binh_To/publication/2446107_A_Multiobjective_Evolutionary_Algorithm_The_Study_Cases/links/53eb422f0cf28f342f45251d.pdf A multiobjective evolutionary algorithm. The study cases.] Technical report. Institute for Automation and Communication. Barleben, Germany</ref> The software developed by Deb, which implements the NSGA-II procedure with GAs, can be downloaded,<ref>Deb K. (2011) Software for multi-objective NSGA-II code in C. Available at URL: https://www.iitk.ac.in/kangal/codes.shtml</ref> as can a program posted on the Internet that implements the NSGA-II procedure with ES.<ref>{{cite web|last=Ortiz|first=Gilberto A.|title=Multi-objective optimization using ES as Evolutionary Algorithm.|url=http://www.mathworks.com/matlabcentral/fileexchange/35824-multi-objective-optimization-using-evolution-strategies-es-as-evolutionary-algorithm-ea|publisher=Mathworks|access-date=1 November 2012}}</ref>

Only the general form of the equation, a plot of the objective function, the boundaries of the object variables and the coordinates of the global minima are given herein.

Test functions for single-objective optimization

{| class="sortable wikitable"

! Name

! Plot

! Formula

! Global minimum

! Search domain

|-
| Rastrigin function

| File:Rastrigin contour plot.svg

|f(\mathbf{x}) = A n + \sum_{i=1}^n \left[x_i^2 - A\cos(2 \pi x_i)\right]

\text{where: } A=10

|f(0, \dots, 0) = 0

|-5.12\le x_{i} \le 5.12

|-
| Ackley function

| File:Ackley contour function.svg

|f(x,y) = -20\exp\left[-0.2\sqrt{0.5\left(x^{2}+y^{2}\right)}\right]

-\exp\left[0.5\left(\cos 2\pi x + \cos 2\pi y \right)\right] + e + 20

|f(0,0) = 0

|-5\le x,y \le 5

|-
| Sphere function

| File:Sphere contour.svg

| f(\boldsymbol{x}) = \sum_{i=1}^{n} x_{i}^{2}

| f(x_{1}, \dots, x_{n}) = f(0, \dots, 0) = 0

| -\infty \le x_{i} \le \infty, 1 \le i \le n

|-
| Rosenbrock function

| File:Rosenbrock contour.svg

| f(\boldsymbol{x}) = \sum_{i=1}^{n-1} \left[ 100 \left(x_{i+1} - x_{i}^{2}\right)^{2} + \left(1 - x_{i}\right)^{2}\right]

| \text{Min} =

\begin{cases}

n=2 & \rightarrow \quad f(1,1) = 0, \\

n=3 & \rightarrow \quad f(1,1,1) = 0, \\

n>3 & \rightarrow \quad f(\underbrace{1,\dots,1}_{n \text{ times}}) = 0 \\

\end{cases}

| -\infty \le x_{i} \le \infty, 1 \le i \le n

|-
| Beale function

| File:Beale contour.svg

| f(x,y) = \left( 1.5 - x + xy \right)^{2} + \left( 2.25 - x + xy^{2}\right)^{2}

+ \left(2.625 - x+ xy^{3}\right)^{2}

| f(3, 0.5) = 0

| -4.5 \le x,y \le 4.5

|-
| Goldstein–Price function

| File:Goldstein-Price contour.svg

| f(x,y) = \left[1+\left(x+y+1\right)^{2}\left(19-14x+3x^{2}-14y+6xy+3y^{2}\right)\right]

\left[30+\left(2x-3y\right)^{2}\left(18-32x+12x^{2}+48y-36xy+27y^{2}\right)\right]

| f(0, -1) = 3

| -2 \le x,y \le 2

|-
| Booth function

| File:Booth contour.svg

|f(x,y) = \left( x + 2y -7\right)^{2} + \left(2x +y - 5\right)^{2}

|f(1,3) = 0

|-10 \le x,y \le 10

|-
| Bukin function N.6

| File:Bukin 6 contour.svg

| f(x,y) = 100\sqrt{\left|y - 0.01x^{2}\right|} + 0.01 \left|x+10 \right|

| f(-10,1) = 0

| -15\le x \le -5, -3\le y \le 3

|-

| Matyas function

| File:Matyas contour.svg

| f(x,y) = 0.26 \left( x^{2} + y^{2}\right) - 0.48 xy

| f(0,0) = 0

| -10\le x,y \le 10

|-

| Lévi function N.13

|File:Levi13 contour.svg

| f(x,y) = \sin^{2} 3\pi x + \left(x-1\right)^{2}\left(1+\sin^{2} 3\pi y\right)

+\left(y-1\right)^{2}\left(1+\sin^{2} 2\pi y\right)

| f(1,1) = 0

| -10\le x,y \le 10

|-

| Griewank function

| File:Griewank 2D Contour.svg

| f(\boldsymbol{x}) = 1 + \frac{1}{4000} \sum_{i=1}^{n} x_{i}^{2} - \prod_{i=1}^{n} \cos\left(\frac{x_{i}}{\sqrt{i}}\right)

|f(0, \dots, 0) = 0

|-\infty \le x_{i} \le \infty, 1 \le i \le n

|-

| Himmelblau's function

|File:Himmelblau contour plot.svg

| f(x, y) = (x^2+y-11)^2 + (x+y^2-7)^2

| \text{Min} =

\begin{cases}

f\left(3.0, 2.0\right) & = 0.0 \\

f\left(-2.805118, 3.131312\right) & = 0.0 \\

f\left(-3.779310, -3.283186\right) & = 0.0 \\

f\left(3.584428, -1.848126\right) & = 0.0 \\

\end{cases}

| -5\le x,y \le 5

|-

| Three-hump camel function

| File:Three-hump-camel contour.svg

| f(x,y) = 2x^{2} - 1.05x^{4} + \frac{x^{6}}{6} + xy + y^{2}

| f(0,0) = 0

| -5\le x,y \le 5

|-

| Easom function

| File:Easom contour.svg

| f(x,y) = -\cos \left(x\right)\cos \left(y\right) \exp\left(-\left(\left(x-\pi\right)^{2} + \left(y-\pi\right)^{2}\right)\right)

| f(\pi , \pi) = -1

| -100\le x,y \le 100

|-

| Cross-in-tray function

| File:Cross-in-tray contour.svg

| f(x,y) = -0.0001 \left[ \left| \sin x \sin y \exp \left(\left|100 - \frac{\sqrt{x^{2} + y^{2}}}{\pi} \right|\right)\right| + 1 \right]^{0.1}

| \text{Min} =

\begin{cases}

f\left(1.34941, -1.34941\right) & = -2.06261 \\

f\left(1.34941, 1.34941\right) & = -2.06261 \\

f\left(-1.34941, 1.34941\right) & = -2.06261 \\

f\left(-1.34941,-1.34941\right) & = -2.06261 \\

\end{cases}

|-10\le x,y \le 10

|-

| Eggholder function{{cite journal | last1=Whitley | first1=Darrell | last2=Rana | first2=Soraya | last3=Dzubera | first3=John | last4=Mathias | first4=Keith E. | title=Evaluating evolutionary algorithms | journal=Artificial Intelligence | publisher=Elsevier BV | volume=85 | issue=1–2 | year=1996 | issn=0004-3702 | doi=10.1016/0004-3702(95)00124-7 | pages=264| doi-access=free }}Vanaret C. (2015) [https://www.researchgate.net/publication/337947149_Hybridization_of_interval_methods_and_evolutionary_algorithms_for_solving_difficult_optimization_problems Hybridization of interval methods and evolutionary algorithms for solving difficult optimization problems.] PhD thesis. Ecole Nationale de l'Aviation Civile. Institut National Polytechnique de Toulouse, France.

| File:Eggholder contour.svg

| f(x,y) = - \left(y+47\right) \sin \sqrt{\left|\frac{x}{2}+\left(y+47\right)\right|} - x \sin \sqrt{\left|x - \left(y + 47 \right)\right|}

| f(512, 404.2319) = -959.6407

| -512\le x,y \le 512

|-

| Hölder table function

| File:Hoelder table contour.svg

| f(x,y) = - \left|\sin x \cos y \exp \left(\left|1 - \frac{\sqrt{x^{2} + y^{2}}}{\pi} \right|\right)\right|

| \text{Min} =

\begin{cases}

f\left(8.05502, 9.66459\right) & = -19.2085 \\

f\left(-8.05502, 9.66459\right) & = -19.2085 \\

f\left(8.05502,-9.66459\right) & = -19.2085 \\

f\left(-8.05502,-9.66459\right) & = -19.2085

\end{cases}

| -10\le x,y \le 10

|-

| McCormick function

| File:McCormick contour.svg

| f(x,y) = \sin \left(x+y\right) + \left(x-y\right)^{2} - 1.5x + 2.5y + 1

| f(-0.54719,-1.54719) = -1.9133

| -1.5\le x \le 4, -3\le y \le 4

|-

| Schaffer function N. 2

| File:Schaffer2 contour.svg

| f(x,y) = 0.5 + \frac{\sin^{2}\left(x^{2} - y^{2}\right) - 0.5}{\left[1 + 0.001\left(x^{2} + y^{2}\right) \right]^{2}}

| f(0, 0) = 0

| -100\le x,y \le 100

|-

| Schaffer function N. 4

| File:Schaffer4 contour.svg

| f(x,y) = 0.5 + \frac{\cos^{2}\left[\sin \left( \left|x^{2} - y^{2}\right|\right)\right] - 0.5}{\left[1 + 0.001\left(x^{2} + y^{2}\right) \right]^{2}}

| \text{Min} =

\begin{cases}

f\left(0,1.25313\right) & = 0.292579 \\

f\left(0,-1.25313\right) & = 0.292579 \\

f\left(1.25313,0\right) & = 0.292579 \\

f\left(-1.25313,0\right) & = 0.292579

\end{cases}

| -100\le x,y \le 100

|-

| Styblinski–Tang function

| File:Styblinski-Tang contour.svg

| f(\boldsymbol{x}) = \frac{1}{2} \sum_{i=1}^{n} \left( x_{i}^{4} - 16x_{i}^{2} + 5x_{i} \right)

| -39.16617n < f(\underbrace{-2.903534, \ldots, -2.903534}_{n \text{ times}} ) < -39.16616n

| -5\le x_{i} \le 5, 1\le i \le n

|-

| Shekel function

| Image:Shekel_2D.jpg

|

f(\vec{x}) = \sum_{i = 1}^{m} \; \left( c_{i} + \sum\limits_{j = 1}^{n} (x_{j} - a_{ji})^2 \right)^{-1}

or, similarly,

f(x_1,x_2,...,x_{n-1},x_n) = \sum_{i = 1}^{m} \; \left( c_{i} + \sum\limits_{j = 1}^{n} (x_{j} - a_{ij})^2 \right)^{-1}

|

| -\infty \le x_{i} \le \infty, 1 \le i \le n

|}
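Several of the landscapes above translate directly into a few lines of code, which is often the quickest way to check an optimizer implementation against the tabulated minima. A minimal Python sketch, assuming no particular library (the function and helper names are illustrative):

```python
import math

def rastrigin(x, A=10.0):
    """Rastrigin function: highly multimodal, global minimum f(0, ..., 0) = 0."""
    n = len(x)
    return A * n + sum(xi**2 - A * math.cos(2 * math.pi * xi) for xi in x)

def ackley(x, y):
    """Ackley function (2-D form from the table), global minimum f(0, 0) = 0."""
    return (-20.0 * math.exp(-0.2 * math.sqrt(0.5 * (x**2 + y**2)))
            - math.exp(0.5 * (math.cos(2 * math.pi * x) + math.cos(2 * math.pi * y)))
            + math.e + 20.0)

def rosenbrock(x):
    """Rosenbrock function: narrow curved valley, global minimum f(1, ..., 1) = 0."""
    return sum(100.0 * (x[i + 1] - x[i]**2)**2 + (1 - x[i])**2
               for i in range(len(x) - 1))

# Sanity checks against the global minima listed in the table.
assert rastrigin([0.0, 0.0, 0.0]) == 0.0
assert abs(ackley(0.0, 0.0)) < 1e-12
assert rosenbrock([1.0, 1.0, 1.0]) == 0.0
```

Checks like these catch the most common transcription errors (wrong sign, wrong constant, off-by-one index range) before any algorithm is run on the landscape.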

Test functions for constrained optimization

{| class="wikitable" style="text-align:center"
! Name
! Plot
! Formula
! Global minimum
! Search domain
|-
| Rosenbrock function constrained to a disk<ref>{{Cite web|url=https://www.mathworks.com/help/optim/ug/example-nonlinear-constrained-minimization.html?requestedDomain=www.mathworks.com|title=Solve a Constrained Nonlinear Problem - MATLAB & Simulink|website=www.mathworks.com|access-date=2017-08-29}}</ref>
| File:Rosenbrock circle constraint.svg
| f(x,y) = (1-x)^2 + 100(y-x^2)^2,

subjected to: x^2 + y^2 \le 2
| f(1.0,1.0) = 0
| -1.5\le x \le 1.5, -1.5\le y \le 1.5
|-
| Mishra's Bird function - constrained<ref>{{Cite web|url=http://www.phoenix-int.com/software/benchmark_report/bird_constrained.php|title=Bird Problem (Constrained) {{!}} Phoenix Integration|access-date=2017-08-29|url-status=bot: unknown|archive-url=https://web.archive.org/web/20161229032528/http://www.phoenix-int.com/software/benchmark_report/bird_constrained.php|archive-date=2016-12-29}}</ref><ref>{{Cite journal|last=Mishra|first=Sudhanshu|date=2006|title=Some new test functions for global optimization and performance of repulsive particle swarm method|url=https://mpra.ub.uni-muenchen.de/2718/|journal=MPRA Paper}}</ref>
| File:Mishra bird contour.svg
| f(x,y) = \sin(y) e^{\left [(1-\cos x)^2\right]} + \cos(x) e^{\left [(1-\sin y)^2 \right]} + (x-y)^2,

subjected to: (x+5)^2 + (y+5)^2 < 25
| f(-3.1302468,-1.5821422) = -106.7645367
| -10\le x \le 0, -6.5\le y \le 0
|-
| Townsend function (modified)<ref>{{Cite web|url=http://www.chebfun.org/examples/opt/ConstrainedOptimization.html|title=Constrained optimization in Chebfun|last=Townsend|first=Alex|date=January 2014|website=chebfun.org|access-date=2017-08-29}}</ref>
| File:Townsend contour.svg
| f(x,y) = -[\cos((x-0.1)y)]^2 - x \sin(3x+y),

subjected to: x^2+y^2 < \left[2\cos t - \frac 1 2 \cos 2t - \frac 1 4 \cos 3t - \frac 1 8 \cos 4t\right]^2 + [2\sin t]^2

where: {{Math|1=t = Atan2(x,y)}}
| f(2.0052938,1.1944509) = -2.0239884
| -2.25\le x \le 2.25, -2.5\le y \le 1.75
|-
| Keane's bump function{{anchor|Keane's bump function}}<ref>{{cite journal |last1=Mishra |first1=Sudhanshu |title=Minimization of Keane’s Bump Function by the Repulsive Particle Swarm and the Differential Evolution Methods |date=5 May 2007 |url=https://econpapers.repec.org/paper/pramprapa/3098.htm |journal=MPRA Paper|publisher=University Library of Munich, Germany}}</ref>
| File:Estimation of Distribution Algorithm animation.gif
| f(\boldsymbol{x}) = -\left| \frac{\sum_{i=1}^m \cos^4 (x_i) - 2 \prod_{i=1}^m \cos^2 (x_i)}{\left( \sum_{i=1}^m i x^2_i \right)^{0.5}} \right|,

subjected to: 0.75 - \prod_{i=1}^m x_i < 0, and

\sum_{i=1}^m x_i - 7.5m < 0
| f(1.60025376, 0.468675907) = -0.36497974
| 0 < x_i < 10
|}
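A common way to feed constrained problems like those above to an unconstrained optimizer is a penalty formulation: the objective is left unchanged on feasible points and inflated in proportion to the constraint violation elsewhere. A minimal sketch for the Rosenbrock-on-a-disk problem, where the quadratic penalty and the weight `mu` are illustrative choices, not part of the original formulation:

```python
def rosenbrock2d(x, y):
    """Objective from the table: f(x, y) = (1 - x)^2 + 100 (y - x^2)^2."""
    return (1 - x)**2 + 100.0 * (y - x**2)**2

def disk_violation(x, y):
    """Amount by which the constraint x^2 + y^2 <= 2 is violated (0 if feasible)."""
    return max(0.0, x**2 + y**2 - 2.0)

def penalized(x, y, mu=1e6):
    """Penalized objective: feasible points keep their value, infeasible ones are inflated."""
    return rosenbrock2d(x, y) + mu * disk_violation(x, y)**2

# The listed optimum (1, 1) lies exactly on the disk boundary and is feasible.
assert disk_violation(1.0, 1.0) == 0.0
assert penalized(1.0, 1.0) == 0.0
# An infeasible point such as (1.5, 1.5) is heavily penalized.
assert penalized(1.5, 1.5) > rosenbrock2d(1.5, 1.5)
```

With such a wrapper, any of the single-objective algorithms tested on the unconstrained landscapes can be applied to the constrained problems as well, at the cost of tuning the penalty weight.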

Test functions for multi-objective optimization

In a multi-objective problem the objectives generally conflict, so no single point minimizes all of them simultaneously; instead, algorithms search for the Pareto front: the set of solutions for which no objective can be improved without worsening at least one other objective.

{| class="wikitable" style="text-align:center"
! Name
! Plot
! Functions
! Constraints
! Search domain
|-
| Binh and Korn function
| File:Binh and Korn function.pdf
| \text{Minimize} =

\begin{cases}

f_{1}\left(x,y\right) = 4x^{2} + 4y^{2} \\

f_{2}\left(x,y\right) = \left(x - 5\right)^{2} + \left(y - 5\right)^{2} \\

\end{cases}

| \text{s.t.} =

\begin{cases}

g_{1}\left(x,y\right) = \left(x - 5\right)^{2} + y^{2} \leq 25 \\

g_{2}\left(x,y\right) = \left(x - 8\right)^{2} + \left(y + 3\right)^{2} \geq 7.7 \\

\end{cases}

| 0\le x \le 5, 0\le y \le 3
|-
| Chankong and Haimes function<ref>{{cite book |last1=Chankong |first1=Vira |last2=Haimes |first2=Yacov Y. |title=Multiobjective decision making. Theory and methodology. |isbn=0-444-00710-5|year=1983 |publisher=North Holland }}</ref>
| File:Chakong and Haimes function.pdf
| \text{Minimize} =

\begin{cases}

f_{1}\left(x,y\right) = 2 + \left(x-2\right)^{2} + \left(y-1\right)^{2} \\

f_{2}\left(x,y\right) = 9x - \left(y - 1\right)^{2} \\

\end{cases}

| \text{s.t.} =

\begin{cases}

g_{1}\left(x,y\right) = x^{2} + y^{2} \leq 225 \\

g_{2}\left(x,y\right) = x - 3y + 10 \leq 0 \\

\end{cases}

| -20\le x,y \le 20
|-
| Fonseca–Fleming function<ref>{{cite journal |first1=C. M. |last1=Fonseca |first2=P. J. |last2=Fleming |title=An Overview of Evolutionary Algorithms in Multiobjective Optimization |journal=Evol Comput |volume=3 |issue=1 |pages=1–16 |year=1995 |doi=10.1162/evco.1995.3.1.1 |citeseerx=10.1.1.50.7779 |s2cid=8530790 }}</ref>
| File:Fonseca and Fleming function.pdf
| \text{Minimize} =

\begin{cases}

f_{1}\left(\boldsymbol{x}\right) = 1 - \exp \left[-\sum_{i=1}^{n} \left(x_{i} - \frac{1}{\sqrt{n}} \right)^{2} \right] \\

f_{2}\left(\boldsymbol{x}\right) = 1 - \exp \left[-\sum_{i=1}^{n} \left(x_{i} + \frac{1}{\sqrt{n}} \right)^{2} \right] \\

\end{cases}

| -4\le x_{i} \le 4, 1\le i \le n
|-
| Test function 4
| File:Test function 4 - Binh.pdf
| \text{Minimize} =

\begin{cases}

f_{1}\left(x,y\right) = x^{2} - y \\

f_{2}\left(x,y\right) = -0.5x - y - 1 \\

\end{cases}

| \text{s.t.} =

\begin{cases}

g_{1}\left(x,y\right) = 6.5 - \frac{x}{6} - y \geq 0 \\

g_{2}\left(x,y\right) = 7.5 - 0.5x - y \geq 0 \\

g_{3}\left(x,y\right) = 30 - 5x - y \geq 0 \\

\end{cases}

| -7\le x,y \le 4
|-
| Kursawe function<ref>F. Kursawe, “[http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.47.8050&rep=rep1&type=pdf A variant of evolution strategies for vector optimization],” in PPSN I, Vol 496 Lect Notes in Comput Sc. Springer-Verlag, 1991, pp. 193–197.</ref>
| File:Kursawe function.pdf
| \text{Minimize} =

\begin{cases}

f_{1}\left(\boldsymbol{x}\right) = \sum_{i=1}^{2} \left[-10 \exp \left(-0.2 \sqrt{x_{i}^{2} + x_{i+1}^{2}} \right) \right] \\

& \\

f_{2}\left(\boldsymbol{x}\right) = \sum_{i=1}^{3} \left[\left|x_{i}\right|^{0.8} + 5 \sin \left(x_{i}^{3} \right) \right] \\

\end{cases}

| -5\le x_{i} \le 5, 1\le i \le 3.
|-
| Schaffer function N. 1<ref>{{cite book |last=Schaffer |first=J. David |date=1984 |chapter=Multiple Objective Optimization with Vector Evaluated Genetic Algorithms |title=Proceedings of the First International Conference on Genetic Algorithms |editor1=G.J.E Grefensette |editor2=J.J. Lawrence Erlbraum |oclc=20004572 }}</ref>
| File:Schaffer function 1.pdf
| \text{Minimize} =

\begin{cases}

f_{1}\left(x\right) = x^{2} \\

f_{2}\left(x\right) = \left(x-2\right)^{2} \\

\end{cases}

| -A\le x \le A. Values of A from 10 to 10^{5} have been used successfully. Higher values of A increase the difficulty of the problem.
|-
| Schaffer function N. 2
| File:Schaffer function 2 - multi-objective.pdf
| \text{Minimize} =

\begin{cases}

f_{1}\left(x\right) = \begin{cases}

-x, & \text{if } x \le 1 \\

x-2, & \text{if } 1 < x \le 3 \\

4-x, & \text{if } 3 < x \le 4 \\

x-4, & \text{if } x > 4 \\

\end{cases} \\

f_{2}\left(x\right) = \left(x-5\right)^{2} \\

\end{cases}

| -5\le x \le 10.
|-
| Poloni's two objective function
| File:Poloni's two objective function.pdf
| \text{Minimize} =

\begin{cases}

f_{1}\left(x,y\right) = \left[1 + \left(A_{1} - B_{1}\left(x,y\right) \right)^{2} + \left(A_{2} - B_{2}\left(x,y\right) \right)^{2} \right] \\

f_{2}\left(x,y\right) = \left(x + 3\right)^{2} + \left(y + 1 \right)^{2} \\

\end{cases}

\text{where} =

\begin{cases}

A_{1} = 0.5 \sin \left(1\right) - 2 \cos \left(1\right) + \sin \left(2\right) - 1.5 \cos \left(2\right) \\

A_{2} = 1.5 \sin \left(1\right) - \cos \left(1\right) + 2 \sin \left(2\right) - 0.5 \cos \left(2\right) \\

B_{1}\left(x,y\right) = 0.5 \sin \left(x\right) - 2 \cos \left(x\right) + \sin \left(y\right) - 1.5 \cos \left(y\right) \\

B_{2}\left(x,y\right) = 1.5 \sin \left(x\right) - \cos \left(x\right) + 2 \sin \left(y\right) - 0.5 \cos \left(y\right)

\end{cases}

| -\pi\le x,y \le \pi
|-
| Zitzler–Deb–Thiele's function N. 1<ref>{{cite book |last1=Deb |first1=Kalyan |last2=Thiele |first2=L. |last3=Laumanns |first3=Marco |last4=Zitzler |first4=Eckart |title=Proceedings of the 2002 Congress on Evolutionary Computation. CEC'02 (Cat. No.02TH8600) |chapter=Scalable multi-objective optimization test problems |date=2002 |volume=1 |pages=825–830 |doi=10.1109/CEC.2002.1007032|isbn=0-7803-7282-4 |s2cid=61001583 }}</ref>
| File:Zitzler-Deb-Thiele's function 1.pdf
| \text{Minimize} =

\begin{cases}

f_{1}\left(\boldsymbol{x}\right) = x_{1} \\

f_{2}\left(\boldsymbol{x}\right) = g\left(\boldsymbol{x}\right) h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) \\

g\left(\boldsymbol{x}\right) = 1 + \frac{9}{29} \sum_{i=2}^{30} x_{i} \\

h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) = 1 - \sqrt{\frac{f_{1}\left(\boldsymbol{x}\right)}{g\left(\boldsymbol{x}\right)}} \\

\end{cases}

| 0\le x_{i} \le 1, 1\le i \le 30.
|-
| Zitzler–Deb–Thiele's function N. 2
| File:Zitzler-Deb-Thiele's function 2.pdf
| \text{Minimize} =

\begin{cases}

f_{1}\left(\boldsymbol{x}\right) = x_{1} \\

f_{2}\left(\boldsymbol{x}\right) = g\left(\boldsymbol{x}\right) h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) \\

g\left(\boldsymbol{x}\right) = 1 + \frac{9}{29} \sum_{i=2}^{30} x_{i} \\

h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) = 1 - \left(\frac{f_{1}\left(\boldsymbol{x}\right)}{g\left(\boldsymbol{x}\right)}\right)^{2} \\

\end{cases}

| 0\le x_{i} \le 1, 1\le i \le 30.
|-
| Zitzler–Deb–Thiele's function N. 3
| File:Zitzler-Deb-Thiele's function 3.pdf
| \text{Minimize} =

\begin{cases}

f_{1}\left(\boldsymbol{x}\right) = x_{1} \\

f_{2}\left(\boldsymbol{x}\right) = g\left(\boldsymbol{x}\right) h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) \\

g\left(\boldsymbol{x}\right) = 1 + \frac{9}{29} \sum_{i=2}^{30} x_{i} \\

h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) = 1 - \sqrt{\frac{f_{1}\left(\boldsymbol{x}\right)}{g\left(\boldsymbol{x} \right)}} - \left(\frac{f_{1}\left(\boldsymbol{x}\right)}{g\left(\boldsymbol{x}\right)} \right) \sin \left(10 \pi f_{1} \left(\boldsymbol{x} \right) \right)

\end{cases}

| 0\le x_{i} \le 1, 1\le i \le 30.
|-
| Zitzler–Deb–Thiele's function N. 4
| File:Zitzler-Deb-Thiele's function 4.pdf
| \text{Minimize} =

\begin{cases}

f_{1}\left(\boldsymbol{x}\right) = x_{1} \\

f_{2}\left(\boldsymbol{x}\right) = g\left(\boldsymbol{x}\right) h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) \\

g\left(\boldsymbol{x}\right) = 91 + \sum_{i=2}^{10} \left(x_{i}^{2} - 10 \cos \left(4 \pi x_{i}\right) \right) \\

h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) = 1 - \sqrt{\frac{f_{1}\left(\boldsymbol{x}\right)}{g\left(\boldsymbol{x} \right)}}

\end{cases}

| 0\le x_{1} \le 1, -5\le x_{i} \le 5, 2\le i \le 10
|-
| Zitzler–Deb–Thiele's function N. 6
| File:Zitzler-Deb-Thiele's function 6.pdf
| \text{Minimize} =

\begin{cases}

f_{1}\left(\boldsymbol{x}\right) = 1 - \exp \left(-4x_{1}\right)\sin^{6}\left(6 \pi x_{1} \right) \\

f_{2}\left(\boldsymbol{x}\right) = g\left(\boldsymbol{x}\right) h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) \\

g\left(\boldsymbol{x}\right) = 1 + 9 \left[\frac{\sum_{i=2}^{10} x_{i}}{9}\right]^{0.25} \\

h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) = 1 - \left(\frac{f_{1}\left(\boldsymbol{x}\right)}{g\left(\boldsymbol{x} \right)}\right)^{2} \\

\end{cases}

| 0\le x_{i} \le 1, 1\le i \le 10.
|-
| Osyczka and Kundu function<ref>{{cite journal |last1=Osyczka |first1=A. |last2=Kundu |first2=S. |title=A new method to solve generalized multicriteria optimization problems using the simple genetic algorithm |journal=Structural Optimization |date=1 October 1995 |volume=10 |issue=2 |pages=94–99 |doi=10.1007/BF01743536 |s2cid=123433499 |issn=1615-1488}}</ref>
| File:Osyczka and Kundu function.pdf
| \text{Minimize} =

\begin{cases}

f_{1}\left(\boldsymbol{x}\right) = -25 \left(x_{1}-2\right)^{2} - \left(x_{2}-2\right)^{2} - \left(x_{3}-1\right)^{2}

- \left(x_{4}-4\right)^{2} - \left(x_{5}-1\right)^{2} \\

f_{2}\left(\boldsymbol{x}\right) = \sum_{i=1}^{6} x_{i}^{2} \\

\end{cases}

| \text{s.t.} =

\begin{cases}

g_{1}\left(\boldsymbol{x}\right) = x_{1} + x_{2} - 2 \geq 0 \\

g_{2}\left(\boldsymbol{x}\right) = 6 - x_{1} - x_{2} \geq 0 \\

g_{3}\left(\boldsymbol{x}\right) = 2 - x_{2} + x_{1} \geq 0 \\

g_{4}\left(\boldsymbol{x}\right) = 2 - x_{1} + 3x_{2} \geq 0 \\

g_{5}\left(\boldsymbol{x}\right) = 4 - \left(x_{3}-3\right)^{2} - x_{4} \geq 0 \\

g_{6}\left(\boldsymbol{x}\right) = \left(x_{5} - 3\right)^{2} + x_{6} - 4 \geq 0

\end{cases}

| 0\le x_{1},x_{2},x_{6} \le 10, 1\le x_{3},x_{5} \le 5, 0\le x_{4} \le 6.
|-
| CTP1 function (2 variables)<ref>{{cite book |last1=Jimenez |first1=F. |last2=Gomez-Skarmeta |first2=A. F. |last3=Sanchez |first3=G. |last4=Deb |first4=K. |title=Proceedings of the 2002 Congress on Evolutionary Computation. CEC'02 (Cat. No.02TH8600) |chapter=An evolutionary algorithm for constrained multi-objective optimization |date=May 2002 |volume=2 |pages=1133–1138 |doi=10.1109/CEC.2002.1004402|isbn=0-7803-7282-4 |s2cid=56563996 }}</ref>
| File:CTP1 function (2 variables).pdf
| \text{Minimize} =

\begin{cases}

f_{1}\left(x,y\right) = x \\

f_{2}\left(x,y\right) = \left(1 + y\right) \exp \left(-\frac{x}{1+y} \right)

\end{cases}

| \text{s.t.} =

\begin{cases}

g_{1}\left(x,y\right) = \frac{f_{2}\left(x,y\right)}{0.858 \exp \left(-0.541 f_{1}\left(x,y\right)\right)} \geq 1 \\

g_{2}\left(x,y\right) = \frac{f_{2}\left(x,y\right)}{0.728 \exp \left(-0.295 f_{1}\left(x,y\right)\right)} \geq 1

\end{cases}

| 0\le x,y \le 1.
|-
| Constr-Ex problem
| File:Constr-Ex problem.pdf
| \text{Minimize} =

\begin{cases}

f_{1}\left(x,y\right) = x \\

f_{2}\left(x,y\right) = \frac{1 + y}{x} \\

\end{cases}

| \text{s.t.} =

\begin{cases}

g_{1}\left(x,y\right) = y + 9x \geq 6 \\

g_{2}\left(x,y\right) = -y + 9x \geq 1 \\

\end{cases}

| 0.1\le x \le 1, 0\le y \le 5
|-
| Viennet function
| File:Viennet function.pdf
| \text{Minimize} =

\begin{cases}

f_{1}\left(x,y\right) = 0.5\left(x^{2} + y^{2}\right) + \sin\left(x^{2} + y^{2} \right) \\

f_{2}\left(x,y\right) = \frac{\left(3x - 2y + 4\right)^{2}}{8} + \frac{\left(x - y + 1\right)^{2}}{27} + 15 \\

f_{3}\left(x,y\right) = \frac{1}{x^{2} + y^{2} + 1} - 1.1 \exp \left(- \left(x^{2} + y^{2} \right) \right) \\

\end{cases}

| -3\le x,y \le 3.
|}
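Because the objectives of a MOP conflict, its solution is a set of non-dominated trade-offs rather than a single point. A minimal sketch that samples Schaffer function N. 1 from the table above and filters out the dominated points (the sampling grid is an illustrative choice):

```python
def schaffer_n1(x):
    """Schaffer function N. 1: two conflicting objectives of a single variable."""
    return (x**2, (x - 2.0)**2)

def dominates(a, b):
    """True if objective vector a is at least as good as b everywhere and better somewhere."""
    return all(ai <= bi for ai, bi in zip(a, b)) and any(ai < bi for ai, bi in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Sample x in [-2, 4] and extract the non-dominated objective pairs.
xs = [i / 10.0 for i in range(-20, 41)]
front = pareto_front([schaffer_n1(x) for x in xs])

# The endpoints of the trade-off, x = 0 and x = 2, survive the filter.
assert (0.0, 4.0) in front and (4.0, 0.0) in front
```

For this function the surviving samples correspond to 0 ≤ x ≤ 2, the region where decreasing one objective necessarily increases the other; outside it both objectives worsen together and the points are dominated.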

References