Locating Multiple Optimal Solutions Based on Multiobjective Optimization
Yong Wang [email protected]
Outline of My Talk
Part I: Application to Nonlinear Equation Systems (MONES)
Part II: Application to Multimodal Optimization Problems (MOMMOP)
Future Work
Nonlinear Equation Systems (NESs) (1/2)
• NESs arise in many science and engineering areas such as chemical processes, robotics, electronic circuits, engineered materials, and physics.
• The formulation of a NES:

  e_1(x) = 0
  e_2(x) = 0
  ...
  e_m(x) = 0

  where x = (x_1, ..., x_n) ∈ S and S = ∏_{i=1}^{n} [L_i, U_i]
Nonlinear Equation Systems (NESs) (2/2)
• An example:

  e_1(x_1, x_2) = x_1^2 + x_2^2 - 1 = 0
  e_2(x_1, x_2) = x_1 - x_2 = 0
  -1 ≤ x_1, x_2 ≤ 1

[Figure: the optimal solutions of the system plotted in the (x_1, x_2) decision space]

• A NES may contain multiple optimal solutions
Solving NESs by Evolutionary Algorithms (1/4)
• The aim of solving NESs by evolutionary algorithms (EAs)
  – Locate all the optimal solutions in a single run
• At present, there are three kinds of methods
  – Single-objective optimization based methods
  – Constrained optimization based methods
  – Multiobjective optimization based methods
Solving NESs by Evolutionary Algorithms (2/4)
• Single-objective optimization based methods

  e_1(x) = 0, e_2(x) = 0, ..., e_m(x) = 0

  is transformed into

  minimize Σ_{i=1}^{m} |e_i(x)|   or   minimize Σ_{i=1}^{m} e_i(x)^2

• The main drawback
  – Usually, only one optimal solution can be found in a single run
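As a concrete sketch of this transformation (function names are mine, and the two-equation example from the earlier slide is assumed), the residuals can be folded into a single objective and handed to any black-box minimizer; a plain random search stands in here for the EA:

```python
import random

# Assumed example NES from the slides:
#   e1(x) = x1^2 + x2^2 - 1 = 0
#   e2(x) = x1 - x2 = 0
def residual(x):
    e1 = x[0] ** 2 + x[1] ** 2 - 1.0
    e2 = x[0] - x[1]
    return abs(e1) + abs(e2)  # the single objective: sum of |e_i(x)|

def random_search(n_iter=20000, seed=1):
    # A plain random search stands in for the EA in this sketch
    rng = random.Random(seed)
    best_x, best_r = None, float("inf")
    for _ in range(n_iter):
        x = [rng.uniform(-1.0, 1.0), rng.uniform(-1.0, 1.0)]
        r = residual(x)
        if r < best_r:
            best_x, best_r = x, r
    return best_x, best_r

x, r = random_search()
# Each run converges toward only ONE of the system's roots,
# which is exactly the drawback noted above.
```
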
Solving NESs by Evolutionary Algorithms (3/4)
• Constrained optimization based methods

  e_1(x) = 0, e_2(x) = 0, ..., e_m(x) = 0

  is transformed into

  minimize Σ_{i=1}^{m} |e_i(x)|
  subject to e_i(x) = 0, i = 1, ..., m

• The main drawbacks
  – Similar to the first kind of method, this kind of method can only locate one optimal solution in a single run
  – Additional constraint-handling techniques should be integrated
Solving NESs by Evolutionary Algorithms (4/4)
• Multiobjective optimization based methods (the CA method)

  e_1(x) = 0, e_2(x) = 0, ..., e_m(x) = 0

  is transformed into

  minimize f_1(x) = |e_1(x)|
  minimize f_2(x) = |e_2(x)|
  ...
  minimize f_m(x) = |e_m(x)|

• The main drawbacks
  – It may suffer from the "curse of dimensionality" (i.e., many objectives)
  – Maybe only one solution can be found in a single run

C. Grosan and A. Abraham, "A new approach for solving nonlinear equation systems," IEEE Transactions on Systems, Man, and Cybernetics - Part A, vol. 38, no. 3, pp. 698-714, 2008.
MONES: Multiobjective Optimization for NESs (1/8)
• The main motivation
  – When solving a NES by EAs, we expect to locate multiple optimal solutions in a single run.
  – Obviously, this process is similar to solving a multiobjective optimization problem by EAs.
  – A natural question is whether a NES can be transformed into a multiobjective optimization problem so that multiobjective EAs can be used to solve the transformed problem.
MONES: Multiobjective Optimization for NESs (2/8)
• Multiobjective optimization problems

  minimize F(x) = (f_1(x), f_2(x), ..., f_m(x))

  – Pareto dominance: x_a Pareto dominates x_b (x_a ≺ x_b) if
    f_i(x_a) ≤ f_i(x_b) for all i = 1, ..., m, and
    f_j(x_a) < f_j(x_b) for at least one j
  – Pareto optimal solutions
    • The set of all the nondominated solutions
  – Pareto front
    • The images of the Pareto optimal solutions in the objective space

[Figure: objective space (f_1, f_2) with example individuals x_a, x_b, and x_c]
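The dominance relation on this slide can be written as a small helper; this is a generic sketch (the function name is mine), not code from the talk:

```python
def dominates(fa, fb):
    """True iff objective vector fa Pareto dominates fb (minimization):
    fa is no worse in every objective and strictly better in at least one."""
    return (all(a <= b for a, b in zip(fa, fb))
            and any(a < b for a, b in zip(fa, fb)))

# An individual with image (1, 2) dominates one with image (2, 3),
# while (1, 2) and (2, 1) are mutually nondominated.
```
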
MONES: Multiobjective Optimization for NESs (3/8)
• The main idea
  – Transform a NES into the following biobjective optimization problem:

  minimize f_1(x) = x_1 + Σ_{i=1}^{m} |e_i(x)|
  minimize f_2(x) = 1 - x_1 + m · max(|e_1(x)|, ..., |e_m(x)|)

  (the first term is x_1 / 1 - x_1; the second term is the residual-based part)
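A minimal sketch of this transformation for the two-equation example used earlier (function names are mine). At an exact root both residual terms vanish, so the image is (f_1, f_2) = (x_1, 1 - x_1), a point on the line y = 1 - x:

```python
import math

def equations(x):
    # Assumed example NES from the earlier slide
    return [x[0] ** 2 + x[1] ** 2 - 1.0, x[0] - x[1]]

def mones_objectives(x):
    e = [abs(v) for v in equations(x)]
    m = len(e)
    f1 = x[0] + sum(e)            # x1 plus the summed residuals
    f2 = 1.0 - x[0] + m * max(e)  # (1 - x1) plus m times the largest residual
    return f1, f2

root = (math.sqrt(2) / 2, math.sqrt(2) / 2)  # an exact solution of the NES
f1, f2 = mones_objectives(root)
# At a root the residual terms vanish, so f1 + f2 = 1:
# the root's image lies on the line y = 1 - x.
```
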
MONES: Multiobjective Optimization for NESs (4/8)
• The principle of the first term

  minimize (x_1, 1 - x_1)

  – The images of the optimal solutions of the first term in the objective space are located on the line y = 1 - x

[Figure: sample values α_1 and α_2 of x_1 and their images (α_1, 1 - α_1) and (α_2, 1 - α_2) on the Pareto front y = 1 - x]
MONES: Multiobjective Optimization for NESs (5/8)
• The principle of the second term

  minimize Σ_{i=1}^{m} |e_i(x)|  and  m · max(|e_1(x)|, ..., |e_m(x)|)

  – Remark I: If (x_1^*, x_2^*, ..., x_n^*) is a solution of the NES, then both terms are equal to 0
  – Remark II: The coefficient m is used to make the two residual terms, Σ_{i=1}^{m} |e_i(x)| and m · max(|e_1(x)|, ..., |e_m(x)|), have a similar scale
MONES: Multiobjective Optimization for NESs (6/8)
• The principle of the first term plus the second term

  minimize f_1(x) = x_1 + Σ_{i=1}^{m} |e_i(x)|
  minimize f_2(x) = 1 - x_1 + m · max(|e_1(x)|, ..., |e_m(x)|)

  – At an optimal solution of the NES both residual terms vanish, so the images of the optimal solutions of a NES in the objective space are located on the line y = 1 - x

[Figure: the Pareto front y = 1 - x in the objective space]
MONES: Multiobjective Optimization for NESs (7/8)
• The differences between MONES and CA

  e_1(x_1, x_2) = x_1^2 + x_2^2 - 1 = 0
  e_2(x_1, x_2) = x_1 - x_2 = 0
  -1 ≤ x_1, x_2 ≤ 1

[Figures: the optimal solutions in the (x_1, x_2) decision space; the populations obtained in the objective space by CA and by MONES]
MONES: Multiobjective Optimization for NESs (8/8)
• The differences between MONES and CA

  e_1(x_1, x_2) = x_1^2 + x_2^2 - 1 = 0
  e_2(x_1, x_2) = x_1 - x_2 = 0
  -1 ≤ x_1, x_2 ≤ 1

[Figures: six individuals A-F shown in the (x_1, x_2) decision space together with the optimal solutions, and their images in the objective space under CA and under MONES; the subsets of individuals that remain nondominated differ between the two transformations]
18
Part I: Application to Nonlinear Equation Systems (MONES)
Part II: Application to Multimodal Optimization Problems (MOMMOP)
Future Work
Outline of My Talk
Multimodal Optimization Problems (MMOPs) (1/2)
• Many optimization problems in real-world applications exhibit the multimodal property, i.e., multiple optimal solutions may coexist.
• The formulation of multimodal optimization problems (MMOPs)

  maximize/minimize f(x), x = (x_1, ..., x_n) ∈ S, S = ∏_{i=1}^{n} [L_i, U_i]
Multimodal Optimization Problems (MMOPs) (2/2)
• Several examples

[Figures: example multimodal fitness landscapes]
The Previous Work (1/2)
• Niching methods
  – The first niching method
    • The preselection method suggested by Cavicchio in 1970
  – The currently popular niching methods
    • Clearing (Pétrowski, ICEC, 1996)
    • Sharing (Goldberg and Richardson, ICGA, 1987)
    • Crowding (De Jong, PhD dissertation, 1975)
    • Restricted tournament selection (Harik, ICGA, 1995)
    • Speciation (Li et al., ECJ, 2002)
• The main disadvantage
  – Some problem-dependent niching parameters are required
The Previous Work (2/2)
• Multiobjective optimization based methods; usually two objectives are considered:
  – The first objective: the original multimodal function
  – The second objective: the distance information (Das et al., IEEE TEVC, 2013) or the gradient information (Deb and Saha, ECJ, 2012)
• The disadvantages
  – It cannot be guaranteed that the two objectives of the transformed problem totally conflict with each other
  – The relationship between the optimal solutions of the original problem and the Pareto optimal solutions of the transformed problem is difficult to verify theoretically
MOMMOP: Multiobjective Optimization for MMOPs (1/5)
• The main motivation

  minimize (x_1, 1 - x_1)

  – As in MONES, the images of all points lie on the line y = 1 - x, so a biobjective transformation can keep multiple optimal solutions nondominated in a single run

[Figure: sample values α_1 and α_2 of x_1 and their images on the Pareto front y = 1 - x]
MOMMOP: Multiobjective Optimization for MMOPs (2/5)
• The main idea
  – Convert an MMOP into a biobjective optimization problem:

  minimize f_1(x) = x_1 + |f(x) - best_refer| / |worst_refer - best_refer| · (U_1 - L_1)
  minimize f_2(x) = 1 - x_1 + |f(x) - best_refer| / |worst_refer - best_refer| · (U_1 - L_1)

  (the first term is x_1 / 1 - x_1; the second term is the scaled objective value)
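A sketch of this biobjective construction (the function name and argument layout are mine; `best_refer` and `worst_refer` follow the slide). The second term normalizes the objective value into [0, 1] and rescales it by the range of the chosen variable:

```python
def mommop_objectives(x, fx, best_refer, worst_refer, bounds, k=0):
    """Biobjective image of individual x based on decision variable k (0-based).
    fx is f(x) of the current individual; bounds is a list of (L_i, U_i)."""
    L, U = bounds[k]
    scaled = abs(fx - best_refer) / abs(worst_refer - best_refer) * (U - L)
    f1 = x[k] + scaled
    f2 = 1.0 - x[k] + scaled
    return f1, f2

# An individual at x1 = 0.3 whose f(x) already equals the best value found:
f1, f2 = mommop_objectives([0.3], fx=1.0, best_refer=1.0, worst_refer=0.0,
                           bounds=[(0.0, 1.0)])
# The scaled term is 0, so the image (0.3, 0.7) lies on the line y = 1 - x.
```
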
MOMMOP: Multiobjective Optimization for MMOPs (3/5)
• The principle of the second term

  |f(x) - best_refer| / |worst_refer - best_refer| · (U_1 - L_1)

  – f(x): the objective function value of the current individual
  – best_refer: the objective function value of the best individual found during the evolution
  – worst_refer: the objective function value of the worst individual found during the evolution
  – (U_1 - L_1): the range of the first variable (the scaling factor)
  – Remark: the aim is to make the first term and the second term have the same scale
  – For the optimal solutions of the original multimodal optimization problem, the value of the second term is equal to zero
MOMMOP: Multiobjective Optimization for MMOPs (4/5)
• The principle of the first term plus the second term

  minimize f_1(x) = x_1 + |f(x) - best_refer| / |worst_refer - best_refer| · (U_1 - L_1)
  minimize f_2(x) = 1 - x_1 + |f(x) - best_refer| / |worst_refer - best_refer| · (U_1 - L_1)

  – The images of the optimal solutions of a multimodal optimization problem in the objective space are located on the line y = 1 - x

[Figure: the Pareto front y = 1 - x in the objective space]
MOMMOP: Multiobjective Optimization for MMOPs (5/5)
• Why does MOMMOP work?
  – MOMMOP is an implicit niching method
  – An example (maximization, with best_refer = 1.0, worst_refer = 0, and x ∈ [0, 1]): three individuals x_a = 0.1, x_b = 0.15, and x_c = 0.6 lie on a multimodal function f(x), with f(x_a) = 1.0 and f(x_b) = f(x_c) = 0.8

    f_1(x_a) = 0.1 + 0.0 = 0.1     f_2(x_a) = 1 - 0.1 + 0.0 = 0.9
    f_1(x_b) = 0.15 + 0.2 = 0.35   f_2(x_b) = 1 - 0.15 + 0.2 = 1.05
    f_1(x_c) = 0.6 + 0.2 = 0.8     f_2(x_c) = 1 - 0.6 + 0.2 = 0.6

  – Hence x_a ≺ x_b but x_a ⊀ x_c: x_a Pareto dominates its close neighbor x_b, while the distant x_c, which lies in another basin of attraction, remains nondominated

[Figure: f(x) with the three individuals x_a, x_b, x_c and their images in the (f_1, f_2) objective space]
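The niching effect on this slide can be checked numerically; these are the slide's three individuals with best_refer = 1.0 and worst_refer = 0 (helper names are mine):

```python
def dominates(fa, fb):
    # Pareto dominance for minimization
    return (all(a <= b for a, b in zip(fa, fb))
            and any(a < b for a, b in zip(fa, fb)))

def image(x1, fx, best=1.0, worst=0.0):
    # Biobjective image; here U1 - L1 = 1, so no extra rescaling is needed
    scaled = abs(fx - best) / abs(worst - best)
    return (x1 + scaled, 1.0 - x1 + scaled)

a = image(0.10, 1.0)   # ~ (0.10, 0.90)
b = image(0.15, 0.8)   # ~ (0.35, 1.05)
c = image(0.60, 0.8)   # ~ (0.80, 0.60)

# x_a dominates its close neighbour x_b, but neither x_a nor x_c
# dominates the other: the basin around x_c is implicitly protected.
```
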
Two issues in MOMMOP (1/2)
• The first issue
  – Some optimal solutions may have the same value in one or many decision variables (x_a ≠ x_b, yet they cannot be distinguished through x_1 alone)
  – The remedy: construct one pair of objectives for every decision variable k = 1, ..., D:

    f_{2k-1}(x) = x_k + |f(x) - best_refer| / |worst_refer - best_refer| · (U_k - L_k)
    f_{2k}(x) = 1 - x_k + |f(x) - best_refer| / |worst_refer - best_refer| · (U_k - L_k)
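The remedy above can be sketched by stacking the pair of objectives over every decision variable (a sketch with my own naming): when two optima share their first coordinate, some other coordinate still separates their images.

```python
def all_objectives(x, fx, best_refer, worst_refer, bounds):
    """2*D objective values: one pair per decision variable."""
    norm = abs(fx - best_refer) / abs(worst_refer - best_refer)
    objs = []
    for xk, (L, U) in zip(x, bounds):
        scaled = norm * (U - L)
        objs.append(xk + scaled)        # pair member based on x_k
        objs.append(1.0 - xk + scaled)  # pair member based on 1 - x_k
    return objs

bounds = [(0.0, 1.0), (0.0, 1.0)]
# Two optima sharing x1 = 0.5 but differing in x2; both have norm = 0:
oa = all_objectives([0.5, 0.2], 1.0, 1.0, 0.0, bounds)
ob = all_objectives([0.5, 0.8], 1.0, 1.0, 0.0, bounds)
# The x1-based pair is identical for both, but the x2-based pair differs,
# so neither optimum dominates the other and both can be kept.
```
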
Two issues in MOMMOP (2/2)
• The second issue
  – In some basins of attraction, maybe there are few individuals
  – Two individuals x_a ≠ x_b with f(x_a) ≈ f(x_b) whose distance ‖x_a - x_b‖_2 exceeds a threshold (0.1 on the slide) are regarded as lying in different basins of attraction
Future Work
• We proposed two similar frameworks, for nonlinear equation systems and for multimodal optimization problems, respectively; however
  – The principle should be analyzed in depth in the future
  – The rationality should be further verified
  – The frameworks could be improved
• Our frameworks could be generalized to solve other kinds of optimization problems