

[IEEE 2013 9th International Conference on Natural Computation (ICNC) - Shenyang, China (2013.07.23-2013.07.25)]

978-1-4673-4714-3/13/$31.00 ©2013 IEEE

2013 Ninth International Conference on Natural Computation (ICNC)

A Scalability Test of Gaussian Bare-Bones Differential Evolution on High-Dimensional Optimization Problems

Hui Wang 1,2, Zhengwang Xiao 1, Yunhui Zhang 1, Xiaowu Liao 1, Yiping Ruan 1, Jingjiu Fu 1, Dazhi Jiang 3
1 School of Information Engineering, Nanchang Institute of Technology, Nanchang 330099, China
2 State Key Laboratory of Software Engineering, Wuhan University, Wuhan 430072, China
3 Department of Computer Science, Shantou University, Shantou 515063, China

Email: [email protected]

Abstract—Gaussian bare-bones differential evolution (GBDE) is a new differential evolution (DE) variant, which has shown good search abilities on low-dimensional optimization problems. In this paper, we present a scalability test of GBDE on high-dimensional optimization problems with dimensions 100, 200, 500, and 1000. Experimental results show that GBDE achieves promising results on the majority of test functions.

Keywords-differential evolution; evolutionary computation; high-dimensional; global optimization

I. INTRODUCTION

Differential evolution (DE) [1] is an efficient optimization algorithm which has been successfully applied to many real-world and benchmark optimization problems. However, its performance depends strongly on the settings of its control parameters, such as the mutation scale factor (F) and the crossover rate (CR) [2]. Although some previous works have proposed different parameter settings, these methods lack generality. The main reason is that those parameters are problem-dependent, so fixed parameter settings are not a good choice.

In order to reduce the effects of the control parameters in DE, some adaptive mechanisms have been proposed in the past several years [3-6]. Although these strategies are useful for improving the performance of DE and avoid choosing parameters by a trial-and-error process, many of them introduce additional complex adaptive operations which are usually not easy to implement. In our previous work [7], a new DE algorithm called Gaussian bare-bones DE (GBDE) was proposed to minimize the effects of the control parameters. Experimental results show that GBDE achieves good results on many low-dimensional optimization problems.

As science and engineering applications develop, the dimensionality of optimization problems grows increasingly large, and most optimization algorithms suffer from the curse of dimensionality: their performance deteriorates quickly as the dimension of the problem increases. The main reason is that, in general, the complexity of a problem increases exponentially with its dimension. Most evolutionary algorithms, including DE, lose their ability to find the optimal solution as the dimension increases. Therefore, more efficient optimization algorithms are needed to explore all the promising regions within a given time budget [8].

To solve high-dimensional optimization problems, some new algorithms have been designed. Yang et al. proposed a multilevel cooperative co-evolution algorithm (MLCC) based on self-adaptive neighborhood search DE to solve large-scale problems [9]. The presented results show that MLCC can achieve promising solutions. Brest et al. introduced a population size reduction mechanism into self-adaptive DE, where the population size decreases during the evolutionary process [10]. Recently, Brest and Maucec proposed another version, jDEdynNP-F, which employs three new mutation schemes [11]. Rahnamayan and Wang presented an experimental study of opposition-based DE (ODE) [12] on large-scale problems [13]. The reported results show that ODE significantly improves the performance of standard DE. Muelas et al. [14] used a local search mechanism to improve the solutions obtained by DE. Wang et al. [15] used an enhanced ODE based on generalized opposition-based learning (GODE) to solve scalable benchmark functions. Zhao et al. [16] combined self-adaptive DE and multiple trajectory search (MTS) for large-scale optimization; their method incorporates the DE/current-to-pbest [5] mutation strategy and is hybridized with a modified MTS. Wang et al. [17] proposed a sequential DE enhanced by neighborhood search (SDENS), which is based on global and local mutation [6].

In this paper, we present a scalability test of GBDE on nineteen benchmark functions for D=100, 200, 500 and 1000. Experimental results show that GBDE outperforms DE, CHC, and G-CMA-ES on the majority of test functions.

The rest of the paper is organized as follows. In Section II, the DE algorithm is briefly introduced. Section III describes the GBDE algorithm. In Section IV, experimental studies are presented. Finally, the work is concluded in Section V.

II. DIFFERENTIAL EVOLUTION

Like other Evolutionary Algorithms (EAs), DE is a population-based stochastic search algorithm. It starts with a population of Np vectors representing the candidate solutions, where Np indicates the population size. Let us assume that Xi,G = [xi,1,G, xi,2,G, …, xi,D,G] is the ith candidate solution vector in generation G, where i = 1, 2, …, Np, D is the problem's dimension, and G is the generation index. For the classical DE,


there are three operations: mutation, crossover, and selection, described as follows.

Mutation--For each vector Xi,G at generation G, a mutant vector Vi,G is generated by

Vi,G = Xi1,G + F · (Xi2,G − Xi3,G)    (1)

where i = 1, 2, …, Np, and i1, i2, and i3 are mutually different random integer indices within [1, Np]. F ∈ (0, 2] is a real number that controls the amplification of the difference vector (Xi2,G − Xi3,G).
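As an illustrative sketch (not the authors' implementation), the DE/rand/1 mutation of Eq. (1) can be written in Python with NumPy; the function name and population layout are assumptions, and the indices are additionally drawn to differ from i, a common convention not explicit in Eq. (1).

```python
import numpy as np

def de_rand_1_mutation(pop, i, F, rng):
    """DE/rand/1 mutation (Eq. 1): V_i = X_i1 + F * (X_i2 - X_i3),
    with i1, i2, i3 mutually different random indices."""
    Np = len(pop)
    # draw three mutually different indices (also different from i,
    # a common convention)
    candidates = [j for j in range(Np) if j != i]
    i1, i2, i3 = rng.choice(candidates, size=3, replace=False)
    return pop[i1] + F * (pop[i2] - pop[i3])

rng = np.random.default_rng(0)
pop = rng.uniform(-5.0, 5.0, size=(10, 4))  # Np = 10 vectors, D = 4
v = de_rand_1_mutation(pop, i=0, F=0.5, rng=rng)
```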

Crossover--Like genetic algorithms, DE also employs a crossover operator to build trial vectors Ui,G by recombining two different vectors. In this paper, we use the rand/1/exp strategy to generate the trial vectors.
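The rand/1/exp strategy pairs that mutation with exponential crossover, which copies a contiguous (wrapping) run of mutant components into the parent. A minimal sketch, with illustrative names:

```python
import numpy as np

def exponential_crossover(x, v, CR, rng):
    """Exponential crossover: starting at a random position, copy
    components from the mutant v into a copy of x while successive
    uniform draws stay below CR (at least one component is copied)."""
    D = len(x)
    u = x.copy()
    j = int(rng.integers(D))  # random start index
    L = 0
    while True:
        u[(j + L) % D] = v[(j + L) % D]
        L += 1
        if L >= D or rng.random() >= CR:
            break
    return u

x = np.zeros(5)
v = np.ones(5)
u = exponential_crossover(x, v, CR=0.9, rng=np.random.default_rng(0))
```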

Selection--A greedy selection mechanism is used as follows:

Xi,G+1 = Ui,G,  if f(Ui,G) ≤ f(Xi,G)
Xi,G+1 = Xi,G,  otherwise                    (2)

Without loss of generality, this paper only considers minimization problems. If, and only if, the trial vector Ui,G is not worse than Xi,G, then Xi,G is replaced by Ui,G; otherwise, Xi,G remains unchanged.
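The greedy selection of Eq. (2) amounts to a one-line comparison; a sketch in Python for a minimization objective f (names illustrative):

```python
def greedy_select(x, u, f):
    """Eq. (2): keep the trial vector u iff f(u) <= f(x) (minimization);
    otherwise the parent x survives unchanged."""
    return u if f(u) <= f(x) else x

sphere = lambda vec: sum(t * t for t in vec)  # simple stand-in objective
winner = greedy_select([2.0, 2.0], [1.0, 1.0], sphere)  # trial vector wins
```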

III. GAUSSIAN BARE-BONES DE

Recently, Kennedy [18] proposed a bare-bones PSO (BBPSO). This new version of PSO eliminates the velocity term, and the position is updated as follows.

Xi,G+1 = N( (gbest + pbesti,G) / 2 , |gbest − pbesti,G| )    (3)

where Xi,G is the position of the ith particle in the population, and N(μ, σ) represents a Gaussian distribution with mean (gbest + pbesti,G)/2 and standard deviation |gbest − pbesti,G|.
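The BBPSO position update of Eq. (3) can be sketched in Python with NumPy (names illustrative; the standard deviation is taken per dimension):

```python
import numpy as np

def bbpso_update(gbest, pbest_i, rng):
    """Bare-bones PSO update (Eq. 3): sample the new position from a
    Gaussian with mean (gbest + pbest_i)/2 and per-dimension standard
    deviation |gbest - pbest_i|."""
    mean = (gbest + pbest_i) / 2.0
    std = np.abs(gbest - pbest_i)
    return rng.normal(mean, std)

rng = np.random.default_rng(0)
new_pos = bbpso_update(np.array([1.0, -2.0]), np.array([0.5, 0.0]), rng)
```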

From the search behavior of bare-bones PSO, the Gaussian sampling is a fine-tuning procedure which starts during exploration and continues into exploitation. This can be beneficial for the search of many evolutionary optimization algorithms. Based on this idea, a parameter-free DE algorithm, called Gaussian bare-bones DE (GBDE), was proposed [7]. In GBDE, a Gaussian mutation strategy is defined by

Vi,G = N( (Xbest,G + Xi,G) / 2 , |Xbest,G − Xi,G| )    (4)

Like the classical DE, GBDE also employs the same crossover scheme (binomial or exponential). To avoid manually setting the crossover rate CR, a dynamic mechanism is proposed as follows.

CRi,G+1 = CRi,G,        if f(Ui,G) ≤ f(Xi,G)
CRi,G+1 = N(0.5, 0.1),  otherwise                    (5)

where N(0.5, 0.1) is a random value generated by a normal distribution with mean 0.5 and standard deviation 0.1.
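The CR adaptation of Eq. (5) keeps a vector's crossover rate while it keeps producing improvements and resamples it otherwise; a minimal sketch (illustrative names):

```python
import random

def update_cr(cr, improved):
    """Eq. (5): keep CR_i if the trial vector was not worse than its
    parent; otherwise resample CR_i from N(0.5, 0.1)."""
    return cr if improved else random.gauss(0.5, 0.1)

random.seed(7)
kept = update_cr(0.7, improved=True)        # unchanged
resampled = update_cr(0.7, improved=False)  # drawn from N(0.5, 0.1)
```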

The main steps of GBDE are described in Algorithm 1, where Xi,G is the ith vector in the population at generation G, Vi,G is the mutant vector of Xi,G, Ui,G is the trial vector, CRi,G is the crossover rate of the ith vector at generation G, FE is the number of fitness evaluations, and MAX_FEs is the maximum number of fitness evaluations.

Algorithm 1: GBDE
Begin
  While FE < MAX_FEs do
    For each vector Xi,G do
      Generate a mutant vector Vi,G according to Eq. (4);
      Generate a trial vector Ui,G by exponential crossover;
      Calculate the fitness of Ui,G; FE++;
      Select the fitter of Xi,G and Ui,G as the new Xi,G;
      Update CRi,G according to Eq. (5);
    End For
    G = G + 1;
  End While
End
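Putting the pieces together, Algorithm 1 can be sketched end-to-end in Python with NumPy. This is a simplified illustration, not the authors' code: the best vector is refreshed once per generation, the search bounds, population size, and sphere objective are arbitrary choices, and all names are illustrative.

```python
import numpy as np

def gbde(f, D=10, Np=20, max_fes=10000, seed=1):
    """Minimal GBDE sketch: Gaussian bare-bones mutation (Eq. 4),
    exponential crossover, greedy selection (Eq. 2), and
    self-adaptive crossover rate (Eq. 5)."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5.0, 5.0, size=(Np, D))
    fit = np.array([f(x) for x in pop])
    cr = np.full(Np, 0.5)
    fes = Np
    while fes < max_fes:
        best = pop[np.argmin(fit)].copy()  # refreshed once per generation
        for i in range(Np):
            x = pop[i]
            # Eq. (4): sample the mutant around the midpoint of best and x
            v = rng.normal((best + x) / 2.0, np.abs(best - x))
            # exponential crossover with per-vector rate cr[i]
            u = x.copy()
            j, L = int(rng.integers(D)), 0
            while True:
                u[(j + L) % D] = v[(j + L) % D]
                L += 1
                if L >= D or rng.random() >= cr[i]:
                    break
            fu = f(u)
            fes += 1
            if fu <= fit[i]:          # greedy selection; CR kept
                pop[i], fit[i] = u, fu
            else:                     # Eq. (5): resample the crossover rate
                cr[i] = rng.normal(0.5, 0.1)
            if fes >= max_fes:
                break
    k = np.argmin(fit)
    return pop[k], float(fit[k])

sphere = lambda x: float(np.sum(x * x))
best_x, best_f = gbde(sphere)
```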

IV. EXPERIMENTAL STUDY

A. Benchmark Functions

There are nineteen scalable functions used in the experiments. These functions were considered in the ISDA 2009 Workshop on Evolutionary Algorithms and other Metaheuristics for Continuous Optimization Problems - A Scalability Test [19]. The details of these functions can be found in [19].

B. Computational Results

Experiments were conducted on the nineteen functions with D=100, 200, 500, and 1000. Results of GBDE are compared with the classical DE, the Cross-generational elitist selection, Heterogeneous recombination, and Cataclysmic mutation algorithm (CHC) [20], and the restart covariance matrix adaptation evolution strategy (G-CMA-ES) [21].

For DE and GBDE, the population size is set to 60. The parameters F and CR in DE are set to 0.5 and 0.9, respectively. For CHC and G-CMA-ES, we use the same parameter settings as described in [19]. In the experiments, each algorithm is run 25 times on each test function. Each run stops when the maximum number of fitness evaluations (MAX_FEs) is reached. MAX_FEs is set to 5000*D. The mean error values over the 25 runs are reported.

Results of DE, CHC, G-CMA-ES, and GBDE for D=100, 200, 500, and 1000 are listed in Tables I, II, III, and IV, respectively, where "Mean" represents the mean error value over 25 runs. The best results among the compared algorithms are shown in boldface. Fig. 1 presents the convergence curves of GBDE on four problems with D=1000.

For D=100, GBDE outperforms DE on 16 functions. On function F3, both DE and GBDE achieve the same result. For functions F13 and F17, DE performs better than GBDE. GBDE surpasses CHC on all functions. G-CMA-ES finds better solutions than GBDE on F2, F3, and F8, while GBDE outperforms G-CMA-ES on the remaining 16 functions. For D=200, DE outperforms GBDE on F5 and F14, in addition to F13 and F17. As for D=100, GBDE achieves more accurate solutions than CHC. G-CMA-ES still works better on F2, F3, and F8, while on the remaining functions it performs worse than the other algorithms, except for F5.

Figure 1. The convergence curves of GBDE on four problems (F1, F2, F4, and F18) with D=1000.

TABLE I. RESULTS ACHIEVED BY THE FOUR ALGORITHMS WHEN D=100.

F    DE (Mean)  CHC (Mean)  G-CMA-ES (Mean)  GBDE (Mean)
F1   8.10E-17   3.56E-11    5.55E-17         0.00E+00
F2   4.45E+00   8.58E+01    1.51E-10         1.41E+00
F3   8.01E+01   4.19E+06    3.88E+00         8.01E+01
F4   7.96E-02   2.19E+02    2.50E+02         0.00E+00
F5   2.39E-17   3.83E-03    1.58E-03         0.00E+00
F6   3.10E-13   4.10E-07    2.12E+01         2.31E-14
F7   1.10E-15   7.73E-05    2.43E-05         4.77E-18
F8   4.09E+02   1.03E+08    9.86E-21         5.37E+01
F9   2.03E-05   1.86E+01    5.40E+01         3.89E-06
F10  1.36E-28   1.87E-01    1.73E+01         9.35E-34
F11  1.34E-04   2.42E+01    1.72E+02         5.02E-06
F12  5.99E-11   1.03E+01    4.17E+02         1.51E-12
F13  6.17E+01   2.70E+06    4.21E+02         6.27E+01
F14  4.79E-02   1.66E+02    2.55E+02         5.32E-09
F15  9.66E-17   5.18E-02    9.69E-01         2.50E-19
F16  3.58E-09   2.23E+01    8.59E+02         7.23E-11
F17  1.23E+01   1.47E+05    1.51E+03         1.31E+01
F18  1.19E-01   7.00E+01    3.07E+02         2.79E-07
F19  1.12E-21   1.26E-01    1.04E+01         2.99E-25

TABLE II. RESULTS ACHIEVED BY THE FOUR ALGORITHMS WHEN D=200.

F    DE (Mean)  CHC (Mean)  G-CMA-ES (Mean)  GBDE (Mean)
F1   1.78E-16   8.34E-01    1.12E-16         0.00E+00
F2   1.92E+01   1.03E+02    1.16E-09         1.19E+01
F3   1.78E+02   2.01E+07    8.91E+01         1.78E+02
F4   1.27E-01   5.40E+02    6.48E+02         0.00E+00
F5   7.38E-17   8.76E-03    5.27E-17         1.11E-16
F6   6.54E-13   1.23E+00    2.14E+01         4.80E-14
F7   2.45E-15   1.38E-04    6.32E-02         9.39E-18
F8   6.27E+03   2.58E+09    7.68E-20         1.79E+03
F9   4.39E-05   2.07E+02    2.49E+02         9.69E-06
F10  3.16E-28   5.49E+00    4.31E+01         4.58E-33
F11  2.64E-04   2.13E+02    8.01E+02         9.90E-06
F12  9.76E-10   7.44E+01    9.06E+02         2.39E-11
F13  1.36E+02   5.75E+06    9.43E+02         1.37E+02
F14  1.38E-01   4.29E+02    6.09E+02         3.32E-01
F15  5.62E-16   9.42E-01    1.95E+00         1.89E-18
F16  7.46E-09   1.60E+02    1.92E+03         1.43E-10
F17  3.70E+01   1.75E+05    3.36E+03         3.79E+01
F18  5.29E-02   2.12E+02    6.89E+02         5.54E-07
F19  2.84E-20   4.33E+00    2.53E+01         1.95E-23


TABLE III. RESULTS ACHIEVED BY THE FOUR ALGORITHMS WHEN D=500.

F    DE (Mean)  CHC (Mean)  G-CMA-ES (Mean)  GBDE (Mean)
F1   5.17E-16   2.84E-12    2.72E-16         0.00E+00
F2   5.35E+01   1.29E+02    3.48E-04         1.19E+01
F3   4.76E+02   1.14E+06    3.58E+02         1.78E+02
F4   3.20E-01   1.91E+03    2.10E+03         0.00E+00
F5   2.39E-16   6.98E-03    2.96E-04         1.11E-16
F6   1.65E-12   5.16E+00    2.15E+01         4.80E-14
F7   6.20E-15   3.28E-04    5.90E+154        9.39E-18
F8   6.84E+04   6.43E+10    8.54E-10         1.79E+03
F9   1.15E-04   1.40E+03    1.29E+03         9.69E-06
F10  8.45E-28   2.80E+01    1.35E+02         4.58E-33
F11  7.07E-04   1.36E+03    4.12E+03         9.90E-06
F12  7.07E-09   4.48E+02    2.58E+03         2.39E-11
F13  3.59E+02   3.22E+07    2.87E+03         1.37E+02
F14  1.35E-01   1.46E+03    1.95E+03         3.32E-01
F15  2.46E-15   1.16E+01    4.24E+256        1.89E-18
F16  2.04E-08   9.55E+02    5.45E+03         1.43E-10
F17  1.11E+02   8.40E+05    9.59E+03         3.79E+01
F18  3.77E-02   7.32E+02    2.05E+03         5.54E-07
F19  3.15E-19   4.10E+01    8.30E+01         1.95E-23

TABLE IV. RESULTS ACHIEVED BY THE THREE ALGORITHMS WHEN D=1000.

F    DE (Mean)  CHC (Mean)  GBDE (Mean)
F1   1.08E-15   1.36E-11    0.00E+00
F2   8.46E+01   1.44E+02    7.98E+01
F3   9.69E+02   8.75E+03    9.69E+02
F4   1.44E+00   4.76E+03    9.95E-01
F5   5.21E-16   7.02E-03    2.00E-15
F6   3.29E-12   1.38E+01    2.60E-13
F7   1.19E-14   6.57E-04    4.32E-16
F8   2.75E+05   3.85E+11    1.77E+05
F9   2.33E-04   3.89E+03    5.18E-05
F10  1.79E-27   3.77E+01    2.57E-32
F11  1.40E-03   3.99E+03    5.16E-05
F12  1.68E-08   1.05E+03    3.28E-10
F13  7.30E+02   6.66E+07    7.30E+02
F14  6.90E-01   3.62E+03    3.30E-01
F15  5.64E-15   3.70E+01    1.78E-17
F16  4.19E-08   2.32E+03    8.69E-10
F17  2.36E+02   2.04E+07    2.36E+02
F18  1.99E-01   1.72E+03    3.61E-06
F19  8.85E-19   7.36E+01    8.61E-22

With the growth of dimensionality, the performance of DE is seriously affected, while GBDE still achieves promising solutions on most functions. For D=500, GBDE outperforms DE on 18 functions, while DE is slightly better than GBDE on F14. G-CMA-ES only performs better than the other algorithms on F2 and F8. On F3, its performance deteriorates quickly as the dimension increases.

V. CONCLUSION

Gaussian bare-bones DE (GBDE) is a new DE variant which has shown good performance on many low-dimensional optimization problems. In this paper, we presented a scalability test of GBDE on high-dimensional optimization problems. Experiments were conducted on nineteen benchmark functions with D=100, 200, 500, and 1000. Computational results show that GBDE achieves better solutions than the classical DE, CHC, and G-CMA-ES on the majority of test functions.

ACKNOWLEDGMENT

This work is supported by the Foundation of State Key Laboratory of Software Engineering (No. SKLSE2012-09-19), the Humanity and Social Science Foundation of the Ministry of Education of China (No. 13YJCZH174), the Science and Technology Plan Projects of Jiangxi Provincial Education Department (Nos. GJJ13744, GJJ13745, GJJ13761, and GJJ13762), the Shantou Science and Technology Planning Project (No. 150), and the National Natural Science Foundation of China (Nos. 61165004 and 61261039).

REFERENCES

[1] R. Storn and K. Price, “Differential evolution--A simple and efficient heuristic for global optimization over continuous spaces,” Journal of Global Optimization, vol. 11, pp. 341-359, 1997.

[2] J. Brest, S. Greiner, B. Bošković, M. Mernik, and V. Zumer, “Self-adapting control parameters in differential evolution: A comparative study on numerical benchmark problems,” IEEE Transactions on Evolutionary Computation, vol. 10, no. 6, pp. 646-657, 2006.

[3] J. Liu and J. Lampinen, “A fuzzy adaptive differential evolution algorithm,” Soft Computing, vol. 9, no. 6, pp. 448-462, 2005.

[4] A. K. Qin, V. L. Huang, and P. N. Suganthan, “Differential evolution algorithm with strategy adaptation for global numerical optimization,” IEEE Transactions on Evolutionary Computation, vol. 13, no. 2, pp. 398-417, 2009.

[5] J. Zhang and A. C. Sanderson, “JADE: adaptive differential evolution with optional external archive,” IEEE Transactions on Evolutionary Computation, vol. 13, no. 5, pp. 945-958, 2009.

[6] S. Das, A. Konar, U. K. Chakraborty, A. Abraham, “Differential evolution with a neighborhood based mutation operator: a comparative study,” IEEE Transactions on Evolutionary Computation, vol. 13, no. 3, pp. 526-553, 2009.

[7] H. Wang, S. Rahnamayan, H. Sun, M.G.H. Omran, “Gaussian bare-bones differential evolution,” IEEE Transactions on Cybernetics, vol. 43, no. 2, pp. 634-647, 2013.

[8] K. Tang, X. Yao, P. N. Suganthan, C. Macnish, Y. Chen, C. Chen, and Z. Yang, “Benchmark functions for the CEC 2008 special session and competition on high-dimensional real-parameter optimization,” Technical report, Nature Inspired Computation and Applications Laboratory, USTC, China, 2007. http://nical.ustc.edu.cn/cec08ss.php.

[9] Z. Yang, K. Tang, and X. Yao, “Multilevel cooperative coevolution for large scale optimization,” Proceedings of IEEE Congress on Evolutionary Computation, 2008, pp. 1663-1670.

[10] J. Brest, A. Zamuda, B. Bošković, M. S. Maučec, and V. Zumer, “High-dimensional real-parameter optimization using self-adaptive differential evolution algorithm with population size reduction,” Proceedings of IEEE Congress on Evolutionary Computation, 2008, pp. 2032-2039.

[11] J. Brest, M.S. Maučec, “Self-adaptive differential evolution algorithm using population size reduction and three strategies,” Soft Computing, vol. 15, no. 11, pp. 2157-2174, 2011.


[12] S. Rahnamayan, H. R. Tizhoosh, and M. M. A. Salama, “Opposition-based differential evolution,” IEEE Transactions on Evolutionary Computation, vol. 12, no. 1, pp. 64-79, 2008.

[13] S. Rahnamayan, G. Gary Wang, “Solving large scale optimization problems by opposition-based differential evolution (ODE),” World Scientific and Engineering Academy and Society, Transactions on Computers, vol. 7, no. 10, pp. 1792-1804, 2008.

[14] S. Muelas, A. LaTorre and J. Pena, “A memetic differential evolution algorithm for continuous optimization,” Proceedings of International Conference on Intelligent System Design and Applications, 2009, pp. 1080-1084.

[15] H. Wang, Z. J. Wu, S. Rahnamayan, “Enhanced opposition-based differential evolution for solving high-dimensional continuous optimization problems,” Soft Computing, vol. 15, no. 11, pp. 2127-2140, 2011.

[16] S. Z. Zhao, P. N. Suganthan, and S. Das, “Self-adaptive differential evolution with multi-trajectory search for large scale optimization,” Soft Computing, vol. 15, no. 11, pp. 2175-2185, 2011.

[17] H. Wang, Z. Wu, S. Rahnamayan, and D. Jiang, “Sequential differential evolution enhanced by neighborhood search for large scale global optimization,” Proceedings of IEEE Congress on Evolutionary Computation, 2010, pp. 4056-4062.

[18] J. Kennedy, “Bare bones particle swarms,” Proceedings of IEEE Swarm Intelligence Symposium, 2003, pp. 80-87.

[19] F. Herrera and M. Lozano, “Workshop for evolutionary algorithms and other metaheuristics for continuous optimization problems - A scalability test,” Proceedings of International Conference on Intelligent System Design and Applications, Pisa, Italy, 2009.

[20] L. J. Eshelman, “The CHC adaptive search algorithm: how to have safe search when engaging in nontraditional genetic recombination,” Foundations of Genetic Algorithms, vol. 1, pp. 265-283, 1991.

[21] A. Auger and N. Hansen, “A restart CMA evolution strategy with increasing population size,” Proceedings of IEEE Congress on Evolutionary Computation, 2005, pp. 1769-1776.