

Research Article
Hybrid Genetic Grey Wolf Algorithm for Large-Scale Global Optimization

Qinghua Gu, Xuexian Li, and Song Jiang

School of Management, Xi'an University of Architecture and Technology, Shaanxi 710055, China

Correspondence should be addressed to Song Jiang; [email protected]

Received 18 October 2018; Revised 26 December 2018; Accepted 22 January 2019; Published 12 February 2019

Guest Editor: Higinio Mora

Copyright © 2019 Qinghua Gu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Most real-world optimization problems tackle a large number of decision variables; these are known as Large-Scale Global Optimization (LSGO) problems. In general, the metaheuristic algorithms for solving such problems often suffer from the "curse of dimensionality." In order to overcome the disadvantages of the Grey Wolf Optimizer when solving LSGO problems, three genetic operators are embedded into the standard GWO, and a Hybrid Genetic Grey Wolf Algorithm (HGGWA) is proposed. Firstly, the whole population is initialized using an Opposition-Based Learning strategy. Secondly, the selection operation is performed by combining an elite reservation strategy. Then, the whole population is divided into several subpopulations for cross-operation based on dimensionality reduction and population partitioning in order to increase the diversity of the population. Finally, the elite individuals in the population are mutated to prevent the algorithm from falling into local optima. The performance of HGGWA is verified on ten benchmark functions, and the optimization results are compared with those of WOA, SSA, and ALO. On the CEC'2008 LSGO problems, the performance of HGGWA is compared against several state-of-the-art algorithms: CCPSO2, DEwSAcc, MLCC, and EPUS-PSO. Simulation results show that HGGWA achieves greatly improved convergence accuracy, which proves its effectiveness in solving LSGO problems.

1. Introduction

Large-Scale Global Optimization (LSGO) is widely applied in practical engineering problems, such as the large-scale job shop scheduling problem [1], large-scale vehicle routing problems [2], and reactive power optimization of large-scale power systems [3]. It is a very important and challenging task in the optimization domain and usually involves thousands of decision variables. As the number of dimensions increases, the complexity of the problem increases exponentially, so it is very difficult to solve. Metaheuristic algorithms are a class of intelligent optimization algorithms inspired by biological activities or physical principles. In recent years, various metaheuristic algorithms such as genetic algorithms, particle swarm optimization algorithms, artificial bee colony algorithms [4], and differential evolution algorithms [5] have been applied to a variety of large-scale global optimization problems. However, due to the "curse of dimensionality," it is difficult for general metaheuristic algorithms to find the optimal solution of LSGO problems. In order to solve LSGO problems better, many valuable attempts have been made in recent years using metaheuristic algorithms.

In general, solving large-scale problems can be considered in two ways. First, from the perspective of the problem itself, the solution process can be made more efficient by simplifying the problem. Secondly, from the perspective of the method, the optimal solution can be found with greater probability by improving the performance of the solution algorithm. In terms of the problem itself, the large number of decision variables is the main cause of the complexity of the problem [6]. In the famous book A Discourse on Method [7], Descartes pointed out that it is necessary to decompose complex problems into a number of relatively simple small problems and then solve them one by one. He called this a "divide and conquer" strategy: the original large-scale problem is decomposed into a set of smaller and simpler subproblems that are more manageable and easier to solve, and each subproblem is then solved independently. This method based on the "divide and conquer" strategy is



also called Cooperative Coevolution (CC), and its effectiveness for solving large-scale optimization problems has been demonstrated in many classical optimization methods. But a major difficulty in applying CC is the choice of a good decomposition strategy, because different decomposition strategies may lead to different optimization effects. Due to the diversity of problems, there may be very large differences between them. Therefore, it is necessary to study the structure inherent in the problem and analyze the relationships between the variables in order to find a suitable solution.

In addition, from an algorithmic perspective, two or more distinct methods, when combined together in a synergistic manner, can enhance the problem-solving capability of the derived hybrid [8]. EAs hybridized with local search algorithms have been successful within the function optimization domain; these kinds of EAs are often named memetic algorithms (MAs) [9]. The optimization performance of an algorithm on large-scale optimization problems can be improved by different optimization strategies, such as designing new mutation operators, dynamic neighborhood search strategies, multiple classifiers [10], and Opposition-Based Learning [11]. Similar to evolutionary computation, swarm intelligence (SI) is a kind of optimization approach inspired by biological foraging or hunting behavior in nature, which simulates the intelligent behavior of insects, bird flocks, ant colonies, or fish schools. Inspired by the grey wolf population hierarchy and hunting behavior, Mirjalili proposed another swarm intelligence optimization algorithm, the grey wolf optimization algorithm (GWO), in 2014 [12]. The GWO algorithm has the advantages of few control parameters and fast convergence, and it has been applied to various optimization fields such as medical image fusion [13], multiple-input multiple-output power systems [14], and the job shop scheduling problem [15]. In addition, the proposal of different prediction techniques [16] also provides a reference for the study of large-scale optimization problems.

To address large-scale global optimization problems, this paper improves the HGGWA algorithm proposed in [17] by adding a nonlinear parameter adjustment strategy, which further improves the global convergence and solution accuracy. The crossover operation of the HGGWA algorithm divides the whole population into several subpopulations by drawing on the idea of cooperative coevolution and then evolves each subpopulation independently. More specifically, this paper has the following research objectives:

(1) To review the current literature on solving large-scale global optimization problems and to analyze the existing problems in the research.

(2) To improve the Hybrid Genetic Grey Wolf Algorithm (HGGWA) by adding the nonlinear parameter adjustment strategy, which further improves the global convergence and solution accuracy for solving large-scale global optimization problems.

(3) To analyze the global convergence and computational complexity of the improved HGGWA algorithm and to verify the effectiveness of HGGWA for solving large-scale global optimization problems by several different numerical experiments.

The remainder of this paper is organized as follows. In Section 2, a review of Large-Scale Global Optimization (LSGO) and various solving strategies is given. Then, Section 3 briefly describes the Grey Wolf Optimizer (GWO). In Section 4, the principles and steps of the improved algorithm (HGGWA) are introduced. Section 5 illustrates and analyzes the experimental results. Finally, conclusions are drawn in Section 6.

2. Literature Review

2.1. Decomposition-Based CC Strategy. Large-scale global optimization problems are mainly solved in the following two ways. One is Cooperative Coevolution (CC) with a problem decomposition strategy. The idea of decomposition was first proposed by Potter and De Jong [18, 19] in 1994. They designed two Cooperative Coevolutionary Genetic Algorithms (CCGA-1, CCGA-2) to improve the global convergence of GA. The CC strategy reduces the scale of a high-dimensional problem by dividing the whole solution space into multiple subspaces and then uses multiple subpopulations to achieve cooperative coevolution. Table 1 summarizes the major proposed variants of the CC method. They can be divided into static grouping-based CC methods and dynamic grouping-based CC methods.

The static grouping-based CC methods divide the population into multiple fixed-scale subpopulations. The Cooperative Coevolutionary Genetic Algorithm (CCGA) proposed by Potter and De Jong was the first attempt to combine the idea of Cooperative Coevolution with a metaheuristic algorithm. They decomposed an n-dimensional problem into n 1-dimensional subproblems, each of which is optimized by a separate GA. Van den Bergh et al. [20] first proposed two improved PSO variants integrating the cooperative coevolution strategy into PSO, called CPSO-SK and CPSO-HK, which divide an n-dimensional problem into k s-dimensional subproblems (see the sketch below). Similarly, Mohammed El-Abd [21] presented two cooperative coevolutionary ABC algorithms, namely, CABC-S and CABC-H. Shi et al. [22] applied the cooperative coevolution strategy to the differential evolution algorithm, called cooperative coevolutionary differential evolution (CCDE). The algorithm partitions the high-dimensional solution space into two equal-sized subcomponents, but it does not perform well as the dimensionality increases. For a fully separable problem with no interrelationship between variables, each variable can be optimized independently to obtain the optimal solution for the entire problem. However, for nonseparable problems with relationships between variables, the optimization effect of the fixed-scale grouping strategy becomes worse, and the optimal solution may not be obtained at all. Therefore, many scholars began to study decomposition strategies that can handle nonseparable problems.
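As a concrete illustration of this fixed-size decomposition, the following is a minimal Python sketch (illustrative only; the helper name is hypothetical and not from the paper) of splitting an n-dimensional index set into consecutive s-dimensional groups, one per subpopulation:

```python
def fixed_grouping(n_dims, group_size):
    """Partition the dimension indices [0, n_dims) into consecutive
    groups of group_size dimensions, one group per subpopulation."""
    return [list(range(i, min(i + group_size, n_dims)))
            for i in range(0, n_dims, group_size)]

# A 1000-dimensional problem split into 10-dimensional subproblems
# yields k = 100 groups, each optimized by its own subpopulation.
groups = fixed_grouping(1000, 10)
assert len(groups) == 100 and groups[0] == list(range(10))
```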

The dynamic grouping-based CC methods dynamically adjust the size of the subpopulations to accommodate different types of large-scale optimization problems, and they are efficient in handling nonseparable problems.


Table 1: A summary of major variants of the CC method.

Year  Author                      Algorithm          Year  Author                    Algorithm
1994  Potter and De Jong [18]     CCGA-1, CCGA-2     2011  Omidvar et al. [29]       CBCC
1996  R. Storn [30]               CCDE               2012  Li and Yao [31]           CCPSO2
1999  Weicker K. et al. [32]      CCLVI              2012  Sayed et al. [33]         HDIMA
2001  Liu et al. [34]             FEPCC              2012  Sayed et al. [35]         DIMA
2004  Van den Bergh et al. [20]   CPSO-SK, CPSO-HK   2013  Ren and Wu [36]           CCOABC
2005  Shi et al. [22]             CCDE               2013  Liu and Tang [37]         CC-CMA-ES
2008  Yang et al. [23]            DECC-G             2014  Omidvar et al. [28]       DECC-DG
2008  Yang et al. [38]            MLCC               2014  Omidvar et al. [39]       MLSoft
2009  Li et al. [24]              CCPSO              2014  Mahdavi et al. [40]       DM-HDMR
2009  Ray et al. [26]             CCEA-AVP           2015  Kazimipour et al. [41]    CBCC
2010  Omidvar et al. [25]         DECC-ML            2015  Cao et al. [42]           CC-GLS
2010  Chen et al. [27]            CCVIL              2016  Peng et al. [43]          mCCEA
2010  Singh et al. [44]           CCEA-AVP           2017  Peng et al. [45]          MMO-CC
2010  Mohammed El-Abd [21]        CABC-S, CABC-H     2017  Yang et al. [46]          CCFR
2011  Omidvar et al. [47]         DECC-D             2018  Peng et al. [48]          CC-SMP


Yang et al. [23] proposed a new cooperative coevolution framework that is capable of optimizing large-scale nonseparable problems. It uses a random grouping strategy to divide the problem solution space into multiple subspaces of different sizes. Li et al. [24] embedded a random grouping strategy and an adaptive weighting mechanism into the particle swarm optimization algorithm and proposed a cooperative coevolutionary particle swarm optimization (CCPSO) algorithm. The results of the random grouping method show effective performance in solving scalable nonseparable benchmark functions (up to 1000D). However, as the number of interacting variables increases, its performance deteriorates [25]. Therefore, some scholars consider adding prior knowledge of the problem to the evolution process and have proposed many learning-based dynamic grouping strategies. Ray et al. [26] discussed the effects of problem size, the number of subpopulations, and the number of iterations on some separable and nonseparable problems in cooperative coevolutionary algorithms; they then proposed a Cooperative Coevolutionary Algorithm with Correlation-based Adaptive Variable Partitioning (CCEA-AVP) to deal with scalable nonseparable problems. Chen et al. [27] proposed a CC method with Variable Interaction Learning (CCVIL), which is able to adaptively change group sizes. Considering the case where no prior knowledge is available for decomposing the problem, Omidvar et al. [28] proposed an automatic decomposition strategy which keeps the correlation between decision variables of different groups to a minimum.

2.2. Nondecomposition Strategy. The other way is nondecomposition methods. Different from the decomposition-based strategy, nondecomposition methods mainly study the algorithm itself and improve the corresponding operators in order to improve the performance of the algorithm for solving large-scale global optimization problems. Nondecomposition-based methods mainly include swarm intelligence, evolutionary computation, and local search-based approaches. Hsieh et al. [49] added variable particles and an information sharing mechanism to the basic particle swarm optimization algorithm and proposed a variation called the Efficient Population Utilization Strategy for Particle Swarm Optimizer (EPUS-PSO). A modified PSO algorithm with a new velocity updating method, the rotated particle swarm, was proposed in [50]; it transforms coordinates and uses information from other dimensions to maintain the diversity of each dimension. Fan et al. [51] proposed a novel particle swarm optimization approach with dynamic neighborhood based on kernel fuzzy clustering and variable trust region methods (called FT-DNPSO) for large-scale optimization. In terms of evolutionary computation, Hedar et al. [52] modified the genetic algorithm with new strategies of population partitioning and space reduction for high-dimensional problems. In order to improve the performance of the differential evolution algorithm, Zhang et al. [53] proposed an adaptive differential evolution algorithm (called JADE) using a new mutation and adaptive control parameter strategy. Local search-based approaches have been recognized as an effective algorithmic framework for solving optimization problems. Hvattum et al. [54] introduced a new direct search method based on Scatter Search, designed to remedy the lack of a good derivative-free method for solving problems of high dimensions. Liu et al. [55] improved the local search depth parameter in the memetic algorithm and proposed an adaptive local search depth (ALSD) strategy, which can arrange the local search computing resources dynamically according to their performance.

2.3. Related Work of GWO. The Grey Wolf Optimizer (GWO) is a recently proposed swarm intelligence algorithm inspired by the social hierarchy and hunting behavior of wolves. It has the advantages of few control parameters, high solution accuracy, and fast convergence speed. Compared with other classical metaheuristic algorithms such as the genetic algorithm (GA), particle swarm optimization (PSO), and the gravitational search algorithm (GSA), GWO shows powerful exploration capability and better convergence characteristics. Owing to its simplicity and ease of implementation, GWO has gained significant attention and has been applied to many practical optimization problems since its invention.

Recently, many scholars have applied GWO to different optimization problems and have also proposed many variants in order to improve its performance. Saremi et al. [56] introduced a dynamic population updating mechanism into GWO, removing individuals with poor fitness from the population and regenerating new individuals to replace them, in order to enhance the exploration ability of GWO. However, this strategy reduces the diversity of the grey wolf population, which leads the algorithm into local optima. Zhu et al. [57] integrated the idea of differential evolution into GWO, which prevented the early stagnation of the GWO algorithm and accelerated the convergence speed; however, it fails to converge on high-dimensional problems. Kumar et al. [58] designed a novel variant of GWO for numerical optimization and engineering design problems. They utilized the prey weight and an astrophysics-based learning concept to enhance the exploration ability; however, this method also does not show good performance in solving high-dimensional function problems. Regarding applications of the GWO algorithm, Guha et al. [59] applied the grey wolf algorithm to the load control problem in large-scale power systems. In [60], a multiobjective discrete grey wolf algorithm was proposed for the welding job scheduling problem; the results show that the algorithm can solve scheduling problems in practical engineering. Emary et al. [61] proposed a new binary grey wolf optimization algorithm for the selection of the best feature subset.

Swarm intelligence optimization algorithms have natural parallelism, so they have advantages in solving large-scale optimization problems. In order to study the ability of the grey wolf optimization algorithm to solve high-dimensional functions, Long et al. [62] proposed a nonlinear adjustment strategy for the convergence factor, which balances well the exploration and exploitation of the algorithm; but this strategy still does not solve the high-dimensional function optimization problem well. In 2017, Gupta et al. [63] studied the performance of the GWO algorithm in solving 100- to 1000-dimensional function optimization problems. The results indicate that


GWO is a powerful nature-inspired optimization algorithm for large-scale problems, except on the Rosenbrock function.

In summary, the decomposition-based CC strategy has a good effect in solving nonseparable problems, but it cannot handle the imbalance problem well, and it still needs to be improved in terms of solution accuracy. In addition, among the methods based on nondecomposition strategies, although the performance of many algorithms has been greatly improved, research on hybrid algorithms is still rare. Therefore, this paper combines the three genetic operators of selection, crossover, and mutation with the GWO algorithm and proposes a Hybrid Genetic Grey Wolf Algorithm (HGGWA). The main improvement strategies of the algorithm are as follows: (1) initialize the population based on the Opposition-Based Learning strategy; (2) adjust the parameters nonlinearly to effectively balance the exploration and exploitation capabilities while accelerating the convergence speed of the algorithm; (3) select the population through a combination of an elite retention strategy and roulette-wheel selection; (4) based on dimensionality reduction and population partitioning, divide the whole population into multiple subpopulations for cross-operation to increase the diversity of the population; (5) perform mutation operations on elite individuals in the population to prevent the algorithm from falling into local optima.

3. Grey Wolf Optimization Algorithm

3.1. Mathematical Models. In the GWO algorithm [12], the positions of the α wolf, β wolf, and δ wolf are the fittest solution, the second-best solution, and the third-best solution, respectively, and the positions of the ω wolves are the remaining candidate solutions. In the whole process of searching for prey in the space, the ω wolves gradually update their positions according to the optimal positions of the α, β, and δ wolves, so they can gradually approach and capture the prey, that is, complete the process of searching for the optimal solution. The position updating scheme in GWO is shown in Figure 1.

The grey wolf position update formulas in this process are as follows:

$$\vec{D} = \left| \vec{C} \cdot \vec{X}_p(t) - \vec{X}(t) \right| \quad (1)$$

$$\vec{X}(t+1) = \vec{X}_p(t) - \vec{A} \cdot \vec{D} \quad (2)$$

where $\vec{D}$ indicates the distance between the grey wolf individual and the prey, $t$ indicates the current iteration, and $\vec{A}$ and $\vec{C}$ are coefficient vectors: $\vec{A}$ is a convergence coefficient used to balance exploration and exploitation, and $\vec{C}$ is used to simulate the effects of nature. $\vec{X}_p$ is the position vector of the prey, and $\vec{X}$ indicates the position vector of a grey wolf.

The coefficient vectors $\vec{A}$ and $\vec{C}$ are calculated as follows:

$$\vec{A} = 2\vec{a} \cdot \vec{r}_1 - \vec{a} \quad (3)$$

$$\vec{C} = 2 \cdot \vec{r}_2 \quad (4)$$

where the components of $\vec{a}$ are linearly decreased from 2 to 0 over the course of iterations, and $\vec{r}_1$ and $\vec{r}_2$ are random vectors in $[0, 1]$. At the same time, the fluctuation range of the coefficient vector $\vec{A}$ gradually decreases as $\vec{a}$ decreases.

Thus, according to (1) and (2), the grey wolf individual randomly moves within the search space to gradually approach the optimal solution. The specific position update formulas are as follows:

$$\vec{D}_\alpha = \left| \vec{C}_1 \cdot \vec{X}_\alpha - \vec{X} \right| \quad (5)$$

$$\vec{D}_\beta = \left| \vec{C}_2 \cdot \vec{X}_\beta - \vec{X} \right| \quad (6)$$

$$\vec{D}_\delta = \left| \vec{C}_3 \cdot \vec{X}_\delta - \vec{X} \right| \quad (7)$$

$$\vec{X}_1 = \vec{X}_\alpha - \vec{A}_1 \cdot \vec{D}_\alpha \quad (8)$$

$$\vec{X}_2 = \vec{X}_\beta - \vec{A}_2 \cdot \vec{D}_\beta \quad (9)$$

$$\vec{X}_3 = \vec{X}_\delta - \vec{A}_3 \cdot \vec{D}_\delta \quad (10)$$

$$\vec{X}(t+1) = \frac{\vec{X}_1 + \vec{X}_2 + \vec{X}_3}{3} \quad (11)$$

where $\vec{D}_\alpha$, $\vec{D}_\beta$, and $\vec{D}_\delta$ indicate the distances between the α, β, and δ wolves and the prey, respectively; $\vec{X}_1$, $\vec{X}_2$, and $\vec{X}_3$ represent the positional components of the remaining grey wolves based on the positions of the α, β, and δ wolves, respectively; and $\vec{X}(t+1)$ represents the position vector after the grey wolf is updated.

3.2. Basic Process of GWO

Step 1. Randomly generate an initial population and initialize the parameters $\vec{a}$, $\vec{A}$, and $\vec{C}$.

Step 2. Calculate the fitness of each grey wolf individual and save the positions of the first three individuals with the highest fitness as $\vec{X}_\alpha$, $\vec{X}_\beta$, and $\vec{X}_\delta$.

Step 3. Update the position of each grey wolf according to (5)-(11), thereby obtaining the next generation population, and then update the values of the parameters $\vec{a}$, $\vec{A}$, and $\vec{C}$.

Step 4. Calculate the fitness of the individuals in the new population and update $\vec{X}_\alpha$, $\vec{X}_\beta$, and $\vec{X}_\delta$.

Step 5. Repeat Steps 2-4 until the optimal solution is obtained or the maximum number of iterations is reached.
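For concreteness, the following is a minimal Python sketch of this basic loop (an illustration built from equations (3)-(11) above, not the authors' Matlab implementation), assuming a minimization objective f and scalar box bounds lb and ub:

```python
import numpy as np

def gwo(f, lb, ub, dim, n_agents=50, max_iter=1000):
    """Basic GWO loop following equations (3)-(11) and Steps 1-5."""
    X = lb + (ub - lb) * np.random.rand(n_agents, dim)       # Step 1
    for t in range(max_iter):
        fitness = np.array([f(x) for x in X])                # Step 2
        alpha, beta, delta = X[np.argsort(fitness)[:3]]      # three best agents
        a = 2.0 * (1.0 - t / max_iter)                       # a decreases linearly 2 -> 0
        for i in range(n_agents):                            # Step 3
            candidates = []
            for leader in (alpha, beta, delta):
                A = 2.0 * a * np.random.rand(dim) - a        # equation (3)
                C = 2.0 * np.random.rand(dim)                # equation (4)
                D = np.abs(C * leader - X[i])                # equations (5)-(7)
                candidates.append(leader - A * D)            # equations (8)-(10)
            X[i] = np.clip(np.mean(candidates, axis=0), lb, ub)  # equation (11)
    return X[np.argmin([f(x) for x in X])]                   # Steps 4-5

# Example: a 30-dimensional Sphere function
best = gwo(lambda x: np.sum(x ** 2), -100.0, 100.0, dim=30, n_agents=20, max_iter=200)
```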

4. Hybrid Genetic Grey Wolf Algorithm

This section describes the details of HGGWA, the hybrid algorithm proposed in this paper.


Figure 1: Position updating in GWO.

Firstly, the initial population strategy based on Opposition-Based Learning is described in detail in Section 4.1. Secondly, Section 4.2 introduces the nonlinear adjustment strategy for the parameter $a$. Then, the three operations of selection, crossover, and mutation are described in Sections 4.3-4.5, respectively. Finally, the time complexity of HGGWA is analyzed in Section 4.6.

4.1. Initial Population Strategy Based on Opposition-Based Learning. For swarm intelligence optimization algorithms based on population iteration, the diversity of the initial population lays the foundation for the efficiency of the algorithm. Better population diversity reduces the computation time and improves the global convergence of the algorithm [64]. However, like other algorithms, the GWO algorithm uses random initialization when generating populations, which has a certain impact on the search efficiency of the algorithm. Opposition-Based Learning is a strategy proposed by Tizhoosh [65] to improve the efficiency of algorithmic search. It uses the opposite points of known individual positions to generate new individual positions, thereby increasing the diversity of the search population. Currently, the Opposition-Based Learning strategy has been successfully applied to multiple swarm optimization algorithms [66, 67]. The opposite point is defined as follows.

Assume that there is an individual $X(x_1, x_2, \ldots, x_D)$ in the population P. Then the opposite point of the element $x_i$ $(i = 1, 2, \ldots, D)$ on each dimension is $x_i' = l_i + u_i - x_i$, where $l_i$ and $u_i$ are the lower bound and the upper bound of the $i$th dimension, respectively. According to this definition, the Opposition-Based Learning strategy initializes the population as follows.

(a) Randomly initialize the population P, calculate the opposite point $x_i'$ of each individual position $x_i$, and let all the opposite points constitute the opposition population P′.

(b) Calculate the fitness of each individual in the initial population P and the opposition population P′ and arrange them in descending order.

(c) Select the top N grey wolf individuals with the highest fitness as the final initial population.
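A minimal Python sketch of steps (a)-(c) (illustrative only; for a minimization objective, the "highest fitness" individuals are those with the lowest objective value):

```python
import numpy as np

def obl_initialize(f, lb, ub, n_agents, dim):
    """Opposition-Based Learning initialization, steps (a)-(c)."""
    P = lb + (ub - lb) * np.random.rand(n_agents, dim)  # (a) random population
    P_opp = lb + ub - P                                 # opposite points x' = l + u - x
    union = np.vstack([P, P_opp])                       # (b) evaluate P and P' together
    objective = np.array([f(x) for x in union])
    return union[np.argsort(objective)[:n_agents]]      # (c) keep the N fittest
```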

4.2. Parameter Nonlinear Adjustment Strategy. How to balance exploration and exploitation is significantly important for swarm intelligence algorithms. In the early iterations of the algorithm, a powerful exploration capability is beneficial for expanding the search range, so that the algorithm can find the optimal solution with greater probability. In the later stage of the algorithm's iteration, an effective exploitation ability speeds up the optimization process and improves the convergence accuracy of the final solution. Therefore, only when a swarm optimization algorithm can properly coordinate its global exploration and local exploitation capabilities can it have strong robustness and faster convergence speed.

It can be seen from (2) and (3) that the dynamic change of the parameter $a$ plays a crucial role in the algorithm. In the basic GWO, the linear decrement strategy for the parameter $a$ does not reflect the actual convergence process of the algorithm. Therefore, the algorithm does not balance the exploration and exploitation capabilities well, especially when solving large-scale multimodal function problems. In order to dynamically adjust the global exploration and local exploitation processes of the algorithm, this paper proposes a nonlinear adjustment strategy for the parameter $a$, which is beneficial for HGGWA in searching for the optimal solution. The improved nonlinear parameter adjustment equation is as follows:

$$a = a_{max} + (a_{min} - a_{max}) \cdot \left( \frac{1}{1 + e^{t / Max_{iter}}} \right)^{k} \quad (12)$$

where $a_{max}$ and $a_{min}$ are the initial value and the end value of the parameter $a$, $t$ is the current iteration index, $Max_{iter}$ is the maximum number of iterations, and $k$ is the nonlinear adjustment coefficient.

Thus, the variation range of the convergence coefficient A is controlled by the nonlinear adjustment of the parameter $a$. When $|A| > 1$, the grey wolf population expands the search range to find better prey, which corresponds to the global exploration of the algorithm; when $|A| < 1$,


the whole population shrinks the search range, forming an encirclement around the prey to complete the final attack, which corresponds to the local exploitation process of the algorithm. The whole process has a positive effect on balancing global search and local search, which is beneficial for improving the accuracy of the solution and further accelerating the convergence speed of the algorithm.
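A small sketch of the schedule in equation (12) as reconstructed above (illustrative; it assumes $a_{max} = 2$ and $a_{min} = 0$ as in the basic GWO, with the coefficient $k = 0.5$ from Table 3):

```python
import math

def nonlinear_a(t, max_iter, a_max=2.0, a_min=0.0, k=0.5):
    """Nonlinear adjustment of the parameter a, mirroring equation (12)."""
    return a_max + (a_min - a_max) * (1.0 / (1.0 + math.exp(t / max_iter))) ** k
```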

4.3. Selection Operation Based on Optimal Retention. For intelligent optimization algorithms based on population evolution, the evolution of each generation directly affects the optimization effect of the algorithm. In order to pass the superior individuals of the parent population to the next generation without being destroyed, it is necessary to preserve good individuals directly. The optimal retention selection strategy is an effective way to preserve good individuals in genetic algorithms; this paper integrates it into GWO to improve the efficiency of the algorithm. Assume that the current population is P$(X_1, X_2, \ldots, X_D)$ and the fitness of individual $X_i$ is $F_i$. The specific operation is as follows.

(a) Calculate the fitness $F_i$ $(i = 1, 2, \ldots, D)$ of each individual and arrange them in descending order.

(b) Select the individual with the highest fitness and copy it directly into the next generation population.

(c) Calculate the total fitness S of the remaining individuals and the probability $p_i$ that each individual is selected:

$$S = \sum_{i=1}^{D-1} F_i \quad (i = 1, 2, \ldots, D-1) \quad (13)$$

$$p_i = \frac{F_i}{\sum_{i=1}^{D-1} F_i} \quad (i = 1, 2, \ldots, D-1) \quad (14)$$

(d) Calculate the cumulative fitness value $s_i$ for each individual, and then perform the selection operation in the manner of roulette-wheel selection until the number of individuals in the child population is consistent with the parent population:

$$s_i = \frac{\sum_{j=1}^{i} F_j}{S} \quad (i = 1, 2, \ldots, D-1) \quad (15)$$
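The selection operation can be sketched as follows (illustrative Python; it assumes the fitness values have already been transformed to be positive, with larger meaning better, so that equations (14) and (15) define a valid roulette wheel):

```python
import numpy as np

def select(population, fitness, rng=np.random.default_rng()):
    """Optimal-retention selection, steps (a)-(d)."""
    order = np.argsort(fitness)[::-1]                     # (a) sort by fitness, descending
    elite = population[order[0]]                          # (b) best individual kept directly
    rest, rest_fit = population[order[1:]], fitness[order[1:]]
    p = rest_fit / rest_fit.sum()                         # (c) probabilities, equation (14)
    idx = rng.choice(len(rest), size=len(rest), p=p)      # (d) roulette wheel, equation (15)
    return np.vstack([elite[None, :], rest[idx]])
```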

4.4. Crossover Operation Based on a Population Partitioning Mechanism. Due to the decrease of population diversity in late evolution, the GWO algorithm easily falls into local optima when solving large-scale high-dimensional optimization problems. The crossover operation is introduced in order to overcome the problems caused by large scale and complexity and to ensure that the algorithm searches the whole solution space.

In the HGGWA algorithm, the whole population P is divided into $\nu \times \eta$ subpopulations $P_{ij}$ $(i = 1, \ldots, \nu;\ j = 1, \ldots, \eta)$. The optimal size of each subpopulation $P_{ij}$ was experimentally determined to be $5 \times 5$. The individuals of each subpopulation are cross-operated to increase the diversity of the population. The specific division method is shown below.

The initial population P:

$$P = \begin{bmatrix} x_{11} & x_{12} & \cdots & x_{1d} \\ x_{21} & x_{22} & \cdots & x_{2d} \\ \vdots & \vdots & \ddots & \vdots \\ x_{P1} & x_{P2} & \cdots & x_{Pd} \end{bmatrix}$$

The subpopulations $P_{ij}$:

$$\underbrace{\begin{bmatrix} x_{11} & \cdots & x_{15} \\ \vdots & \ddots & \vdots \\ x_{51} & \cdots & x_{55} \end{bmatrix}}_{P_{11}} \cdots \underbrace{\begin{bmatrix} x_{1(d-4)} & \cdots & x_{1d} \\ \vdots & \ddots & \vdots \\ x_{5(d-4)} & \cdots & x_{5d} \end{bmatrix}}_{P_{1\eta}}$$

$$\vdots$$

$$\underbrace{\begin{bmatrix} x_{(P-4)1} & \cdots & x_{(P-4)5} \\ \vdots & \ddots & \vdots \\ x_{P1} & \cdots & x_{P5} \end{bmatrix}}_{P_{\nu 1}} \cdots \underbrace{\begin{bmatrix} x_{(P-4)(d-4)} & \cdots & x_{(P-4)d} \\ \vdots & \ddots & \vdots \\ x_{P(d-4)} & \cdots & x_{Pd} \end{bmatrix}}_{P_{\nu\eta}} \quad (16)$$
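A compact sketch of this partitioning (illustrative; it assumes the population size and the dimension are multiples of 5, matching the 5 × 5 subpopulation size above):

```python
import numpy as np

def partition(P, block=5):
    """Split a (pop_size x dim) population matrix into the block x block
    subpopulations P_ij of equation (16)."""
    pop_size, dim = P.shape
    return [P[i:i + block, j:j + block]
            for i in range(0, pop_size, block)   # nu = pop_size / 5 row bands
            for j in range(0, dim, block)]       # eta = dim / 5 column bands

subs = partition(np.random.rand(50, 100))  # nu * eta = 10 * 20 = 200 subpopulations
```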

In genetic algorithms, the crossover operation plays a very important role and is the main way to generate new individuals. In the improved algorithm of this paper, each individual in a subpopulation is cross-operated in a linear crossover manner. A random number $r_i \in (0, 1)$ is generated for each individual $x_i$ in the subpopulation; when the random number $r_i$ is less than the crossover probability $P_c$, the corresponding individual $x_i$ is paired for cross-operation.

An example is as follows: Crossover(p1, p2).

(a) Generate a random number $\lambda \in (0, 1)$.

(b) Two children $c_1(c_{11}, \ldots, c_{1D})$ and $c_2(c_{21}, \ldots, c_{2D})$ are generated from the two parents $p_1(p_{11}, \ldots, p_{1D})$ and $p_2(p_{21}, \ldots, p_{2D})$:

$$c_{1i} = \lambda p_{1i} + (1 - \lambda) p_{2i}, \quad i = 1, 2, \ldots, D \quad (17)$$

$$c_{2i} = \lambda p_{2i} + (1 - \lambda) p_{1i}, \quad i = 1, 2, \ldots, D \quad (18)$$
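In code, the linear crossover of equations (17) and (18) is straightforward (illustrative sketch):

```python
import numpy as np

def crossover(p1, p2, rng=np.random.default_rng()):
    """Linear crossover of two parent vectors, equations (17) and (18)."""
    lam = rng.random()                  # lambda drawn from (0, 1)
    c1 = lam * p1 + (1.0 - lam) * p2    # equation (17)
    c2 = lam * p2 + (1.0 - lam) * p1    # equation (18)
    return c1, c2
```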

4.5. Mutation Operation for Elite Individuals. Due to the selection operation based on the optimal preservation strategy, the grey wolf individuals in the whole population concentrate in a small optimal region in the later stage of the iterative process, which easily leads to the loss of population diversity. If the current optimal individual is a locally optimal solution, then the algorithm is prone to falling into a local optimum, especially when solving high-dimensional multimodal functions. To this end, this paper introduces a mutation operator into the HGGWA algorithm to perform mutation operations on the elite individuals in the population. The specific operations are as follows.

Assume that the optimal individual is $x_i(x_1, x_2, \ldots, x_d)$ and the mutation operation is performed on $x_i$ with mutation probability $P_m$. That is, a gene $x_k$ is selected from the optimal individual with probability $P_m$ and replaced with a random number between the upper and lower bounds.


(1)  Initialize parameters a, A, C, population size N, P_c, and P_m
(2)  Initialize population P using the OBL strategy
(3)  t = 0
(4)  while t < Max_iter
(5)    Calculate the fitness of all search agents
(6)    X_α = the best search agent
(7)    X_β = the second best search agent
(8)    X_δ = the third best search agent
(9)    for i = 1 : N
(10)     Update the position of all search agents by equation (11)
(11)   end for
(12)   newP1 ← P except the X_α
(13)   for i = 1 : N-1
(14)     Generate newP2 by the Roulette Wheel Selection on newP1
(15)   end for
(16)   P ← newP2 and X_α
(17)   Generate subPs by the population partitioning mechanism on P
(18)   Select the individuals by crossover probability P_c
(19)   for i = 1 : N*P_c
(20)     Crossover each search agent by equations (17) and (18)
(21)   end for
(22)   Generate X'_α, X'_β, and X'_δ by equation (19) with mutation probability P_m
(23)   t = t + 1
(24) end while
(25) Return X_α

Algorithm 1: Pseudocode of the HGGWA.

This generates a new individual $x_i' = (x_1', x_2', \ldots, x_d')$. The specific operation is as follows:

$$x_i' = \begin{cases} l + \lambda \cdot (u - l), & i = k \\ x_i, & i \neq k \end{cases} \quad (19)$$

where $\lambda$ is a random number in $[0, 1]$, and $l$ and $u$ are the lower and upper bounds of the individual $x_i$, respectively.
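A minimal sketch of equation (19) (illustrative Python, using the mutation probability $P_m$ from Table 3):

```python
import numpy as np

def mutate_elite(x, lower, upper, p_m=0.01, rng=np.random.default_rng()):
    """Mutate one gene of an elite individual with probability p_m, equation (19)."""
    x_new = x.copy()
    if rng.random() < p_m:
        k = rng.integers(len(x))                           # gene selected for mutation
        x_new[k] = lower + rng.random() * (upper - lower)  # random value in [l, u]
    return x_new
```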

4.6. Pseudocode of the HGGWA and Time Complexity Analysis. This section describes how to calculate an upper bound for the total number of fitness evaluations (FE) required by HGGWA. As shown in Algorithm 1, the computational complexity of HGGWA in one generation is mainly dominated by the position updating of all search agents in line (10). The position updating requires a time complexity of $O(N \cdot dim \cdot Max_{iter})$ (N is the population size, dim is the dimension of each benchmark function, and $Max_{iter}$ is the maximum number of iterations of HGGWA) to obtain the needed parameters and to finish the whole position updating process of all search agents. In addition, the crossover operation in line (20) is another time-consuming step. It needs a time complexity of $O(2 \cdot \nu \cdot \eta \cdot Max_{iter})$ ($\nu \cdot \eta$ is the total number of subpopulations; $\nu = N/5$, $\eta = dim/5$) to complete the crossing of individuals in each subpopulation according to the crossover probability $P_c$. It should be noted that this is only the worst-case computational complexity; if there are only two individuals in each subpopulation that need to be crossed, then the minimum amount of computation is $O(\nu \cdot \eta \cdot Max_{iter})$.

Therefore, the maximum computational complexity caused by the two dominant processes in the algorithm is $\max\{O(N \cdot dim \cdot Max_{iter}),\ O(2 \cdot \nu \cdot \eta \cdot Max_{iter})\}$ with $\nu = N/5$ and $\eta = dim/5$ in one generation; since $2\nu\eta = 2 N \cdot dim / 25 < N \cdot dim$, this is $O(N \cdot dim \cdot Max_{iter})$.

5. Numerical Experiments and Analysis

In this section, the proposed HGGWA algorithm is evaluated on both classical benchmark functions [68] and the suite of test functions provided by the CEC'2008 Special Session on large-scale global optimization. The algorithms used for comparison include not only conventional EAs but also other CC optimization algorithms. Experimental results are provided to analyze the performance of HGGWA in the context of large-scale optimization problems. In addition, the sensitivity analysis of the nonlinear adjustment coefficient and the global convergence of HGGWA are discussed in Sections 5.4 and 5.5, respectively.

5.1. Benchmark Functions and Performance Measures. In order to verify the ability of the HGGWA algorithm to solve high-dimensional complex functions, 10 general high-dimensional benchmark functions (100D, 500D, 1000D) were selected for optimization tests. Among them, $f_1 \sim f_6$ are unimodal functions, which are used to test the local search ability of the algorithm; $f_7 \sim f_{10}$ are multimodal functions, which are used to test the global search ability of the algorithm. These test functions have multiple unevenly distributed local optima, nonconvexity, and strong oscillation, which makes it very difficult to converge to the global optimal solution, especially in the high-dimensional case. The specific characteristics of the 10 benchmark functions are shown in Table 2.


Table 2: The 10 high-dimensional benchmark functions.

Sphere: $f_1(x) = \sum_{i=1}^{d} x_i^2$; range $[-100, 100]$; $f_{min} = 0$; accuracy $1 \times 10^{-8}$

Schwefel's 2.22: $f_2(x) = \sum_{i=1}^{d} |x_i| + \prod_{i=1}^{d} |x_i|$; range $[-10, 10]$; $f_{min} = 0$; accuracy $1 \times 10^{-8}$

Schwefel's 2.21: $f_3(x) = \max_i \{|x_i|,\ 1 \le i \le d\}$; range $[-100, 100]$; $f_{min} = 0$; accuracy $1 \times 10^{-2}$

Rosenbrock: $f_4(x) = \sum_{i=1}^{d-1} [100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2]$; range $[-30, 30]$; $f_{min} = 0$; accuracy $1 \times 10^{0}$

Schwefel's 1.2: $f_5(x) = \sum_{i=1}^{d} (\sum_{j=1}^{i} x_j)^2$; range $[-100, 100]$; $f_{min} = 0$; accuracy $1 \times 10^{-3}$

Quartic: $f_6(x) = \sum_{i=1}^{d} i x_i^4 + random[0, 1)$; range $[-1.28, 1.28]$; $f_{min} = 0$; accuracy $1 \times 10^{-4}$

Rastrigin: $f_7(x) = \sum_{i=1}^{d} [x_i^2 - 10 \cos(2\pi x_i) + 10]$; range $[-5.12, 5.12]$; $f_{min} = 0$; accuracy $1 \times 10^{-8}$

Ackley: $f_8 = -20 \exp(-0.2 \sqrt{\frac{1}{d} \sum_{i=1}^{d} x_i^2}) - \exp(\frac{1}{d} \sum_{i=1}^{d} \cos(2\pi x_i)) + 20 + e$; range $[-32, 32]$; $f_{min} = 0$; accuracy $1 \times 10^{-5}$

Griewank: $f_9 = \frac{1}{4000} \sum_{i=1}^{d} x_i^2 - \prod_{i=1}^{d} \cos(\frac{x_i}{\sqrt{i}}) + 1$; range $[-600, 600]$; $f_{min} = 0$; accuracy $1 \times 10^{-5}$

Penalized 1: $f_{10} = \frac{\pi}{d} \left\{ 10 \sin^2(\pi y_1) + \sum_{i=1}^{d-1} (y_i - 1)^2 [1 + \sin^2(\pi y_{i+1})] + (y_d - 1)^2 \right\} + \sum_{i=1}^{d} u(x_i, 10, 100, 4)$, with $y_i = 1 + \frac{x_i + 1}{4}$ and $u(x_i, a, k, m) = \begin{cases} k (x_i - a)^m, & x_i > a \\ 0, & -a \le x_i \le a \\ k (-x_i - a)^m, & x_i < -a \end{cases}$; range $[-50, 50]$; $f_{min} = 0$; accuracy $1 \times 10^{-2}$

In order to show the optimization effect of the algorithm more clearly, this paper selects two indicators to evaluate the performance of the algorithm [69]. The first is the solution accuracy (AC), which reflects the difference between the optimal result of the algorithm and the theoretical optimal value. In an experiment, if the final convergence result of the algorithm is less than the AC threshold, then the optimization is considered successful. Assuming that the optimal value of a certain run is $X_{opt}$ and the theoretical optimal value is $X_{min}$, the AC is calculated as follows:

$$AC = \left| X_{opt} - X_{min} \right| \quad (20)$$

The second is the successful ratio (SR), which reflects the proportion of successful optimizations out of the total number of runs, given a certain required accuracy. Assume that in $T$ runs of the algorithm, the number of successful optimizations is $t$; then the SR is calculated as follows:

$$SR = \frac{t}{T} \times 100\% \quad (21)$$
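These two measures are simple to compute; a small illustrative sketch:

```python
def accuracy(x_opt, x_min):
    """Solution accuracy AC, equation (20)."""
    return abs(x_opt - x_min)

def success_ratio(run_results, x_min, threshold, total_runs):
    """Successful ratio SR, equation (21): the percentage of runs whose
    final accuracy falls below the required threshold."""
    t = sum(1 for x_opt in run_results if accuracy(x_opt, x_min) < threshold)
    return 100.0 * t / total_runs
```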

5.2. Results on Classical Benchmark Functions. In this section, a set of simulation tests was performed to verify the effectiveness of HGGWA. Firstly, the 10 high-dimensional benchmark functions in Table 2 are optimized by the HGGWA algorithm proposed in this paper, and the results are compared with those of WOA, SSA, and ALO. In addition, the running times of the several algorithms are compared under a given convergence accuracy.

5.2.1. Parameter Settings for the Compared Algorithms. In all experiments, the values of the common parameters, such as the maximum number of iterations ($Max_{iter}$), the dimension of the functions (D), and the population size (N), were chosen to be the same. For all test problems, we focus on investigating the optimization performance of the proposed method on problems with D = 100, 500, and 1000. The maximum number of iterations is set to 1000, and the population size is set to 50. The parameter settings of WOA, SSA, and ALO are derived from the original literature [70-72]. The optimal parameter settings of the HGGWA algorithm proposed in this paper are shown in Table 3.

For each experiment of an algorithm on a benchmark function, 30 independent runs are performed to obtain a fair comparison among the different algorithms.


Table 3: Parameter settings of the HGGWA.

Parameter    Definition                          Value
N            Population size                     50
P_c          Crossover probability               0.8
P_m          Mutation probability                0.01
Max_iter     Maximum number of iterations        1000
k            Nonlinear adjustment coefficient    0.5

The optimal value, the worst value, the average value, and the standard deviation of each algorithm are recorded, and then the optimization successful ratio (SR) is calculated. All programs were coded in Matlab 2017b (Win64) and executed on a Lenovo computer with an Intel(R) Core i5-6300HQ, 8 GB RAM, 2.30 GHz, under the Windows 10 operating system.

5.2.2. Comparison with State-of-the-Art Metaheuristic Algorithms. In order to verify the efficiency of the HGGWA algorithm, several recently proposed state-of-the-art metaheuristic algorithms are compared in terms of their optimization results. These algorithms are the Whale Optimization Algorithm (WOA), the Salp Swarm Algorithm (SSA), and the Ant Lion Optimization Algorithm (ALO). The tests were performed using the same 10 high-dimensional functions in Table 2, and the results of the comparison are shown in Table 4.

From Table 4, it can be seen that the convergence accuracy and optimization successful ratio of the HGGWA algorithm are significantly higher than those of the other three algorithms for most of the test functions. In the 100-dimensional case, the HGGWA algorithm converges to the global optimal solution for all test functions except $f_3$, and it has a successful ratio of 100% for these nine functions under the given convergence accuracy. The convergence result of the WOA algorithm is better than that of the other three algorithms for functions $f_1$ and $f_2$, and its robustness is excellent. With the increase of the function dimension, the convergence precision of the several algorithms decreases slightly, but the optimization results of the HGGWA algorithm remain better than those of WOA, SSA, and ALO. In general, the HGGWA algorithm exhibits better optimization effects than the WOA, SSA, and ALO algorithms on the 100-, 500-, and 1000-dimensional test functions, which proves the effectiveness of the HGGWA algorithm for solving high-dimensional complex functions.

Generally speaking, the HGGWA algorithm performs better than the other algorithms on most test functions. In order to compare the convergence performance of the four algorithms more clearly, the convergence curves of the four algorithms on the Sphere, Schwefel's 2.22, Rastrigin, Griewank, Quartic, and Ackley functions with D = 100 are shown in Figure 2. Sphere and Schwefel's 2.22 represent the two most basic unimodal functions; Rastrigin and Griewank represent the two most basic multimodal functions. It can be seen that HGGWA has higher precision and faster convergence speed than the other algorithms.

5.2.3. Comparative Analysis of Running Time. To evaluate the actual runtime of the compared algorithms, including SSA, WOA, GWO, and HGGWA, their average running times (in seconds, s) over 30 runs on functions $f_1$, $f_2$, $f_6$, $f_7$, $f_8$, $f_9$, and $f_{10}$ are plotted in Figure 3. The convergence accuracy of each test function is as described in Table 2.

Figure 3 shows the convergence behavior of the different algorithms. Each point in the plot was calculated by taking the average of 30 independent runs. As can be seen from Figure 3, the algorithms behave differently under the same calculation accuracy. For functions $f_1$, $f_2$, $f_7$, $f_8$, and $f_9$, the GWO algorithm has the fastest convergence rate, as a result of the advantages of the GWO algorithm itself. However, for the test functions $f_6$ and $f_{10}$, the running time of the HGGWA algorithm is the shortest. Further analysis shows that the HGGWA algorithm adds a little computing time compared to the GWO algorithm because of its several improved strategies, such as selection, crossover, and mutation; however, the convergence speed of the HGGWA algorithm is still better than that of the SSA and WOA algorithms. Overall, the convergence speed of the HGGWA algorithm performs well among the compared algorithms. Moreover, the running time across the different test functions varies very little, staying between 1 s and 3 s, which shows that the HGGWA algorithm has excellent stability.

5.3. Results on CEC'2008 Benchmark Functions. This section provides an analysis of the effectiveness of HGGWA in terms of the CEC'2008 large-scale benchmark functions and a comparison with four powerful algorithms which have been proven to be effective in solving large-scale optimization problems. Experimental results are provided to analyze the performance of HGGWA on large-scale optimization problems.

We tested our proposed algorithm on the CEC'2008 functions for 100, 500, and 1000 dimensions, and the means of the best fitness values over 50 runs were recorded. While F1 (Shifted Sphere), F4 (Shifted Rastrigin), and F6 (Shifted Ackley) are separable functions, F2 (Schwefel Problem), F3 (Shifted Rosenbrock), F5 (Shifted Griewank), and F7 (Fast Fractal) are nonseparable, presenting a greater challenge to any algorithm that is prone to variable interactions. The performance of HGGWA is compared with the other algorithms CCPSO2 [31], DEwSAcc [5], MLCC [38], and EPUS-PSO [49] for 100, 500, and 1000 dimensions, respectively. The maximum number of fitness evaluations (FEs) was calculated by the formula FEs = 5000 × D, where D is the number of dimensions. To ensure the fairness of the comparison, the optimization results of the other four algorithms are taken directly from the original literature. The specific comparison results are shown in Table 5.

Table 5 shows the experimental results; the entries shown in bold are significantly better than those of the other algorithms. A general trend that can be seen is that the HGGWA algorithm has a promising optimization effect on most of the 7 functions.

Generally speaking, HGGWA outperforms CCPSO2 on 6 out of 7 functions with 100 and 500 dimensions, respectively. Among them, the result of HGGWA is slightly worse than


Table 4: Comparison results of 10 high-dimensional benchmark functions by WOA, SSA, ALO, and HGGWA.

Function  Dim   WOA (Mean / Std / SR%)       SSA (Mean / Std / SR%)       ALO (Mean / Std / SR%)       HGGWA (Mean / Std / SR%)
f1        100   5.77E-96 / 4.69E-96 / 100    1.01E-02 / 1.24E-02 / 0      3.04E-01 / 2.31E-01 / 0      4.85E-53 / 2.41E-52 / 100
f1        500   7.24E-93 / 2.98E-94 / 100    3.24E+04 / 2.18E+04 / 0      7.34E+04 / 1.32E+04 / 0      8.35E-25 / 8.86E-25 / 100
f1        1000  2.05E-89 / 3.16E-90 / 100    1.19E+05 / 1.04E+05 / 0      2.48E+05 / 2.12E+05 / 0      1.75E-17 / 2.73E-17 / 100
f2        100   1.89E-99 / 2.33E-99 / 100    8.85E+00 / 2.37E+00 / 0      4.29E+02 / 3.25E+02 / 0      4.01E-33 / 4.34E-33 / 100
f2        500   1.38E-99 / 2.16E-99 / 100    3.37E+02 / 2.17E+02 / 0      2.31E+03 / 2.03E+03 / 0      9.58E-18 / 2.36E-18 / 100
f2        1000  1.80E-99 / 1.37E-98 / 100    8.45E+02 / 3.43E+02 / 0      3.25E+04 / 4.13E+04 / 0      2.50E-11 / 1.66E-11 / 100
f3        100   3.84E+01 / 2.17E+00 / 0      2.22E+01 / 1.05E+01 / 0      2.47E+01 / 1.34E+01 / 0      4.55E+01 / 3.69E+01 / 0
f3        500   9.32E+01 / 5.32E+01 / 0      3.27E+01 / 2.33E+01 / 0      3.92E+01 / 2.34E+01 / 0      7.22E+01 / 5.01E+00 / 0
f3        1000  9.86E+01 / 3.24E+01 / 0      3.31E+01 / 1.28E+01 / 0      4.93E+01 / 3.99E+01 / 0      8.31E+01 / 4.36E+00 / 0
f4        100   9.67E+01 / 3.98E+01 / 0      9.00E+02 / 5.16E+02 / 0      1.40E+03 / 1.03E+03 / 0      6.82E+01 / 1.25E-01 / 100
f4        500   4.95E+02 / 3.14E+01 / 0      7.18E+06 / 1.46E+06 / 0      2.09E+07 / 1.37E+07 / 0      3.67E+02 / 8.23E-01 / 0
f4        1000  9.91E+02 / 2.48E+02 / 0      2.98E+07 / 1.35E+07 / 0      1.52E+08 / 1.00E+07 / 0      9.64E+02 / 3.74E+01 / 0
f5        100   7.83E+05 / 2.36E+05 / 0      2.17E+04 / 1.03E+04 / 0      3.60E+04 / 2.16E+04 / 0      3.45E-05 / 2.77E-05 / 100
f5        500   1.98E+07 / 1.32E+07 / 0      4.29E+05 / 3.16E+05 / 0      1.18E+06 / 1.03E+06 / 0      2.67E-03 / 4.38E-03 / 80
f5        1000  6.68E+07 / 3.44E+07 / 0      2.48E+06 / 1.23E+06 / 0      3.72E+06 / 3.49E+06 / 0      8.33E+00 / 5.12E+00 / 0
f6        100   2.65E-04 / 1.43E-04 / 100    1.12E+00 / 1.03E+00 / 0      8.28E-01 / 2.68E-01 / 0      2.34E-09 / 3.17E-10 / 100
f6        500   3.94E-04 / 3.22E-04 / 60     4.97E+01 / 3.66E+01 / 0      1.13E+02 / 1.01E+02 / 0      5.73E-05 / 4.33E-06 / 100
f6        1000  2.16E-03 / 1.36E-03 / 0      3.93E+02 / 2.78E+02 / 0      1.77E+03 / 2.33E+03 / 0      1.08E-04 / 1.62E-03 / 80
f7        100   0.00E+00 / 0.00E+00 / 100    6.27E+01 / 4.36E+01 / 0      3.54E+02 / 3.32E+02 / 0      0.00E+00 / 0.00E+00 / 100
f7        500   0.00E+00 / 0.00E+00 / 100    2.04E+03 / 1.97E+03 / 0      3.34E+03 / 4.56E+03 / 0      0.00E+00 / 0.00E+00 / 100
f7        1000  0.00E+00 / 0.00E+00 / 100    5.69E+03 / 1.66E+03 / 0      7.19E+03 / 2.14E+03 / 0      5.57E-11 / 3.15E-12 / 100
f8        100   4.44E-15 / 3.12E-15 / 100    5.15E+00 / 4.13E+00 / 0      4.93E+00 / 3.49E+00 / 0      2.15E-15 / 3.48E-15 / 100
f8        500   4.44E-15 / 3.05E-15 / 100    1.26E+01 / 1.00E+01 / 0      1.28E+01 / 2.28E+01 / 0      8.74E-12 / 5.39E-12 / 100
f8        1000  8.88E-16 / 2.88E-15 / 100    1.28E+01 / 1.44E+01 / 0      1.46E+01 / 2.03E+01 / 0      1.59E-07 / 1.63E-07 / 100
f9        100   0.00E+00 / 0.00E+00 / 100    1.52E-01 / 2.44E-01 / 0      4.54E-01 / 3.16E-01 / 0      0.00E+00 / 0.00E+00 / 100
f9        500   0.00E+00 / 0.00E+00 / 100    2.82E+02 / 2.13E+02 / 0      3.92E+02 / 1.08E+02 / 0      1.84E-16 / 8.44E-17 / 100
f9        1000  0.00E+00 / 0.00E+00 / 100    1.10E+03 / 1.34E+03 / 0      3.61E+03 / 1.17E+03 / 0      2.31E-13 / 3.29E-14 / 100
f10       100   8.16E-03 / 4.32E-03 / 100    1.10E+01 / 2.19E+01 / 0      1.74E+01 / 1.33E+01 / 0      2.73E-07 / 3.11E-07 / 100
f10       500   2.52E-02 / 1.99E-02 / 80     5.16E+01 / 3.33E+01 / 0      3.97E+05 / 2.96E+05 / 0      8.12E-05 / 4.98E-06 / 100
f10       1000  1.43E-02 / 1.03E-02 / 80     1.00E+05 / 2.13E+04 / 0      6.96E+06 / 1.34E+06 / 0      6.33E-04 / 3.77E-03 / 100

Figure 2: Convergence curves of the WOA, SSA, ALO, and HGGWA algorithms (objective function value versus number of iterations, logarithmic scale) on (a) the Sphere function, (b) the Schwefel 2.22 function, (c) the Rastrigin function, (d) the Griewank function, (e) the Quartic function, and (f) the Ackley function.

Figure 3: The average running times (in seconds) and numbers of iterations of SSA, WOA, GWO, and HGGWA on the benchmark functions f1, f2, f6, f7, f8, f9, and f10 (D = 100): (a) comparative analysis of running time; (b) comparative analysis of iterations.

Table 5: Comparison results of 7 CEC'2008 benchmark functions by five algorithms. Each cell gives Mean / Std.

Function | Dim | CCPSO2 | DEwSAcc | MLCC | EPUS-PSO | HGGWA
F1 | 100 | 7.73E-14 / 3.23E-14 | 5.68E-14 / 0.00E+00 | 6.82E-14 / 2.32E-14 | 7.47E-01 / 1.70E-01 | 3.06E-17 / 2.93E-17
F1 | 500 | 3.00E-13 / 7.96E-14 | 2.09E-09 / 4.61E-09 | 4.29E-13 / 3.31E-14 | 8.45E+01 / 6.40E+00 | 2.75E-15 / 2.13E-15
F1 | 1000 | 5.18E-13 / 9.61E-14 | 8.78E-03 / 5.27E-03 | 8.45E-13 / 5.00E-14 | 5.53E+02 / 2.86E+01 | 1.06E-10 / 1.83E-11
F2 | 100 | 6.08E+00 / 7.83E+00 | 8.25E+00 / 5.32E+00 | 2.52E+01 / 8.72E+00 | 1.86E+01 / 2.26E+00 | 2.71E+00 / 2.39E+00
F2 | 500 | 5.79E+01 / 4.21E+01 | 7.57E+01 / 3.04E+00 | 6.66E+01 / 5.69E+00 | 4.35E+01 / 5.51E-01 | 3.12E+01 / 4.91E+00
F2 | 1000 | 7.82E+01 / 4.25E+01 | 9.60E+01 / 1.81E+00 | 1.08E+02 / 4.75E+00 | 4.66E+01 / 4.00E-01 | 2.71E+01 / 5.66E+00
F3 | 100 | 4.23E+02 / 8.65E+02 | 1.45E+02 / 5.84E+01 | 1.49E+02 / 5.72E+01 | 4.99E+03 / 5.35E+03 | 9.81E+01 / 2.25E-01
F3 | 500 | 7.24E+02 / 1.54E+02 | 1.81E+03 / 2.74E+02 | 9.24E+02 / 1.72E+02 | 5.77E+04 / 8.04E+03 | 4.96E+02 / 6.23E-01
F3 | 1000 | 1.33E+03 / 2.63E+02 | 9.14E+03 / 1.26E+03 | 1.79E+03 / 1.58E+02 | 8.37E+05 / 1.52E+05 | 9.97E+02 / 3.58E+01
F4 | 100 | 3.98E-02 / 1.99E-01 | 4.37E+00 / 7.65E+00 | 4.38E-13 / 9.21E-14 | 4.71E+02 / 5.94E+01 | 0.00E+00 / 0.00E+00
F4 | 500 | 3.98E-02 / 1.99E-01 | 3.64E+02 / 5.23E+01 | 1.79E-11 / 6.31E-11 | 3.49E+03 / 1.12E+02 | 0.00E+00 / 0.00E+00
F4 | 1000 | 1.99E-01 / 4.06E-01 | 1.82E+03 / 1.37E+02 | 1.37E-10 / 3.37E-10 | 7.58E+03 / 1.51E+02 | 4.93E-11 / 2.87E-12
F5 | 100 | 3.45E-03 / 4.88E-03 | 3.07E-14 / 7.86E-15 | 3.41E-14 / 1.16E-14 | 3.72E-01 / 5.60E-02 | 0.00E+00 / 0.00E+00
F5 | 500 | 1.18E-03 / 4.61E-03 | 6.90E-04 / 2.41E-03 | 2.12E-13 / 2.47E-14 | 1.64E+00 / 4.69E-02 | 2.16E-16 / 7.56E-17
F5 | 1000 | 1.18E-03 / 3.27E-03 | 3.58E-03 / 5.73E-03 | 4.18E-13 / 2.78E-14 | 5.89E+00 / 3.91E-01 | 5.49E-13 / 2.35E-14
F6 | 100 | 1.44E-13 / 3.06E-14 | 1.13E-13 / 1.53E-14 | 1.11E-13 / 7.86E-15 | 2.06E+00 / 4.40E-01 | 3.21E-15 / 2.48E-15
F6 | 500 | 5.34E-13 / 8.61E-14 | 4.80E-01 / 5.73E-01 | 5.34E-13 / 7.01E-14 | 6.64E+00 / 4.49E-01 | 6.74E-12 / 4.39E-12
F6 | 1000 | 1.02E-12 / 1.68E-13 | 2.29E+00 / 2.98E-01 | 1.06E-12 / 7.68E-14 | 1.89E+01 / 2.49E+00 | 2.39E-07 / 2.13E-07
F7 | 100 | -1.50E+03 / 1.04E+01 | -1.37E+03 / 2.46E+01 | -1.54E+03 / 2.52E+00 | -8.55E+02 / 1.35E+01 | -7.00E+02 / 6.64E-01
F7 | 500 | -7.23E+03 / 4.16E+01 | -5.74E+03 / 1.83E+02 | -7.43E+03 / 8.03E+00 | -3.51E+03 / 2.10E+01 | -8.83E+03 / 3.54E+00
F7 | 1000 | -1.43E+04 / 8.27E+01 | -1.05E+04 / 4.18E+02 | -1.47E+04 / 1.51E+01 | -6.62E+03 / 3.18E+01 | -3.69E+04 / 2.16E+02

Among them, the result of HGGWA is slightly worse than that of CCPSO2 for function F7 in the 100-dimensional case, and the two algorithms obtain similar optimization results for function F6 in the 500-dimensional case. In addition, at first glance it might seem that HGGWA and MLCC have similar performance; however, HGGWA achieves better convergence accuracy than MLCC under the same number of iterations. In particular, the optimization results of the HGGWA algorithm are superior to those of DEwSAcc and EPUS-PSO in all dimensions. The results of HGGWA scaled very well from 100-D to 1000-D on F1, F4, F5, and F6, outperforming DEwSAcc and EPUS-PSO in all those cases.

5.4. Sensitivity Analysis of Nonlinear Adjustment Coefficient k. This section discusses the effect of the nonlinear adjustment coefficient on the convergence performance of the algorithm. In the HGGWA algorithm, the role of parameter a is to balance the global exploration and local exploitation capabilities. In (12), the nonlinear adjustment coefficient k is a key parameter, mainly used to control the range of variation of the convergence factor. Therefore, we selected four different values for numerical experiments to analyze the effect of the nonlinear adjustment coefficient on the performance of the algorithm. The four values are 0.5, 1, 1.5, and 2, and the comparison results are shown in Table 6, where bold entries represent the best result among the four settings.
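As a concrete illustration, the following sketch (a minimal example, not from the original paper; it assumes a_max = 2 and a_min = 0, the endpoint values used by the standard GWO, and Max_iter = 1000) evaluates the schedule of (12) for the four tested values of k:

    import math

    def convergence_parameter(t, max_iter, k, a_max=2.0, a_min=0.0):
        """Nonlinear schedule of parameter a, transcribed from Eq. (12)."""
        return a_max + (a_min - a_max) * (1.0 / (1.0 + math.exp(t / max_iter))) ** k

    max_iter = 1000
    for k in (0.5, 1.0, 1.5, 2.0):
        schedule = [round(convergence_parameter(t, max_iter, k), 3)
                    for t in (0, 250, 500, 750, 1000)]
        print("k =", k, "-> a at t = 0, 250, 500, 750, 1000:", schedule)

A smaller k keeps the sigmoid term larger throughout the run, so it shifts the schedule of a and hence the fraction of iterations spent in exploration versus exploitation.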

In general, the value of the nonlinear adjustment coefficient k has no significant effect on the performance of the HGGWA algorithm. On closer inspection, one can see that, except for function f3, the optimization performance of the algorithm is best when the nonlinear adjustment coefficient k is 0.5, whereas for function f3, k = 1.5 is the best choice. For functions f7 and f9, the HGGWA algorithm obtains the optimal solution 0 for all tested values of the coefficient k. Therefore, for most test functions, the optimal value of the nonlinear adjustment coefficient k is 0.5.

5.5. Global Convergence Analysis of HGGWA. In the HGGWA algorithm, the following four improved strategies are used to ensure the global convergence of the algorithm: (1) the initial population strategy based on Opposition-Based Learning; (2) the selection operation based on optimal retention; (3) the crossover operation based on the population partitioning mechanism; and (4) the mutation operation for elite individuals.

The idea of solving the high-dimensional function optimization problem with the HGGWA algorithm is as follows. Firstly, consider the initial population strategy: the basic GWO generates the initial population in a random manner, which can greatly impair the search efficiency of the algorithm on high-dimensional function optimization problems, because the algorithm cannot effectively search the entire solution space if the initial grey wolf population is clustered in a small range. Therefore, the Opposition-Based Learning strategy is used to initialize the population, which makes the initial population evenly distributed in the solution space, thus laying a good foundation for the global search of the algorithm. Secondly, the optimal retention strategy is used to preserve the optimal individual of the parent population during population evolution; as a result, the next generation can evolve in the optimal direction. In addition, the individuals with higher fitness are selected by the roulette-wheel operation, which maintains the dominance relationship between individuals in the population; this dominance relationship gives the algorithm good global convergence. Thirdly, the crossover operation is performed under the condition of population partitioning, in order to achieve dimension reduction and maintain the diversity of the population. Finally, the search direction of the grey wolf population is guided by the mutation operation on the elite individuals, which effectively prevents the algorithm from falling into a local optimum. All in all, through these four improved strategies, the HGGWA algorithm shows good global convergence in solving high-dimensional function optimization problems.

6. Conclusion

In order to overcome the shortcoming of the GWO algorithm that it easily falls into local optima when solving large-scale global optimization problems, this paper proposes a Hybrid Genetic Grey Wolf Algorithm (HGGWA) by integrating three genetic operators into the algorithm. The improved algorithm initializes the population based on the Opposition-Based Learning strategy to improve the search efficiency of the algorithm. The strategy of population division based on cooperative coevolution reduces the scale of the problem, and the mutation operation on the elite individuals effectively prevents the algorithm from falling into local optima. The performance of HGGWA has been evaluated using 10 classical benchmark functions and 7 CEC'2008 high-dimensional functions. From our experimental results, several conclusions can be drawn.

The results have shown that HGGWA is capable of grouping interacting variables with great accuracy for the majority of the benchmark functions. A comparative study between HGGWA and other state-of-the-art algorithms was conducted, and the experimental results showed that the HGGWA algorithm exhibits better global convergence, whether solving separable or nonseparable functions.

(1) In tests on 10 classical benchmark functions, compared with the results of WOA, SSA, and ALO, the HGGWA algorithm is greatly improved in convergence accuracy. Except for the Schwefel problem and the Rosenbrock function, the HGGWA algorithm achieves an 80-100% success ratio on all benchmark functions with 100 dimensions, which proves the effectiveness of HGGWA for solving high-dimensional functions.

(2) Comparing the optimization results on the seven CEC'2008 large-scale global optimization problems, HGGWA achieves the global optimal value for the separable functions F1, F4, and F6, but the effect on the other, nonseparable problems is not very satisfactory. However, HGGWA still exhibits better global convergence than the other algorithms, CCPSO2, DEwSAcc, MLCC, and EPUS-PSO, on most of the functions.

Table 6: Comparison of the optimization performance of different k values for HGGWA. Each cell gives Mean / Std.

Function | k = 0.5 | k = 1 | k = 1.5 | k = 2
f1 | 8.43E-60 / 5.36E-60 | 2.25E-54 / 2.03E-54 | 5.58E-53 / 4.31E-53 | 2.31E-51 / 3.16E-51
f2 | 2.89E-34 / 2.31E-34 | 2.06E-33 / 1.89E-33 | 4.70E-32 / 3.65E-32 | 5.76E-30 / 4.22E-30
f3 | 1.11E+01 / 1.02E+00 | 3.14E+01 / 2.19E+01 | 4.03E-01 / 2.38E-01 | 3.60E+01 / 2.16E+01
f4 | 9.78E+01 / 8.36E+01 | 9.82E+01 / 7.43E+01 | 9.81E+01 / 7.34E+01 | 9.83E+01 / 4.61E+01
f5 | 2.18E-03 / 1.34E-03 | 8.87E-02 / 6.67E-02 | 7.81E+00 / 6.13E+00 | 1.74E-02 / 2.34E-02
f6 | 1.31E-03 / 1.13E-03 | 1.62E-03 / 2.43E-04 | 2.99E-03 / 1.04E-03 | 1.84E-03 / 1.34E-03
f7 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00
f8 | 1.58E-14 / 1.23E-15 | 2.93E-14 / 1.37E-14 | 2.22E-14 / 1.49E-14 | 2.51E-14 / 1.08E-14
f9 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00
f10 | 2.99E-02 / 2.49E-02 | 3.38E-02 / 1.24E-03 | 8.42E-02 / 6.14E-02 | 7.29E-02 / 8.13E-03

(3) An analysis of the running times of several algorithms shows that the convergence time of the HGGWA algorithm has obvious advantages compared with recently proposed algorithms such as SSA and WOA. For the functions f1, f2, f6, f7, f8, f9, and f10, the running time of the HGGWA algorithm stays between 1 s and 3 s under the specified convergence accuracy, which shows the excellent stability of HGGWA.

(4) In the future, we plan to investigate more efficient population partition mechanisms to adapt to different nonseparable problems in the crossover operation. We are also interested in applying HGGWA to real-world problems to ascertain its true potential as a valuable optimization technique for large-scale optimization, such as the setup of large-scale multilayer sensor networks [73] in the Internet of Things.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Authors' Contributions

Qinghua Gu and Xuexian Li contributed equally to this work.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Grant Nos. 51774228 and 51404182), the Natural Science Foundation of Shaanxi Province (2017JM5043), and the Foundation of Shaanxi Educational Committee (17JK0425).

References

[1] R. Zhang and C. Wu, "A hybrid approach to large-scale job shop scheduling," Applied Intelligence, vol. 32, no. 1, pp. 47-59, 2010.
[2] M. Gendreau and C. D. Tarantilis, Solving Large-Scale Vehicle Routing Problems with Time Windows: The State-of-the-Art, Cirrelt, Montreal, 2010.
[3] M. B. Liu, S. K. Tso, and Y. Cheng, "An extended nonlinear primal-dual interior-point algorithm for reactive-power optimization of large-scale power systems with discrete control variables," IEEE Power Engineering Review, vol. 22, no. 9, p. 56, 2002.
[4] I. Fister, I. Fister Jr., J. Brest, and V. Zumer, "Memetic artificial bee colony algorithm for large-scale global optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '12), pp. 1-8, IEEE, Brisbane, Australia, June 2012.
[5] A. Zamuda, J. Brest, B. Bosovic, and V. Zumer, "Large scale global optimization using differential evolution with self-adaptation and cooperative co-evolution," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), IEEE, June 2008.
[6] T. Weise, R. Chiong, and K. Tang, "Evolutionary optimization: pitfalls and booby traps," Journal of Computer Science and Technology, vol. 27, no. 5, pp. 907-936, 2012.
[7] L. Lafleur, Descartes: Discourse on Method, Pearson Schweiz Ag, Zug, Switzerland, 1956.
[8] Y.-S. Ong, M. H. Lim, and X. Chen, "Memetic computation—past, present & future [research frontier]," IEEE Computational Intelligence Magazine, vol. 5, no. 2, pp. 24-31, 2010.
[9] N. Krasnogor and J. Smith, "A tutorial for competent memetic algorithms: model, taxonomy and design issues," IEEE Transactions on Evolutionary Computation, vol. 9, no. 5, pp. 474-488, 2005.
[10] S. Jiang, M. Lian, C. Lu, Q. Gu, S. Ruan, and X. Xie, "Ensemble prediction algorithm of anomaly monitoring based on big data analysis platform of open-pit mine slope," Complexity, vol. 2018, Article ID 1048756, 13 pages, 2018.
[11] R. S. Rahnamayan, H. R. Tizhoosh, and M. M. A. Salama, "Opposition-based differential evolution," IEEE Transactions on Evolutionary Computation, vol. 12, no. 1, pp. 64-79, 2008.
[12] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46-61, 2014.
[13] E. Daniel, J. Anitha, K. K. Kamaleshwaran, and I. Rani, "Optimum spectrum mask based medical image fusion using gray wolf optimization," Biomedical Signal Processing and Control, vol. 34, pp. 36-43, 2017.
[14] A. A. M. El-Gaafary, Y. S. Mohamed, A. M. Hemeida, and A.-A. A. Mohamed, "Grey wolf optimization for multi input multi output system," Universal Journal of Communications and Network, vol. 3, no. 1, pp. 1-6, 2015.
[15] G. M. Komaki and V. Kayvanfar, "Grey wolf optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109-120, 2015.
[16] S. Jiang, M. Lian, C. Lu et al., "SVM-DS fusion based soft fault detection and diagnosis in solar water heaters," Energy Exploration & Exploitation, 2018.
[17] Q. Gu, X. Li, C. Lu et al., "Hybrid genetic grey wolf algorithm for high dimensional complex function optimization," Control and Decision, pp. 1-8, 2019.
[18] M. A. Potter and K. A. De Jong, "A cooperative coevolutionary approach to function optimization," in Parallel Problem Solving from Nature—PPSN III, vol. 866 of Lecture Notes in Computer Science, pp. 249-257, Springer, Berlin, Germany, 1994.
[19] M. A. Potter, The Design and Analysis of a Computational Model of Cooperative Coevolution, George Mason University, Fairfax, Va, USA, 1997.
[20] F. van den Bergh and A. P. Engelbrecht, "A cooperative approach to particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225-239, 2004.
[21] M. El-Abd, "A cooperative approach to the artificial bee colony algorithm," in Proceedings of the 2010 IEEE World Congress on Computational Intelligence (WCCI 2010) - 2010 IEEE Congress on Evolutionary Computation (CEC 2010), IEEE, Spain, July 2010.
[22] Y. Shi, H. Teng, and Z. Li, "Cooperative co-evolutionary differential evolution for function optimization," in Proceedings of the 1st International Conference on Natural Computation (ICNC '05), Lecture Notes in Computer Science, pp. 1080-1088, August 2005.
[23] Z. Yang, K. Tang, and X. Yao, "Large scale evolutionary optimization using cooperative coevolution," Information Sciences, vol. 178, no. 15, pp. 2985-2999, 2008.
[24] X. Li and X. Yao, "Tackling high dimensional nonseparable optimization problems by cooperatively coevolving particle swarms," in Proceedings of the Congress on Evolutionary Computation (CEC '09), pp. 1546-1553, IEEE, Trondheim, Norway, May 2009.
[25] M. N. Omidvar, X. Li, Z. Yang, and X. Yao, "Cooperative co-evolution for large scale optimization through more frequent random grouping," in Proceedings of the 2010 IEEE World Congress on Computational Intelligence (WCCI 2010) - 2010 IEEE Congress on Evolutionary Computation (CEC 2010), Spain, July 2010.
[26] T. Ray and X. Yao, "A cooperative coevolutionary algorithm with correlation based adaptive variable partitioning," in Proceedings of the 2009 IEEE Congress on Evolutionary Computation (CEC), pp. 983-989, IEEE, Trondheim, Norway, May 2009.
[27] W. Chen, T. Weise, Z. Yang, and K. Tang, "Large-scale global optimization using cooperative coevolution with variable interaction learning," in Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 6239, pp. 300-309, 2010.
[28] M. N. Omidvar, X. Li, Y. Mei, and X. Yao, "Cooperative co-evolution with differential grouping for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 18, no. 3, pp. 378-393, 2014.
[29] M. N. Omidvar, X. Li, and X. Yao, "Smart use of computational resources based on contribution for cooperative co-evolutionary algorithms," in Proceedings of the 13th Annual Genetic and Evolutionary Computation Conference (GECCO '11), pp. 1115-1122, IEEE, Ireland, July 2011.
[30] R. Storn, "On the usage of differential evolution for function optimization," in Proceedings of the Biennial Conference of the North American Fuzzy Information Processing Society (NAFIPS '96), pp. 519-523, June 1996.
[31] X. Li and X. Yao, "Cooperatively coevolving particle swarms for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 16, no. 2, pp. 210-224, 2012.
[32] K. Weicker and N. Weicker, "On the improvement of coevolutionary optimizers by learning variable interdependencies," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC 1999), pp. 1627-1632, IEEE, July 1999.
[33] E. Sayed, D. Essam, and R. Sarker, "Using hybrid dependency identification with a memetic algorithm for large scale optimization problems," in Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 7673, pp. 168-177, 2012.
[34] Y. Liu, X. Yao, Q. Zhao, and T. Higuchi, "Scaling up fast evolutionary programming with cooperative coevolution," in Proceedings of the 2001 Congress on Evolutionary Computation, pp. 1101-1108, IEEE, Seoul, South Korea, 2001.
[35] E. Sayed, D. Essam, and R. Sarker, "Dependency identification technique for large scale optimization problems," in Proceedings of the 2012 IEEE Congress on Evolutionary Computation (CEC 2012), Australia, June 2012.
[36] Y. Ren and Y. Wu, "An efficient algorithm for high-dimensional function optimization," Soft Computing, vol. 17, no. 6, pp. 995-1004, 2013.
[37] J. Liu and K. Tang, "Scaling up covariance matrix adaptation evolution strategy using cooperative coevolution," in Proceedings of the International Conference on Intelligent Data Engineering and Automated Learning (IDEAL 2013), vol. 8206 of Lecture Notes in Computer Science, pp. 350-357, Springer, Berlin, Germany, 2013.
[38] Z. Yang, K. Tang, and X. Yao, "Multilevel cooperative coevolution for large scale optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), pp. 1663-1670, June 2008.
[39] M. N. Omidvar, Y. Mei, and X. Li, "Effective decomposition of large-scale separable continuous functions for cooperative co-evolutionary algorithms," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC 2014), pp. 1305-1312, China, July 2014.
[40] S. Mahdavi, E. M. Shiri, and S. Rahnamayan, "Cooperative co-evolution with a new decomposition method for large-scale optimization," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC), 2014.
[41] B. Kazimipour, M. N. Omidvar, X. Li, and A. K. Qin, "A sensitivity analysis of contribution-based cooperative co-evolutionary algorithms," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2015), pp. 417-424, Japan, May 2015.
[42] Z. Cao, L. Wang, Y. Shi et al., "An effective cooperative coevolution framework integrating global and local search for large scale optimization problems," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2015), pp. 1986-1993, Japan, May 2015.
[43] X. Peng, K. Liu, and Y. Jin, "A dynamic optimization approach to the design of cooperative co-evolutionary algorithms," Knowledge-Based Systems, vol. 109, pp. 174-186, 2016.
[44] H. K. Singh and T. Ray, "Divide and conquer in coevolution: a difficult balancing act," in Agent-Based Evolutionary Search, vol. 5 of Adaptation, Learning, and Optimization, pp. 117-138, Springer, Berlin, Germany, 2010.
[45] X. Peng and Y. Wu, "Large-scale cooperative co-evolution using niching-based multi-modal optimization and adaptive fast clustering," Swarm & Evolutionary Computation, vol. 35, 2017.
[46] M. Yang, M. N. Omidvar, C. Li et al., "Efficient resource allocation in cooperative co-evolution for large-scale global optimization," IEEE Transactions on Evolutionary Computation, vol. 21, no. 4, pp. 493-505, 2017.
[47] M. N. Omidvar, X. Li, and X. Yao, "Cooperative co-evolution with delta grouping for large scale non-separable function optimization," in Proceedings of the 2010 IEEE Congress on Evolutionary Computation (CEC 2010), pp. 1-8, Barcelona, Spain, July 2010.
[48] X. Peng and Y. Wu, "Enhancing cooperative coevolution with selective multiple populations for large-scale global optimization," Complexity, vol. 2018, Article ID 9267054, 15 pages, 2018.
[49] S.-T. Hsieh, T.-Y. Sun, C.-C. Liu, and S.-J. Tsai, "Solving large scale global optimization using improved particle swarm optimizer," in Proceedings of the 2008 IEEE Congress on Evolutionary Computation (CEC 2008), pp. 1777-1784, China, June 2008.
[50] T. Hatanaka, T. Korenaga, N. Kondo et al., Search Performance Improvement for PSO in High Dimensional Space, InTech, 2009.
[51] J. Fan, J. Wang, and M. Han, "Cooperative coevolution for large-scale optimization based on kernel fuzzy clustering and variable trust region methods," IEEE Transactions on Fuzzy Systems, vol. 22, no. 4, pp. 829-839, 2014.
[52] A.-R. Hedar and A. F. Ali, "Genetic algorithm with population partitioning and space reduction for high dimensional problems," in Proceedings of the 2009 International Conference on Computer Engineering and Systems (ICCES '09), pp. 151-156, Egypt, December 2009.
[53] J. Q. Zhang and A. C. Sanderson, "JADE: adaptive differential evolution with optional external archive," IEEE Transactions on Evolutionary Computation, vol. 13, no. 5, pp. 945-958, 2009.
[54] L. M. Hvattum and F. Glover, "Finding local optima of high-dimensional functions using direct search methods," European Journal of Operational Research, vol. 195, no. 1, pp. 31-45, 2009.
[55] C. Liu and B. Li, "Memetic algorithm with adaptive local search depth for large scale global optimization," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC 2014), pp. 82-88, China, July 2014.
[56] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257-1263, 2015.
[57] A. Zhu, C. Xu, Z. Li, J. Wu, and Z. Liu, "Hybridizing grey wolf optimization with differential evolution for global optimization and test scheduling for 3D stacked SoC," Journal of Systems Engineering and Electronics, vol. 26, no. 2, pp. 317-328, 2015.
[58] V. Chahar and D. Kumar, "An astrophysics-inspired grey wolf algorithm for numerical optimization and its application to engineering design problems," Advances in Engineering Software, vol. 112, pp. 231-254, 2017.
[59] D. Guha, P. K. Roy, and S. Banerjee, "Load frequency control of large scale power system using quasi-oppositional grey wolf optimization algorithm," Engineering Science and Technology, an International Journal, vol. 19, no. 4, pp. 1693-1713, 2016.
[60] C. Lu, S. Xiao, X. Li, and L. Gao, "An effective multi-objective discrete grey wolf optimizer for a real-world scheduling problem in welding production," Advances in Engineering Software, vol. 99, pp. 161-176, 2016.
[61] E. Emary, H. M. Zawbaa, and A. E. Hassanien, "Binary grey wolf optimization approaches for feature selection," Neurocomputing, vol. 172, pp. 371-381, 2016.
[62] W. Long, J. Jiao, X. Liang, and M. Tang, "Inspired grey wolf optimizer for solving large-scale function optimization problems," Applied Mathematical Modelling: Simulation and Computation for Engineering and Environmental Systems, vol. 60, pp. 112-126, 2018.
[63] S. Gupta and K. Deep, "Performance of grey wolf optimizer on large scale problems," in Proceedings of Mathematical Sciences and Its Applications, American Institute of Physics Conference Series, Uttar Pradesh, India, 2017.
[64] R. L. Haupt and S. E. Haupt, "Practical genetic algorithms," Journal of the American Statistical Association, vol. 100, no. 100, p. 253, 2005.
[65] H. R. Tizhoosh, "Opposition-based learning: a new scheme for machine intelligence," in Proceedings of the International Conference on Computational Intelligence for Modelling, Control and Automation (CIMCA) and International Conference on Intelligent Agents, Web Technologies and Internet Commerce (IAWTIC '05), pp. 695-701, Vienna, Austria, November 2005.
[66] H. Wang, Z. Wu, S. Rahnamayan, Y. Liu, and M. Ventresca, "Enhancing particle swarm optimization using generalized opposition-based learning," Information Sciences, vol. 181, no. 20, pp. 4699-4714, 2011.
[67] H. Wang, S. Rahnamayan, and Z. Wu, "Parallel differential evolution with self-adapting control parameters and generalized opposition-based learning for solving high-dimensional optimization problems," Journal of Parallel and Distributed Computing, vol. 73, no. 1, pp. 62-73, 2013.
[68] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82-102, 1999.
[69] A. P. Engelbrecht, Fundamentals of Computational Swarm Intelligence, John Wiley & Sons, 2006.
[70] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51-67, 2016.
[71] S. Mirjalili, A. H. Gandomi, S. Z. Mirjalili, S. Saremi, H. Faris, and S. M. Mirjalili, "Salp Swarm Algorithm: a bio-inspired optimizer for engineering design problems," Advances in Engineering Software, vol. 114, pp. 163-191, 2017.
[72] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80-98, 2015.
[73] Q. Gu, S. Jiang, M. Lian, and C. Lu, "Health and safety situation awareness model and emergency management based on multi-sensor signal fusion," IEEE Access, vol. 7, pp. 958-968, 2019.



Such a decomposition-based method is also called Cooperative Coevolution (CC), and its effectiveness for solving large-scale optimization problems has been demonstrated in many classical optimization methods. However, a major difficulty in applying CC is the choice of a good decomposition strategy, because different decomposition strategies may lead to different optimization effects. Due to the diversity of problems, there may be very large differences between them. Therefore, it is necessary to study the structure inherent in the problem and analyze the relationships between the variables in order to find a suitable solution.

In addition, from an algorithmic perspective, two or more distinct methods, when combined together in a synergistic manner, can enhance the problem-solving capability of the derived hybrid [8]. EAs hybridized with local search algorithms have been successful within the function optimization domain; these kinds of EAs are often named memetic algorithms (MAs) [9]. The optimization performance of an algorithm on large-scale optimization problems can be improved by different optimization strategies, such as designing new mutation operators, dynamic neighborhood search strategies, multiple classifiers [10], and Opposition-Based Learning [11]. Similar to evolutionary computation, swarm intelligence (SI) is a kind of optimization algorithm inspired by biological foraging or hunting behavior in nature, which simulates the intelligent behavior of insects, bird flocks, ant colonies, or fish schools. Inspired by the grey wolf population hierarchy and hunting behavior, Mirjalili proposed another swarm intelligence optimization algorithm, the grey wolf optimization algorithm (GWO), in 2014 [12]. The GWO algorithm has the advantages of few control parameters and fast convergence, and it has been applied to various optimization fields, such as medical image fusion [13], multiple-input multiple-output power systems [14], and the job shop scheduling problem [15]. In addition, the proposal of different prediction techniques [16] also provides a reference for the study of large-scale optimization problems.

To address large-scale global optimization problems, this paper improves the HGGWA algorithm proposed in the literature [17] by adding a nonlinear parameter adjustment strategy, which further improves the global convergence and solution accuracy. The crossover operation of the HGGWA algorithm divides the whole population into several subpopulations by drawing on the idea of cooperative coevolution and then evolves each subpopulation independently. More specifically, this paper has the following research objectives:

(1) To review the current literature on solving large-scale global optimization problems and to analyze the existing problems in the research.

(2) To improve the Hybrid Genetic Grey Wolf Algorithm (HGGWA) by adding the nonlinear adjustment strategy of the parameter, which further improves the global convergence and solution accuracy for solving large-scale global optimization problems.

(3) To analyze the global convergence and computational complexity of the improved HGGWA algorithm and to verify the effectiveness of HGGWA for solving large-scale global optimization problems through several different numerical experiments.

The remainder of this paper is organized as follows. Section 2 reviews Large-Scale Global Optimization (LSGO) and various solving strategies. Section 3 briefly describes the Grey Wolf Optimizer (GWO). Section 4 introduces the principles and steps of the improved algorithm (HGGWA). Section 5 presents and analyzes the experimental results. Finally, conclusions are drawn in Section 6.

2. Literature Review

2.1. Decomposition-Based CC Strategy. Large-scale global optimization problems are mainly solved in the following two ways. One is Cooperative Coevolution (CC) with a problem decomposition strategy. The idea of decomposition was first proposed by Potter and De Jong [18, 19] in 1994; they designed two Cooperative Coevolutionary Genetic Algorithms (CCGA-1, CCGA-2) to improve the global convergence of GA. The CC strategy reduces the scale of the high-dimensional problem by dividing the whole solution space into multiple subspaces and then uses multiple subpopulations to achieve cooperative coevolution. Table 1 summarizes the major proposed variants of the CC method; they can be divided into static grouping-based CC methods and dynamic grouping-based CC methods.

The static grouping-based CC methods divide the population into multiple fixed-scale subpopulations. The Cooperative Coevolutionary Genetic Algorithm (CCGA) proposed by Potter and De Jong is the first attempt to combine the idea of Cooperative Coevolution with a metaheuristic algorithm; they decomposed an n-dimensional problem into n 1-dimensional subproblems, each of which is optimized by a separate GA. Van den Bergh et al. [20] first proposed two improved PSO variants by integrating the cooperative coevolution strategy into PSO, called CPSO-SK and CPSO-HK, which divide an n-dimensional problem into k s-dimensional subproblems. Similarly, Mohammed El-Abd [21] presented two cooperative coevolutionary ABC algorithms, namely, CABC-S and CABC-H. Shi et al. [22] applied the cooperative coevolution strategy to the differential evolution algorithm, called the cooperative coevolutionary differential evolution (CCDE). The algorithm partitions the high-dimensional solution space into two equal-sized subcomponents, but it does not perform well as the dimensionality increases. For a fully separable problem with no interrelationship between variables, each variable can be optimized independently to obtain the optimal solution of the entire problem. However, for nonseparable problems with relationships between variables, the optimization effect of the fixed-scale grouping strategy becomes worse, and the optimal solution may not be obtained. Therefore, many scholars began to study decomposition strategies that can handle nonseparable problems.

The dynamic grouping-based CC methods dynamically adjust the size of the subpopulations to accommodate different types of large-scale optimization problems, and they are efficient in handling nonseparable problems.

Table 1: A summary of major variants of the CC method.

Year | Author | Algorithm
1994 | Potter and De Jong [18] | CCGA-1, CCGA-2
1996 | R. Storn [30] | CCDE
1999 | Weicker et al. [32] | CCLVI
2001 | Liu et al. [34] | FEPCC
2004 | Van den Bergh et al. [20] | CPSO-SK, CPSO-HK
2005 | Shi et al. [22] | CCDE
2008 | Yang et al. [23] | DECC-G
2008 | Yang et al. [38] | MLCC
2009 | Li et al. [24] | CCPSO
2009 | Ray et al. [26] | CCEA-AVP
2010 | Omidvar et al. [25] | DECC-ML
2010 | Chen et al. [27] | CCVIL
2010 | Singh et al. [44] | CCEA-AVP
2010 | Mohammed El-Abd [21] | CABC-S, CABC-H
2011 | Omidvar et al. [47] | DECC-D
2011 | Omidvar et al. [29] | CBCC
2012 | Li and Yao [31] | CCPSO2
2012 | Sayed et al. [33] | HDIMA
2012 | Sayed et al. [35] | DIMA
2013 | Ren and Wu [36] | CCOABC
2013 | Liu and Tang [37] | CC-CMA-ES
2014 | Omidvar et al. [28] | DECC-DG
2014 | Omidvar et al. [39] | MLSoft
2014 | Mahdavi et al. [40] | DM-HDMR
2015 | Kazimipour et al. [41] | CBCC
2015 | Cao et al. [42] | CC-GLS
2016 | Peng et al. [43] | mCCEA
2017 | Peng et al. [45] | MMO-CC
2017 | Yang et al. [46] | CCFR
2018 | Peng et al. [48] | CC-SMP

Yang et al. [23] proposed a new cooperative coevolution framework that is capable of optimizing large-scale nonseparable problems; it uses a random grouping strategy to divide the problem solution space into multiple subspaces of different sizes. Li et al. [24] embedded a random grouping strategy and an adaptive weighting mechanism into the particle swarm optimization algorithm and proposed a cooperative coevolutionary particle swarm optimization (CCPSO) algorithm. The results of the random grouping method show effective performance in solving scalable nonseparable benchmark functions (up to 1000-D); however, as the number of interacting variables increases, its performance deteriorates [25]. Therefore, some scholars considered adding prior knowledge of the problem in the process of algorithm evolution and proposed many learning-based dynamic grouping strategies. Ray et al. [26] discussed the effects of problem size, the number of subpopulations, and the number of iterations on some separable and nonseparable problems in cooperative coevolutionary algorithms; they then proposed a Cooperative Coevolutionary Algorithm with Correlation-based Adaptive Variable Partitioning (CCEA-AVP) to deal with scalable nonseparable problems. Chen et al. [27] proposed a CC method with Variable Interaction Learning (CCVIL), which is able to adaptively change group sizes. Considering that there may be no prior knowledge with which to decompose the problem, Omidvar et al. [28] proposed an automatic decomposition strategy which keeps the correlation between decision variables of different groups to a minimum.

2.2. Nondecomposition Strategy. The other class comprises nondecomposition methods. Different from the decomposition-based strategy, the nondecomposition methods mainly study the algorithm itself and improve the corresponding operators in order to improve the performance of the algorithm for solving large-scale global optimization problems. Nondecomposition-based methods mainly include swarm intelligence, evolutionary computation, and local search-based approaches. Hsieh et al. [49] added variable particles and an information sharing mechanism to the basic particle swarm optimization algorithm and proposed a variation called the Efficient Population Utilization Strategy for Particle Swarm Optimizer (EPUS-PSO). A modified PSO algorithm with a new velocity updating method, the rotated particle swarm, was proposed in the literature [50]; it transforms coordinates and uses information from other dimensions to maintain the diversity of each dimension. Fan et al. [51] proposed a novel particle swarm optimization approach with dynamic neighborhood based on kernel fuzzy clustering and variable trust region methods (called FT-DNPSO) for large-scale optimization. In terms of evolutionary computation, Hedar et al. [52] modified the genetic algorithm with new strategies of population partitioning and space reduction for high-dimensional problems. In order to improve the performance of the differential evolution algorithm, Zhang et al. [53] proposed an adaptive differential evolution algorithm (called JADE) using a new mutation and adaptive control parameter strategy. Local search-based approaches have been recognized as an effective algorithmic framework for solving optimization problems. Hvattum et al. [54] introduced a new direct search method based on Scatter Search, designed to remedy the lack of a good derivative-free method for solving problems of high dimensions. Liu et al. [55] improved the local search depth parameter in the memetic algorithm and proposed an adaptive local search depth (ALSD) strategy, which can dynamically allocate the local search computing resources according to their performance.

2.3. Related Work on GWO. Grey Wolf Optimizer (GWO) is a recently proposed swarm intelligence algorithm inspired by the social hierarchy and hunting behavior of wolves. It has the advantages of few control parameters, high solution accuracy, and fast convergence speed. Compared with other classical metaheuristic algorithms, such as the genetic algorithm (GA), particle swarm optimization (PSO), and the gravitational search algorithm (GSA), GWO shows powerful exploration capability and better convergence characteristics. Owing to its simplicity and ease of implementation, GWO has gained significant attention and has been applied to many practical optimization problems since its invention.

Recently, many scholars have applied GWO to different optimization problems and have also proposed many variants in order to improve the performance of GWO. Saremi et al. [56] introduced a dynamic population updating mechanism into GWO, removing individuals with poor fitness from the population and regenerating new individuals to replace them, in order to enhance the exploration ability of GWO; however, this strategy reduces the diversity of the grey wolf population, which can lead the algorithm into local optima. Zhu et al. [57] integrated the idea of differential evolution into GWO, which prevented the early stagnation of the GWO algorithm and accelerated the convergence speed; however, it fails to converge on high-dimensional problems. Kumar et al. [58] designed a novel variant of GWO for numerical optimization and engineering design problems; they utilized the prey weight and an astrophysics-based learning concept to enhance the exploration ability, but this method also does not perform well on high-dimensional function problems. Regarding applications of the GWO algorithm, Guha et al. [59] applied the grey wolf algorithm to the load frequency control problem in large-scale power systems. In the literature [60], a multiobjective discrete grey wolf algorithm was proposed for the particular characteristics of the welding job scheduling problem; the results show that the algorithm can solve scheduling problems in practical engineering. Emary et al. [61] proposed a new binary grey wolf optimization algorithm for the selection of the best feature subset.

The swarm intelligence optimization algorithm has natural parallelism, so it has advantages in solving large-scale optimization problems. In order to study the ability of the grey wolf optimization algorithm to solve high-dimensional functions, Long et al. [62] proposed a nonlinear adjustment strategy for the convergence factor, which balances well the exploration and exploitation of the algorithm; but this strategy still does not solve the high-dimensional function optimization problem well. In 2017, Gupta et al. [63] studied the performance of the GWO algorithm in solving 100- to 1000-dimensional function optimization problems. The results indicate that GWO is a powerful nature-inspired optimization algorithm for large-scale problems, except for the Rosenbrock function.

In summary, the decomposition-based CC strategy has a good effect in solving nonseparable problems, but it cannot handle the imbalance problem well, and it still needs improvement in solution accuracy. In addition, among the methods based on the nondecomposition strategy, although the performance of many algorithms has been greatly improved, research on hybrid algorithms is still rare. Therefore, this paper combines the three genetic operators of selection, crossover, and mutation with the GWO algorithm and proposes a Hybrid Genetic Grey Wolf Algorithm (HGGWA). The main improvement strategies of the algorithm are as follows: (1) initialize the population based on the Opposition-Based Learning strategy; (2) adjust the parameters nonlinearly to effectively balance exploration and exploitation capabilities while accelerating the convergence speed of the algorithm; (3) select the population through a combination of the elite retention strategy and roulette-wheel selection; (4) based on dimensionality reduction and population division, divide the whole population into multiple subpopulations for crossover operations to increase the diversity of the population; (5) perform mutation operations on the elite individuals in the population to prevent the algorithm from falling into local optima.

3. Grey Wolf Optimization Algorithm

3.1. Mathematical Models. In the GWO algorithm [12], the positions of the α wolf, β wolf, and δ wolf are the fittest solution, the second-best solution, and the third-best solution, respectively, and the positions of the ω wolves are the remaining candidate solutions. In the whole process of searching for prey in the space, the ω wolves gradually update their positions according to the optimal positions of the α, β, and δ wolves, so that they can gradually approach and capture the prey, that is, complete the process of searching for the optimal solution. The position updating in GWO is illustrated in Figure 1.

The grey wolves' position update formulas in this process are as follows:

\vec{D} = |\vec{C} \cdot \vec{X}_p(t) - \vec{X}(t)|   (1)

\vec{X}(t+1) = \vec{X}_p(t) - \vec{A} \cdot \vec{D}   (2)

where \vec{D} indicates the distance between the grey wolf individual and the prey, t indicates the current iteration, and \vec{A} and \vec{C} are coefficient vectors: \vec{A} is a convergence coefficient used to balance exploration and exploitation, and \vec{C} is used to simulate the effects of nature. \vec{X}_p is the position vector of the prey, and \vec{X} indicates the position vector of a grey wolf.

The coefficient vectors \vec{A} and \vec{C} are calculated as follows:

\vec{A} = 2\vec{a} \cdot \vec{r}_1 - \vec{a}   (3)

\vec{C} = 2 \cdot \vec{r}_2   (4)

where the components of \vec{a} are linearly decreased from 2 to 0 over the course of iterations, and \vec{r}_1 and \vec{r}_2 are random vectors in [0, 1]. The fluctuation range of the coefficient vector \vec{A} gradually decreases as \vec{a} decreases.

Thus, according to (1) and (2), the grey wolf individual randomly moves within the search space to gradually approach the optimal solution. The specific position update formulas are as follows:

\vec{D}_\alpha = |\vec{C}_1 \cdot \vec{X}_\alpha - \vec{X}|   (5)

\vec{D}_\beta = |\vec{C}_2 \cdot \vec{X}_\beta - \vec{X}|   (6)

\vec{D}_\delta = |\vec{C}_3 \cdot \vec{X}_\delta - \vec{X}|   (7)

\vec{X}_1 = \vec{X}_\alpha - \vec{A}_1 \cdot \vec{D}_\alpha   (8)

\vec{X}_2 = \vec{X}_\beta - \vec{A}_2 \cdot \vec{D}_\beta   (9)

\vec{X}_3 = \vec{X}_\delta - \vec{A}_3 \cdot \vec{D}_\delta   (10)

\vec{X}(t+1) = (\vec{X}_1 + \vec{X}_2 + \vec{X}_3) / 3   (11)

where \vec{D}_\alpha, \vec{D}_\beta, and \vec{D}_\delta indicate the distances between the α, β, and δ wolves and the prey, respectively; \vec{X}_1, \vec{X}_2, and \vec{X}_3 represent the positional components of the remaining grey wolves based on the positions of the α, β, and δ wolves, respectively; and \vec{X}(t+1) represents the position vector after the grey wolf is updated.
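As a reading aid, the following sketch (an illustrative transcription, not the authors' code; it assumes positions are stored as NumPy arrays) implements the single-wolf position update of (3)-(11):

    import numpy as np

    def omega_update(x, alpha, beta, delta, a):
        """Update one grey wolf position following Eqs. (3)-(11)."""
        components = []
        for leader in (alpha, beta, delta):
            r1 = np.random.rand(x.size)
            r2 = np.random.rand(x.size)
            A = 2 * a * r1 - a                 # Eq. (3): convergence coefficient
            C = 2 * r2                         # Eq. (4)
            D = np.abs(C * leader - x)         # Eqs. (5)-(7): distance to one leader
            components.append(leader - A * D)  # Eqs. (8)-(10)
        return sum(components) / 3.0           # Eq. (11): average of X1, X2, X3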

3.2. Basic Process of GWO

Step 1. Randomly generate an initial population and initialize the parameters \vec{a}, \vec{A}, and \vec{C}.

Step 2. Calculate the fitness of each grey wolf individual and save the positions of the three fittest individuals as \vec{X}_\alpha, \vec{X}_\beta, and \vec{X}_\delta.

Step 3. Update the position of each grey wolf according to (5)-(11), thereby obtaining the next-generation population, and then update the values of the parameters \vec{a}, \vec{A}, and \vec{C}.

Step 4. Calculate the fitness of the individuals in the new population and update \vec{X}_\alpha, \vec{X}_\beta, and \vec{X}_\delta.

Step 5. Repeat Steps 2-4 until the optimal solution is obtained or the maximum number of iterations is reached.
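Putting Steps 1-5 together, a minimal driver loop might look as follows (an illustrative sketch, not the authors' implementation; the population size, bounds, and the Sphere objective are assumed values, and omega_update is the helper sketched above):

    import numpy as np

    def gwo(obj, dim, pop_size=30, max_iter=1000, lb=-100.0, ub=100.0):
        """Minimal GWO loop following Steps 1-5 (minimization)."""
        wolves = lb + (ub - lb) * np.random.rand(pop_size, dim)    # Step 1
        for t in range(max_iter + 1):
            fitness = np.array([obj(w) for w in wolves])           # Steps 2 and 4
            # For minimization, the three smallest objective values
            # give the alpha, beta, and delta wolves.
            alpha, beta, delta = wolves[np.argsort(fitness)[:3]]
            if t == max_iter:                                      # Step 5: stop
                break
            a = 2.0 * (1.0 - t / max_iter)                         # a decreases from 2 to 0
            wolves = np.array([omega_update(w, alpha, beta, delta, a)
                               for w in wolves])                   # Step 3
            wolves = np.clip(wolves, lb, ub)                       # keep wolves in bounds
        return alpha, float(obj(alpha))

    # Example: minimize the 30-dimensional Sphere function.
    best_pos, best_val = gwo(lambda x: float(np.sum(x * x)), dim=30)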

4. Hybrid Genetic Grey Wolf Algorithm

This section describes the details of HGGWA, the hybrid algorithm proposed in this paper.

Figure 1: Position updating in GWO.

Firstly, the initial population strategy based on Opposition-Based Learning is described in detail in Section 4.1. Secondly, Section 4.2 introduces the nonlinear adjustment strategy for parameter a. Then, the three operations of selection, crossover, and mutation are described in Sections 4.3-4.5, respectively. Finally, the time complexity of HGGWA is analyzed in Section 4.6.

4.1. Initial Population Strategy Based on Opposition-Based Learning. For swarm intelligence optimization algorithms based on population iteration, the diversity of the initial population lays the foundation for the efficiency of the algorithm: better population diversity reduces the computation time and improves the global convergence of the algorithm [64]. However, like other algorithms, the GWO algorithm uses random initialization when generating populations, which can impair the search efficiency of the algorithm. Opposition-Based Learning is a strategy proposed by Tizhoosh [65] to improve the efficiency of the search; it uses the opposite points of known individual positions to generate new individual positions, thereby increasing the diversity of the search population. Currently, the Opposition-Based Learning strategy has been successfully applied to several swarm optimization algorithms [66, 67]. The opposite point is defined as follows.

Assume that there is an individual X = (x_1, x_2, ..., x_D) in the population P. Then the opposite point of the element x_i (i = 1, 2, ..., D) in each dimension is x'_i = l_i + u_i - x_i, where l_i and u_i are the lower and upper bounds of the i-th dimension, respectively. According to this definition, the Opposition-Based Learning strategy initializes the population as follows:

(a) Randomly initialize the population P, calculate the opposite point x'_i of each individual position x_i, and let all the opposite points constitute the opposition population P'.

(b) Calculate the fitness of each individual in the initial population P and the opposition population P', and arrange them in descending order.

(c) Select the top N grey wolf individuals with the highest fitness as the final initial population.
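A compact sketch of steps (a)-(c) is given below (illustrative only; it assumes a fitness function that is to be maximized, per step (b) above, and NumPy arrays lower and upper holding the bounds l and u):

    import numpy as np

    def obl_initialize(fitness, pop_size, dim, lower, upper):
        """Opposition-Based Learning initialization, steps (a)-(c)."""
        P = lower + (upper - lower) * np.random.rand(pop_size, dim)  # (a) random population
        P_opp = lower + upper - P                                    # opposite points x' = l + u - x
        union = np.vstack((P, P_opp))                                # (b) evaluate P and P' together
        scores = np.array([fitness(ind) for ind in union])
        best = np.argsort(scores)[::-1][:pop_size]                   # (c) keep the top N by fitness
        return union[best]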

4.2. Parameter Nonlinear Adjustment Strategy. It is significantly important for swarm intelligence algorithms to balance exploration and exploitation. In the early iterations of the algorithm, a powerful exploration capability is beneficial for expanding the search range, so that the algorithm can find the optimal solution with greater probability. In the later stage of the iteration, an effective exploitation ability speeds up the optimization process and improves the convergence accuracy of the final solution. Therefore, only when a swarm optimization algorithm properly coordinates its global exploration and local exploitation capabilities can it have strong robustness and a fast convergence speed.

It can be seen from (2) and (3) that the dynamic change of the parameter a plays a crucial role in the algorithm. In the basic GWO, the linear decrement strategy of parameter a does not reflect the actual convergence process of the algorithm; therefore, the algorithm does not balance the exploration and exploitation capabilities well, especially when solving large-scale multimodal function problems. In order to dynamically adjust the global exploration and local exploitation process of the algorithm, this paper proposes a nonlinear adjustment strategy for the parameter a, which helps HGGWA to search for the optimal solution. The improved nonlinear parameter adjustment equation is as follows:

a = a_{max} + (a_{min} - a_{max}) \cdot \left( \frac{1}{1 + e^{t / Max_{iter}}} \right)^{k}   (12)

where a_{max} and a_{min} are the initial and final values of the parameter a, t is the current iteration index, Max_{iter} is the maximum number of iterations, and k is the nonlinear adjustment coefficient.

Thus, the variation range of the convergence factor A is controlled by the nonlinear adjustment of the parameter a. When |A| > 1, the grey wolf population expands the search range to find better prey, which corresponds to the global exploration of the algorithm; when |A| < 1, the whole population shrinks the search range, forming an encirclement around the prey to complete the final attack, which corresponds to the local exploitation process of the algorithm. The whole process has a positive effect on balancing global search and local search, which helps to improve the accuracy of the solution and further accelerate the convergence speed of the algorithm.

4.3. Selection Operation Based on Optimal Retention. For intelligent optimization algorithms based on population evolution, the evolution of each generation directly affects the optimization effect of the algorithm. In order to pass the superior individuals of the parent population to the next generation without destroying them, it is necessary to preserve good individuals directly. The optimal retention selection strategy is an effective way to preserve good individuals in genetic algorithms; this paper integrates it into GWO to improve the efficiency of the algorithm. Assuming that the current population is P(X_1, X_2, ..., X_D) and the fitness of individual X_i is F_i, the specific operation is as follows.

(a) Calculate the fitness 119865119894 (i = 1 2 D) of eachindividual and arrange them in descending order

(b) Selecting the individuals with the highest fitnessto copy directly into the next generation of popula-tion

(c) Calculate the total fitness S of the remaining indi-viduals and the probability 119901119894 that each individual isselected

S = 119863minus1sum119894=1

119865119894 (119894 = 1 2 119863 minus 1) (13)

119901119894 = 119865119894sum119863minus1119894=1 119865119894 (119894 = 1 2 119863 minus 1) (14)

(d) Calculate the cumulative fitness value 119904119894 for eachindividual and then the selection operation is performed inthemanner of the bet roulette until the number of individualsin the children population is consistent with the parentpopulation

119904119894 = sum119894119894=1 119865119894S(119894 = 1 2 119863 minus 1) (15)
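A compact sketch of this elitism-plus-roulette scheme is given below (Python; it assumes a non-negative, larger-is-better fitness, and the helper name select_population is ours, not from the paper):

```python
import numpy as np

def select_population(pop, fitness):
    """Optimal-retention selection, following Eqs. (13)-(15).

    Step (b): the fittest individual is copied unchanged into the next
    generation; steps (c)-(d): the remaining slots are filled by
    roulette-wheel selection over the other individuals.
    """
    pop, fitness = np.asarray(pop), np.asarray(fitness, dtype=float)
    order = np.argsort(fitness)[::-1]            # step (a): sort descending
    elite = pop[order[0]]                        # step (b): elite retention
    rest, f_rest = pop[order[1:]], fitness[order[1:]]
    p = f_rest / f_rest.sum()                    # Eq. (14)
    s = np.cumsum(p)                             # Eq. (15): cumulative s_i
    spins = np.random.rand(len(pop) - 1)         # roulette-wheel spins
    children = rest[np.searchsorted(s, spins)]
    return np.vstack([elite, children])
```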

4.4. Crossover Operation Based on Population Partitioning Mechanism. Due to the decrease of population diversity in the late stage of evolution, the GWO algorithm easily falls into local optima when solving large-scale, high-dimensional optimization problems. The population partitioning mechanism is introduced to overcome the problems caused by large scale and complexity, and to ensure that the algorithm searches the whole solution space.

In the HGGWA algorithm, the whole population $P$ is divided into $\nu \times \eta$ subpopulations $P_{ij}$ ($i = 1, \ldots, \nu$; $j = 1, \ldots, \eta$). The optimal size of each subpopulation $P_{ij}$ was experimentally determined to be $5 \times 5$. Individuals of each subpopulation are cross-operated to increase the diversity of the population. The specific division method is shown below.

The initial population:

$$P = \begin{bmatrix} x_{11} & x_{12} & \cdots & x_{1d} \\ x_{21} & x_{22} & \cdots & x_{2d} \\ \vdots & & \ddots & \vdots \\ x_{P1} & x_{P2} & \cdots & x_{Pd} \end{bmatrix}$$

The subpopulations $P_{ij}$:

$$\underbrace{\begin{bmatrix} x_{11} & \cdots & x_{15} \\ \vdots & \ddots & \vdots \\ x_{51} & \cdots & x_{55} \end{bmatrix}}_{P_{11}} \cdots \underbrace{\begin{bmatrix} x_{1(d-4)} & \cdots & x_{1d} \\ \vdots & \ddots & \vdots \\ x_{5(d-4)} & \cdots & x_{5d} \end{bmatrix}}_{P_{1\eta}}$$

$$\underbrace{\begin{bmatrix} x_{(P-4)1} & \cdots & x_{(P-4)5} \\ \vdots & \ddots & \vdots \\ x_{P1} & \cdots & x_{P5} \end{bmatrix}}_{P_{\nu 1}} \cdots \underbrace{\begin{bmatrix} x_{(P-4)(d-4)} & \cdots & x_{(P-4)d} \\ \vdots & \ddots & \vdots \\ x_{P(d-4)} & \cdots & x_{Pd} \end{bmatrix}}_{P_{\nu\eta}} \qquad (16)$$
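The partitioning of (16) amounts to tiling the $N \times d$ population matrix into $5 \times 5$ blocks. A minimal sketch (assuming, as in the experiments, that $N$ and $d$ are multiples of 5; the helper name partition is ours):

```python
import numpy as np

def partition(pop, block=5):
    """Tile the N x d population into nu x eta subpopulations P_ij of
    size block x block, as in Eq. (16). Returns views into pop, so
    modifying a sub-block modifies the population in place."""
    n, d = pop.shape
    return [pop[i:i + block, j:j + block]
            for i in range(0, n, block)
            for j in range(0, d, block)]
```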

In genetic algorithms, the crossover operation plays a very important role and is the main way to generate new individuals. In the improved algorithm of this paper, each individual in a subpopulation is cross-operated in a linear crossover manner. A random number $r_i \in (0, 1)$ is generated for each individual $x_i$ in the subpopulation; when $r_i$ is less than the crossover probability $P_c$, the corresponding individual $x_i$ is paired for crossover.

An example, Crossover($p_1$, $p_2$), is as follows:

(a) Generate a random number $\lambda \in (0, 1)$.

(b) Two children $c_1(c_{11}, \ldots, c_{1D})$ and $c_2(c_{21}, \ldots, c_{2D})$ are generated from the two parents $p_1(p_{11}, \ldots, p_{1D})$ and $p_2(p_{21}, \ldots, p_{2D})$:

$$c_{1i} = \lambda p_{1i} + (1 - \lambda) p_{2i}, \quad i = 1, 2, \ldots, D \qquad (17)$$

$$c_{2i} = \lambda p_{2i} + (1 - \lambda) p_{1i}, \quad i = 1, 2, \ldots, D \qquad (18)$$
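In code, the linear crossover of (17)-(18) is a one-liner per child; a sketch:

```python
import numpy as np

def crossover(p1, p2):
    """Linear (arithmetic) crossover of two parent vectors, Eqs. (17)-(18).
    A single lambda ~ U(0, 1) is shared by all genes of both children."""
    lam = np.random.rand()
    c1 = lam * p1 + (1.0 - lam) * p2
    c2 = lam * p2 + (1.0 - lam) * p1
    return c1, c2
```

A convenient property of this operator is that both children lie on the line segment between the two parents, so they automatically stay inside any box-constrained search space.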

4.5. Mutation Operation for Elite Individuals. Due to the selection operation based on the optimal preservation strategy, the grey wolf individuals of the whole population concentrate in a small optimal region in the later stage of the iterative process, which easily leads to the loss of population diversity. If the current optimal individual is a locally optimal solution, the algorithm is then prone to fall into a local optimum, especially when solving high-dimensional multimodal functions. To this end, this paper introduces a mutation operator into the HGGWA algorithm to perform mutation operations on the elite individuals in the population. The specific operation is as follows.

Assume that the optimal individual is $x_i(x_1, x_2, \ldots, x_d)$; the mutation operation is performed on $x_i$ with mutation probability $P_m$. That is, a gene $x_k$ is selected from the optimal individual with probability $P_m$ and replaced by a random number between the upper and lower bounds, generating a new individual $x_i' = (x_1', x_2', \ldots, x_d')$:

$$x_i' = \begin{cases} l + \lambda \cdot (u - l), & i = k \\ x_i, & i \neq k \end{cases} \qquad (19)$$

where $\lambda$ is a random number in $[0, 1]$, and $l$ and $u$ are the lower and upper bounds of the individual $x_i$, respectively.
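A sketch of this elite mutation in Python ($P_m = 0.01$ as in Table 3; the helper name mutate_elite is ours):

```python
import numpy as np

def mutate_elite(x, lower, upper, p_m=0.01):
    """Mutation of an elite individual, following Eq. (19): with
    probability p_m, one randomly chosen gene x_k is replaced by a
    uniform random value in [lower, upper]; all other genes are kept."""
    x_new = np.array(x, dtype=float)
    if np.random.rand() < p_m:
        k = np.random.randint(len(x_new))
        x_new[k] = lower + np.random.rand() * (upper - lower)
    return x_new
```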


(1) Initialize parameters a, A, C, population size N, P_c, and P_m
(2) Initialize population P using the OBL strategy
(3) t = 0
(4) while t < Max_iter
(5)     Calculate the fitness of all search agents
(6)     X_alpha = the best search agent
(7)     X_beta = the second best search agent
(8)     X_delta = the third best search agent
(9)     for i = 1 : N
(10)        Update the position of all search agents by equation (11)
(11)    end for
(12)    newP1 <- P except the X_alpha
(13)    for i = 1 : N-1
(14)        Generate newP2 by the roulette wheel selection on newP1
(15)    end for
(16)    P <- newP2 and X_alpha
(17)    Generate subPs by the population partitioning mechanism on P
(18)    Select the individuals by crossover probability P_c
(19)    for i = 1 : N*P_c
(20)        Crossover each search agent by equations (17) and (18)
(21)    end for
(22)    Generate X_alpha', X_beta', and X_delta' by equation (19) with mutation probability P_m
(23)    t = t + 1
(24) end while
(25) Return X_alpha

Algorithm 1: Pseudocode of the HGGWA.
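Putting the pieces together, the sketch below outlines one generation of Algorithm 1 in Python, reusing the helper functions sketched in Sections 4.2-4.5. It is an illustrative reading of the pseudocode, not the authors' implementation: the position update follows the standard GWO rule around the three leaders (Eq. (11)), and for brevity only the first pair of each sub-block is crossed.

```python
import numpy as np

def hggwa_generation(pop, cost_fn, a, lower, upper, p_c=0.8, p_m=0.01):
    """One HGGWA generation, lines (5)-(22) of Algorithm 1 (sketch).
    cost_fn maps an (N, d) array to N values to be minimised."""
    n, d = pop.shape
    cost = cost_fn(pop)
    leaders = pop[np.argsort(cost)[:3]]        # X_alpha, X_beta, X_delta

    # Lines (9)-(11): standard GWO position update around the leaders.
    new_pop = np.empty_like(pop)
    for i in range(n):
        steps = [leader - (a * (2 * np.random.rand(d) - 1))
                 * np.abs(2 * np.random.rand(d) * leader - pop[i])
                 for leader in leaders]
        new_pop[i] = np.mean(steps, axis=0)
    pop = np.clip(new_pop, lower, upper)

    # Lines (12)-(16): elitist selection; the cost is turned into a
    # positive, larger-is-better fitness for the roulette wheel.
    cost = cost_fn(pop)
    pop = select_population(pop, 1.0 / (1.0 + cost - cost.min()))

    # Lines (17)-(21): partitioned crossover with probability P_c.
    for sub in partition(pop):
        if np.random.rand() < p_c:
            sub[0], sub[1] = crossover(sub[0].copy(), sub[1].copy())

    # Line (22): mutate the current best individual with probability P_m.
    best = np.argmin(cost_fn(pop))
    pop[best] = mutate_elite(pop[best], lower, upper, p_m)
    return pop
```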

4.6. Pseudocode of the HGGWA and Time Complexity Analysis. This section describes how to calculate an upper bound for the total number of fitness evaluations (FE) required by HGGWA. As shown in Algorithm 1, the computational complexity of HGGWA in one generation is mainly dominated by the position updating of all search agents in line (10). The position updating requires a time complexity of $O(N \cdot dim \cdot Max_{iter})$ (where $N$ is the population size, $dim$ is the dimension of each benchmark function, and $Max_{iter}$ is the maximum number of iterations of HGGWA) to obtain the needed parameters and to finish the whole position updating process of all search agents. In addition, the crossover operation in line (20) is another time-consuming step. It needs a time complexity of $O(2\nu\eta \cdot Max_{iter})$ (where $\nu\eta$ is the total number of subpopulations, $\nu = N/5$, $\eta = dim/5$) to complete the crossing of individuals in each subpopulation according to the crossover probability $P_c$. It should be noted that this is only the worst-case computational complexity; if only two individuals in each subpopulation need to be crossed, the minimum amount of computation is $O(\nu\eta \cdot Max_{iter})$.

Therefore, the maximum computational complexity caused by the two dominant processes in one generation is $\max\{O(N \cdot dim \cdot Max_{iter}),\; O(2\nu\eta \cdot Max_{iter})\}$ ($\nu = N/5$, $\eta = dim/5$), i.e., $O(N \cdot dim \cdot Max_{iter})$.
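As a quick check with the experimental settings of Section 5.2.1 ($N = 50$, $dim = 100$, $Max_{iter} = 1000$, hence $\nu = 10$ and $\eta = 20$), the two bounds evaluate to

$$N \cdot dim \cdot Max_{iter} = 50 \times 100 \times 1000 = 5 \times 10^{6}, \qquad 2\nu\eta \cdot Max_{iter} = 2 \times 10 \times 20 \times 1000 = 4 \times 10^{5},$$

so the position-updating term indeed dominates.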

5. Numerical Experiments and Analysis

In this section, the proposed HGGWA algorithm is evaluated on both classical benchmark functions [68] and the suite of test functions provided by the CEC'2008 Special Session on large-scale global optimization. The algorithms used for comparison include not only conventional EAs but also other CC optimization algorithms. Experimental results are provided to analyze the performance of HGGWA in the context of large-scale optimization problems. In addition, the sensitivity analysis of the nonlinear adjustment coefficient and the global convergence of HGGWA are discussed in Sections 5.4 and 5.5, respectively.

5.1. Benchmark Functions and Performance Measures. In order to verify the ability of the HGGWA algorithm to solve high-dimensional complex functions, 10 general high-dimensional benchmark functions (100D, 500D, 1000D) were selected for the optimization test. Among them, $f_1 \sim f_6$ are unimodal functions, which are used to test the local search ability of the algorithm; $f_7 \sim f_{10}$ are multimodal functions, which are used to test the global search ability of the algorithm. These test functions have multiple unevenly distributed local optima, nonconvexity, and strong oscillation, which make them very difficult to converge to the global optimal solution, especially in the high-dimensional case.


Table 2: 10 high-dimensional benchmark functions.

| Function | Formula | Range | $f_{min}$ | Accuracy |
| Sphere | $f_1(x) = \sum_{i=1}^{d} x_i^2$ | $[-100, 100]$ | 0 | $1 \times 10^{-8}$ |
| Schwefel's 2.22 | $f_2(x) = \sum_{i=1}^{d} \lvert x_i \rvert + \prod_{i=1}^{d} \lvert x_i \rvert$ | $[-10, 10]$ | 0 | $1 \times 10^{-8}$ |
| Schwefel's 2.21 | $f_3(x) = \max_i \{\lvert x_i \rvert, 1 \le i \le d\}$ | $[-100, 100]$ | 0 | $1 \times 10^{-2}$ |
| Rosenbrock | $f_4(x) = \sum_{i=1}^{d-1} [100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2]$ | $[-30, 30]$ | 0 | $1 \times 10^{0}$ |
| Schwefel's 1.2 | $f_5(x) = \sum_{i=1}^{d} (\sum_{j=1}^{i} x_j)^2$ | $[-100, 100]$ | 0 | $1 \times 10^{-3}$ |
| Quartic | $f_6(x) = \sum_{i=1}^{d} i x_i^4 + random[0, 1)$ | $[-1.28, 1.28]$ | 0 | $1 \times 10^{-4}$ |
| Rastrigin | $f_7(x) = \sum_{i=1}^{d} [x_i^2 - 10 \cos(2\pi x_i) + 10]$ | $[-5.12, 5.12]$ | 0 | $1 \times 10^{-8}$ |
| Ackley | $f_8 = -20 \exp(-0.2 \sqrt{\frac{1}{d} \sum_{i=1}^{d} x_i^2}) - \exp(\frac{1}{d} \sum_{i=1}^{d} \cos(2\pi x_i)) + 20 + e$ | $[-32, 32]$ | 0 | $1 \times 10^{-5}$ |
| Griewank | $f_9 = \frac{1}{4000} \sum_{i=1}^{d} x_i^2 - \prod_{i=1}^{d} \cos(\frac{x_i}{\sqrt{i}}) + 1$ | $[-600, 600]$ | 0 | $1 \times 10^{-5}$ |
| Penalized 1 | $f_{10} = \frac{\pi}{d} \{10 \sin^2(\pi y_1) + \sum_{i=1}^{d-1} (y_i - 1)^2 [1 + 10 \sin^2(\pi y_{i+1})] + (y_d - 1)^2\} + \sum_{i=1}^{d} u(x_i, 10, 100, 4)$, where $y_i = 1 + \frac{x_i + 1}{4}$ and $u(x_i, a, k, m) = \begin{cases} k (x_i - a)^m, & x_i > a \\ 0, & -a \le x_i \le a \\ k (-x_i - a)^m, & x_i < -a \end{cases}$ | $[-50, 50]$ | 0 | $1 \times 10^{-2}$ |

The specific characteristics of the 10 benchmark functions are shown in Table 2.

In order to show the optimization effect of the algorithm more clearly, this paper selects two indicators to evaluate the performance of the algorithm [69]. The first is the solution accuracy (AC), which reflects the difference between the optimal result of the algorithm and the theoretical optimal value. In an experiment, if the final convergence result of the algorithm is less than the AC threshold, the optimization is considered successful. Assuming that the optimal value of a certain run is $X_{opt}$ and the theoretical optimal value is $X_{min}$, the AC is calculated as follows:

$$AC = \lvert X_{opt} - X_{min} \rvert \qquad (20)$$

The second is the success ratio (SR), which reflects the proportion of successful optimizations to the total number of runs, given a certain accuracy. Assuming that in $T$ runs of the algorithm the number of successful optimizations is $t$, the SR is calculated as follows:

$$SR = \frac{t}{T} \times 100\% \qquad (21)$$
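Both measures are straightforward to compute; a small sketch (the function names are ours):

```python
def accuracy(x_opt, x_min):
    """Solution accuracy AC of a single run, Eq. (20)."""
    return abs(x_opt - x_min)

def success_ratio(final_values, x_min, threshold):
    """Success ratio SR over T runs, Eq. (21): the percentage of runs
    whose final error is below the accuracy threshold of Table 2."""
    t = sum(1 for v in final_values if accuracy(v, x_min) < threshold)
    return 100.0 * t / len(final_values)
```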

5.2. Results on Classical Benchmark Functions. In this section, a set of simulation tests was performed to verify the effectiveness of HGGWA. The 10 high-dimensional benchmark functions in Table 2 are optimized by the HGGWA algorithm proposed in this paper, and the results are compared with WOA, SSA, and ALO. In addition, the running times of the algorithms are compared under a given convergence accuracy.

5.2.1. Parameter Settings for the Compared Algorithms. In all experiments, the common parameters, such as the maximum number of iterations ($Max_{iter}$), the dimension of the functions (D), and the population size (N), were chosen to be the same. For all test problems, we focus on investigating the optimization performance of the proposed method on problems with D = 100, 500, and 1000. The maximum number of iterations is set to 1000 and the population size is set to 50. The parameter settings of WOA, SSA, and ALO are derived from the original literature [70–72]. The optimal parameter settings of the HGGWA algorithm proposed in this paper are shown in Table 3.

For each experiment of an algorithm on a benchmark function, 30 independent runs are performed to obtain a fair comparison among the different algorithms.


Table 3: Parameter settings of the HGGWA.

| Parameter | Definition | Value |
| N | Population size | 50 |
| $P_c$ | Crossover probability | 0.8 |
| $P_m$ | Mutation probability | 0.01 |
| $Max_{iter}$ | Maximum number of iterations | 1000 |
| $k$ | Nonlinear adjustment coefficient | 0.5 |

The optimal value, the worst value, the average value, and the standard deviation of each algorithm's optimization are recorded, and then the optimization success ratio (SR) is calculated. All programs were coded in Matlab 2017b (Win64) and executed on a Lenovo computer with an Intel(R) Core i5-6300HQ 2.30 GHz CPU and 8 GB RAM under the Windows 10 operating system.

5.2.2. Comparison with State-of-the-Art Metaheuristic Algorithms. In order to verify the efficiency of the HGGWA algorithm, several recently proposed state-of-the-art metaheuristic algorithms are compared with it: the Whale Optimization Algorithm (WOA), the Salp Swarm Algorithm (SSA), and the Ant Lion Optimization Algorithm (ALO). The test was performed using the same 10 high-dimensional functions in Table 2, and the comparison results are shown in Table 4.

From Table 4, it can be seen that the convergence accuracy and success ratio of the HGGWA algorithm are significantly higher than those of the other three algorithms for most of the test functions. In the 100-dimensional case, the HGGWA algorithm converges to the global optimal solution for all test functions except $f_3$, and it achieves a 100% success ratio for those nine functions under the given convergence accuracy. The convergence result of the WOA algorithm is better than that of the other three algorithms for functions $f_1$ and $f_2$, and its robustness is excellent. As the function dimension increases, the convergence precision of all algorithms decreases slightly, but the optimization results of the HGGWA algorithm remain better than those of WOA, SSA, and ALO. In general, the HGGWA algorithm exhibits better optimization performance than the WOA, SSA, and ALO algorithms on the 100-, 500-, and 1000-dimensional test functions, which proves the effectiveness of the HGGWA algorithm for solving high-dimensional complex functions.

Generally speaking, the HGGWA algorithm performs better than the other algorithms on most test functions. In order to compare the convergence performance of the four algorithms more clearly, their convergence curves on the Sphere, Schwefel's 2.22, Rastrigin, Griewank, Quartic, and Ackley functions with D = 100 are shown in Figure 2. Sphere and Schwefel's 2.22 are the two most basic unimodal functions; Rastrigin and Griewank are the two most basic multimodal functions. It can be seen that HGGWA has higher precision and a faster convergence speed than the other algorithms.

5.2.3. Comparative Analysis of Running Time. To evaluate the actual runtime of the compared algorithms, including SSA, WOA, GWO, and HGGWA, their average running times (in seconds) over 30 runs on functions $f_1$, $f_2$, $f_6$, $f_7$, $f_8$, $f_9$, and $f_{10}$ are plotted in Figure 3. The convergence accuracy of each test function is given in Table 2.

Figure 3 shows the convergence behavior of the different algorithms; each point on the plot was calculated by taking the average of 30 independent runs. As can be seen from Figure 3, the algorithms behave differently under the same calculation accuracy. For the functions $f_1$, $f_2$, $f_7$, $f_8$, and $f_9$, the GWO algorithm has the fastest convergence rate, owing to the advantages of the GWO algorithm itself. However, for the test functions $f_6$ and $f_{10}$, the running time of the HGGWA algorithm is the shortest. Further analysis shows that the HGGWA algorithm adds a little computing time compared to the GWO algorithm because of the improved strategies of selection, crossover, and mutation. However, the convergence speed of the HGGWA algorithm is still better than that of the SSA and WOA algorithms. Overall, the convergence speed of the HGGWA algorithm compares well with the other algorithms. Moreover, the running time varies little across the different test functions, staying between 1 s and 3 s, which shows that the HGGWA algorithm has excellent stability.

5.3. Results on CEC'2008 Benchmark Functions. This section analyzes the effectiveness of HGGWA on the CEC'2008 large-scale benchmark functions, comparing it with four powerful algorithms that have proven effective in solving large-scale optimization problems. Experimental results are provided to analyze the performance of HGGWA for large-scale optimization problems.

We tested the proposed algorithm on the CEC'2008 functions for 100, 500, and 1000 dimensions, and the means of the best fitness values over 50 runs were recorded. While F1 (Shifted Sphere), F4 (Shifted Rastrigin), and F6 (Shifted Ackley) are separable functions, F2 (Schwefel Problem), F3 (Shifted Rosenbrock), F5 (Shifted Griewank), and F7 (Fast Fractal) are nonseparable, presenting a greater challenge to any algorithm that is prone to variable interactions. The performance of HGGWA is compared with the other algorithms CCPSO2 [31], DEwSAcc [5], MLCC [38], and EPUS-PSO [49] for 100, 500, and 1000 dimensions, respectively. The maximum number of fitness evaluations (FEs) was calculated as FEs = 5000 × D, where D is the number of dimensions. To ensure a fair comparison, the optimization results of the other four algorithms are taken directly from the original literature. The specific comparison results are shown in Table 5.

Table 5 shows the experimental results; the entries shown in bold in the original table are significantly better than the others. A general trend that can be seen is that the HGGWA algorithm achieves promising results on most of the 7 functions.

Generally speaking, HGGWA outperforms CCPSO2 on 6 out of 7 functions with 100 and 500 dimensions, respectively. Among them, the result of HGGWA is slightly worse than


Table 4: Comparison results of 10 high-dimensional benchmark functions by WOA, SSA, ALO, and HGGWA.

| Function | Dim | WOA Mean | WOA Std | WOA SR (%) | SSA Mean | SSA Std | SSA SR (%) | ALO Mean | ALO Std | ALO SR (%) | HGGWA Mean | HGGWA Std | HGGWA SR (%) |
| $f_1$ | 100 | 5.77E-96 | 4.69E-96 | 100 | 1.01E-02 | 1.24E-02 | 0 | 3.04E-01 | 2.31E-01 | 0 | 4.85E-53 | 2.41E-52 | 100 |
| $f_1$ | 500 | 7.24E-93 | 2.98E-94 | 100 | 3.24E+04 | 2.18E+04 | 0 | 7.34E+04 | 1.32E+04 | 0 | 8.35E-25 | 8.86E-25 | 100 |
| $f_1$ | 1000 | 2.05E-89 | 3.16E-90 | 100 | 1.19E+05 | 1.04E+05 | 0 | 2.48E+05 | 2.12E+05 | 0 | 1.75E-17 | 2.73E-17 | 100 |
| $f_2$ | 100 | 1.89E-99 | 2.33E-99 | 100 | 8.85E+00 | 2.37E+00 | 0 | 4.29E+02 | 3.25E+02 | 0 | 4.01E-33 | 4.34E-33 | 100 |
| $f_2$ | 500 | 1.38E-99 | 2.16E-99 | 100 | 3.37E+02 | 2.17E+02 | 0 | 2.31E+03 | 2.03E+03 | 0 | 9.58E-18 | 2.36E-18 | 100 |
| $f_2$ | 1000 | 1.80E-99 | 1.37E-98 | 100 | 8.45E+02 | 3.43E+02 | 0 | 3.25E+04 | 4.13E+04 | 0 | 2.50E-11 | 1.66E-11 | 100 |
| $f_3$ | 100 | 3.84E+01 | 2.17E+00 | 0 | 2.22E+01 | 1.05E+01 | 0 | 2.47E+01 | 1.34E+01 | 0 | 4.55E+01 | 3.69E+01 | 0 |
| $f_3$ | 500 | 9.32E+01 | 5.32E+01 | 0 | 3.27E+01 | 2.33E+01 | 0 | 3.92E+01 | 2.34E+01 | 0 | 7.22E+01 | 5.01E+00 | 0 |
| $f_3$ | 1000 | 9.86E+01 | 3.24E+01 | 0 | 3.31E+01 | 1.28E+01 | 0 | 4.93E+01 | 3.99E+01 | 0 | 8.31E+01 | 4.36E+00 | 0 |
| $f_4$ | 100 | 9.67E+01 | 3.98E+01 | 0 | 9.00E+02 | 5.16E+02 | 0 | 1.40E+03 | 1.03E+03 | 0 | 6.82E+01 | 1.25E-01 | 100 |
| $f_4$ | 500 | 4.95E+02 | 3.14E+01 | 0 | 7.18E+06 | 1.46E+06 | 0 | 2.09E+07 | 1.37E+07 | 0 | 3.67E+02 | 8.23E-01 | 0 |
| $f_4$ | 1000 | 9.91E+02 | 2.48E+02 | 0 | 2.98E+07 | 1.35E+07 | 0 | 1.52E+08 | 1.00E+07 | 0 | 9.64E+02 | 3.74E+01 | 0 |
| $f_5$ | 100 | 7.83E+05 | 2.36E+05 | 0 | 2.17E+04 | 1.03E+04 | 0 | 3.60E+04 | 2.16E+04 | 0 | 3.45E-05 | 2.77E-05 | 100 |
| $f_5$ | 500 | 1.98E+07 | 1.32E+07 | 0 | 4.29E+05 | 3.16E+05 | 0 | 1.18E+06 | 1.03E+06 | 0 | 2.67E-03 | 4.38E-03 | 80 |
| $f_5$ | 1000 | 6.68E+07 | 3.44E+07 | 0 | 2.48E+06 | 1.23E+06 | 0 | 3.72E+06 | 3.49E+06 | 0 | 8.33E+00 | 5.12E+00 | 0 |
| $f_6$ | 100 | 2.65E-04 | 1.43E-04 | 100 | 1.12E+00 | 1.03E+00 | 0 | 8.28E-01 | 2.68E-01 | 0 | 2.34E-09 | 3.17E-10 | 100 |
| $f_6$ | 500 | 3.94E-04 | 3.22E-04 | 60 | 4.97E+01 | 3.66E+01 | 0 | 1.13E+02 | 1.01E+02 | 0 | 5.73E-05 | 4.33E-06 | 100 |
| $f_6$ | 1000 | 2.16E-03 | 1.36E-03 | 0 | 3.93E+02 | 2.78E+02 | 0 | 1.77E+03 | 2.33E+03 | 0 | 1.08E-04 | 1.62E-03 | 80 |
| $f_7$ | 100 | 0.00E+00 | 0.00E+00 | 100 | 6.27E+01 | 4.36E+01 | 0 | 3.54E+02 | 3.32E+02 | 0 | 0.00E+00 | 0.00E+00 | 100 |
| $f_7$ | 500 | 0.00E+00 | 0.00E+00 | 100 | 2.04E+03 | 1.97E+03 | 0 | 3.34E+03 | 4.56E+03 | 0 | 0.00E+00 | 0.00E+00 | 100 |
| $f_7$ | 1000 | 0.00E+00 | 0.00E+00 | 100 | 5.69E+03 | 1.66E+03 | 0 | 7.19E+03 | 2.14E+03 | 0 | 5.57E-11 | 3.15E-12 | 100 |
| $f_8$ | 100 | 4.44E-15 | 3.12E-15 | 100 | 5.15E+00 | 4.13E+00 | 0 | 4.93E+00 | 3.49E+00 | 0 | 2.15E-15 | 3.48E-15 | 100 |
| $f_8$ | 500 | 4.44E-15 | 3.05E-15 | 100 | 1.26E+01 | 1.00E+01 | 0 | 1.28E+01 | 2.28E+01 | 0 | 8.74E-12 | 5.39E-12 | 100 |
| $f_8$ | 1000 | 8.88E-16 | 2.88E-15 | 100 | 1.28E+01 | 1.44E+01 | 0 | 1.46E+01 | 2.03E+01 | 0 | 1.59E-07 | 1.63E-07 | 100 |
| $f_9$ | 100 | 0.00E+00 | 0.00E+00 | 100 | 1.52E-01 | 2.44E-01 | 0 | 4.54E-01 | 3.16E-01 | 0 | 0.00E+00 | 0.00E+00 | 100 |
| $f_9$ | 500 | 0.00E+00 | 0.00E+00 | 100 | 2.82E+02 | 2.13E+02 | 0 | 3.92E+02 | 1.08E+02 | 0 | 1.84E-16 | 8.44E-17 | 100 |
| $f_9$ | 1000 | 0.00E+00 | 0.00E+00 | 100 | 1.10E+03 | 1.34E+03 | 0 | 3.61E+03 | 1.17E+03 | 0 | 2.31E-13 | 3.29E-14 | 100 |
| $f_{10}$ | 100 | 8.16E-03 | 4.32E-03 | 100 | 1.10E+01 | 2.19E+01 | 0 | 1.74E+01 | 1.33E+01 | 0 | 2.73E-07 | 3.11E-07 | 100 |
| $f_{10}$ | 500 | 2.52E-02 | 1.99E-02 | 80 | 5.16E+01 | 3.33E+01 | 0 | 3.97E+05 | 2.96E+05 | 0 | 8.12E-05 | 4.98E-06 | 100 |
| $f_{10}$ | 1000 | 1.43E-02 | 1.03E-02 | 80 | 1.00E+05 | 2.13E+04 | 0 | 6.96E+06 | 1.34E+06 | 0 | 6.33E-04 | 3.77E-03 | 100 |


Figure 2: Convergence curves of the WOA, SSA, ALO, and HGGWA algorithms (D = 100; objective function value vs. number of iterations) on (a) the Sphere function, (b) the Schwefel 2.22 function, (c) the Rastrigin function, (d) the Griewank function, (e) the Quartic function, and (f) the Ackley function.

Figure 3: (a) Comparative analysis of the average running times (in seconds) and (b) comparative analysis of the average number of iterations of SSA, WOA, GWO, and HGGWA on the benchmark functions $f_1$, $f_2$, $f_6$, $f_7$, $f_8$, $f_9$, and $f_{10}$ (D = 100).


Table 5: Comparison results of 7 CEC'2008 benchmark functions by five algorithms.

| Function | Dim | CCPSO2 Mean | CCPSO2 Std | DEwSAcc Mean | DEwSAcc Std | MLCC Mean | MLCC Std | EPUS-PSO Mean | EPUS-PSO Std | HGGWA Mean | HGGWA Std |
| F1 | 100 | 7.73E-14 | 3.23E-14 | 5.68E-14 | 0.00E+00 | 6.82E-14 | 2.32E-14 | 7.47E-01 | 1.70E-01 | 3.06E-17 | 2.93E-17 |
| F1 | 500 | 3.00E-13 | 7.96E-14 | 2.09E-09 | 4.61E-09 | 4.29E-13 | 3.31E-14 | 8.45E+01 | 6.40E+00 | 2.75E-15 | 2.13E-15 |
| F1 | 1000 | 5.18E-13 | 9.61E-14 | 8.78E-03 | 5.27E-03 | 8.45E-13 | 5.00E-14 | 5.53E+02 | 2.86E+01 | 1.06E-10 | 1.83E-11 |
| F2 | 100 | 6.08E+00 | 7.83E+00 | 8.25E+00 | 5.32E+00 | 2.52E+01 | 8.72E+00 | 1.86E+01 | 2.26E+00 | 2.71E+00 | 2.39E+00 |
| F2 | 500 | 5.79E+01 | 4.21E+01 | 7.57E+01 | 3.04E+00 | 6.66E+01 | 5.69E+00 | 4.35E+01 | 5.51E-01 | 3.12E+01 | 4.91E+00 |
| F2 | 1000 | 7.82E+01 | 4.25E+01 | 9.60E+01 | 1.81E+00 | 1.08E+02 | 4.75E+00 | 4.66E+01 | 4.00E-01 | 2.71E+01 | 5.66E+00 |
| F3 | 100 | 4.23E+02 | 8.65E+02 | 1.45E+02 | 5.84E+01 | 1.49E+02 | 5.72E+01 | 4.99E+03 | 5.35E+03 | 9.81E+01 | 2.25E-01 |
| F3 | 500 | 7.24E+02 | 1.54E+02 | 1.81E+03 | 2.74E+02 | 9.24E+02 | 1.72E+02 | 5.77E+04 | 8.04E+03 | 4.96E+02 | 6.23E-01 |
| F3 | 1000 | 1.33E+03 | 2.63E+02 | 9.14E+03 | 1.26E+03 | 1.79E+03 | 1.58E+02 | 8.37E+05 | 1.52E+05 | 9.97E+02 | 3.58E+01 |
| F4 | 100 | 3.98E-02 | 1.99E-01 | 4.37E+00 | 7.65E+00 | 4.38E-13 | 9.21E-14 | 4.71E+02 | 5.94E+01 | 0.00E+00 | 0.00E+00 |
| F4 | 500 | 3.98E-02 | 1.99E-01 | 3.64E+02 | 5.23E+01 | 1.79E-11 | 6.31E-11 | 3.49E+03 | 1.12E+02 | 0.00E+00 | 0.00E+00 |
| F4 | 1000 | 1.99E-01 | 4.06E-01 | 1.82E+03 | 1.37E+02 | 1.37E-10 | 3.37E-10 | 7.58E+03 | 1.51E+02 | 4.93E-11 | 2.87E-12 |
| F5 | 100 | 3.45E-03 | 4.88E-03 | 3.07E-14 | 7.86E-15 | 3.41E-14 | 1.16E-14 | 3.72E-01 | 5.60E-02 | 0.00E+00 | 0.00E+00 |
| F5 | 500 | 1.18E-03 | 4.61E-03 | 6.90E-04 | 2.41E-03 | 2.12E-13 | 2.47E-14 | 1.64E+00 | 4.69E-02 | 2.16E-16 | 7.56E-17 |
| F5 | 1000 | 1.18E-03 | 3.27E-03 | 3.58E-03 | 5.73E-03 | 4.18E-13 | 2.78E-14 | 5.89E+00 | 3.91E-01 | 5.49E-13 | 2.35E-14 |
| F6 | 100 | 1.44E-13 | 3.06E-14 | 1.13E-13 | 1.53E-14 | 1.11E-13 | 7.86E-15 | 2.06E+00 | 4.40E-01 | 3.21E-15 | 2.48E-15 |
| F6 | 500 | 5.34E-13 | 8.61E-14 | 4.80E-01 | 5.73E-01 | 5.34E-13 | 7.01E-14 | 6.64E+00 | 4.49E-01 | 6.74E-12 | 4.39E-12 |
| F6 | 1000 | 1.02E-12 | 1.68E-13 | 2.29E+00 | 2.98E-01 | 1.06E-12 | 7.68E-14 | 1.89E+01 | 2.49E+00 | 2.39E-07 | 2.13E-07 |
| F7 | 100 | -1.50E+03 | 1.04E+01 | -1.37E+03 | 2.46E+01 | -1.54E+03 | 2.52E+00 | -8.55E+02 | 1.35E+01 | -7.00E+02 | 6.64E-01 |
| F7 | 500 | -7.23E+03 | 4.16E+01 | -5.74E+03 | 1.83E+02 | -7.43E+03 | 8.03E+00 | -3.51E+03 | 2.10E+01 | -8.83E+03 | 3.54E+00 |
| F7 | 1000 | -1.43E+04 | 8.27E+01 | -1.05E+04 | 4.18E+02 | -1.47E+04 | 1.51E+01 | -6.62E+03 | 3.18E+01 | -3.69E+04 | 2.16E+02 |

that of CCPSO2 on function F7 in the 100-dimensional case, and the two algorithms obtain similar optimization results on function F6 in the 500-dimensional case. In addition, at first glance it might seem that HGGWA and MLCC have similar performance; however, HGGWA achieves better convergence accuracy than MLCC under the same number of iterations. In particular, the optimization results of the HGGWA algorithm are superior to those of DEwSAcc and EPUS-PSO in all dimensions. The results of HGGWA scale very well from 100-D to 1000-D on F1, F4, F5, and F6, outperforming DEwSAcc and EPUS-PSO in all cases.

5.4. Sensitivity Analysis of Nonlinear Adjustment Coefficient k. This section mainly discusses the effect of the nonlinear adjustment coefficient on the convergence performance of the algorithm. In the HGGWA algorithm, the role of parameter $a$ is to balance the global exploration and local exploitation capabilities. In (12), the nonlinear adjustment coefficient $k$ is a key parameter, mainly used to control the range of variation of the convergence factor. Therefore, four different values, $k$ = 0.5, 1, 1.5, and 2, were selected for numerical experiments to analyze the effect of the nonlinear adjustment coefficient on the performance of the algorithm. The comparison results are shown in Table 6, where the bold entries of the original table represent the best result among the settings.

In general, the value of the nonlinear adjustment coefficient $k$ has no significant effect on the performance of the HGGWA algorithm. On closer inspection, one can see that except for function $f_3$, the optimization performance of the algorithm is best when the nonlinear adjustment coefficient $k$ is 0.5; for function $f_3$, $k$ = 1.5 is the best choice. For functions $f_7$ and $f_9$, the HGGWA algorithm obtains the optimal solution 0 for all tested values of the coefficient $k$. Therefore, for most test functions, the optimal value of the nonlinear adjustment coefficient $k$ is 0.5.

5.5. Global Convergence Analysis of HGGWA. In the HGGWA algorithm, the following four improved strategies are used to ensure the global convergence of the algorithm: (1) an initial population strategy based on Opposition-Based Learning; (2) a selection operation based on optimal retention; (3) a crossover operation based on the population partitioning mechanism; and (4) a mutation operation for elite individuals.

The idea of solving high-dimensional function optimization problems with the HGGWA algorithm is as follows. Firstly, regarding population initialization, the basic GWO generates the initial population in a random manner; this greatly affects the search efficiency of the algorithm for high-dimensional function optimization problems, since the algorithm cannot effectively search the entire solution space if the initial grey wolf population is clustered in a small range. Therefore, the Opposition-Based Learning strategy is used to initialize the population, which makes the initial population evenly distributed in the solution space, laying a good foundation for the global search of the algorithm. Secondly, the optimal retention strategy is used to preserve the optimal individual of the parent population during population evolution; as a result, the next generation can evolve in the optimal direction. In addition, individuals with higher fitness are selected by the roulette wheel operation, which maintains the dominance relationship between individuals in the population; this dominance relationship gives the algorithm good global convergence. Thirdly, the crossover operation is completed under the condition of population partitioning, in order to achieve dimensionality reduction and maintain the diversity of the population. Finally, the search direction of the grey wolf population is guided by the mutation operation on the elite individuals, which effectively prevents the algorithm from falling into a locally optimal solution. All in all, through these four improved strategies, the HGGWA algorithm shows good global convergence in solving high-dimensional function optimization problems.
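For reference, a minimal sketch of the Opposition-Based Learning initialization described above (the exact OBL variant used by HGGWA may differ in detail, and the helper name obl_initialize is ours):

```python
import numpy as np

def obl_initialize(n, d, lower, upper, cost_fn):
    """OBL initialization sketch: draw a random population, form its
    opposite population x_opp = lower + upper - x, and keep the n best
    of the 2n candidates (cost_fn maps an (m, d) array to m costs)."""
    pop = lower + np.random.rand(n, d) * (upper - lower)
    opp = lower + upper - pop                  # opposite points
    candidates = np.vstack([pop, opp])
    best = np.argsort(cost_fn(candidates))[:n]
    return candidates[best]
```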

6. Conclusion

In order to overcome the shortcoming of the GWO algorithm that it easily falls into local optima when solving large-scale global optimization problems, this paper proposes a Hybrid Genetic Grey Wolf Algorithm (HGGWA) by integrating three genetic operators into the algorithm. The improved algorithm initializes the population based on the Opposition-Based Learning strategy to improve the search efficiency of the algorithm. The strategy of population division based on cooperative coevolution reduces the scale of the problem, and the mutation operation on the elite individuals effectively prevents the algorithm from falling into local optima. The performance of HGGWA has been evaluated using 10 classical benchmark functions and 7 CEC'2008 high-dimensional functions. From our experimental results, several conclusions can be drawn.

The results have shown that the algorithm is capable of grouping interacting variables with great accuracy for the majority of the benchmark functions. A comparative study between HGGWA and other state-of-the-art algorithms was conducted, and the experimental results show that the HGGWA algorithm exhibits better global convergence, whether solving separable or nonseparable functions.

(1) In tests on 10 classical benchmark functions, compared with the results of WOA, SSA, and ALO, the HGGWA algorithm is greatly improved in convergence accuracy. Except for the Schwefel problem and the Rosenbrock function, the HGGWA algorithm achieves an 80%–100% success ratio on all benchmark functions with 100 dimensions, which proves the effectiveness of HGGWA for solving high-dimensional functions.

(2) Comparing the optimization results on the seven CEC'2008 large-scale global optimization problems, HGGWA achieves the global optimal value for the separable functions F1, F4, and F6, while its performance on the other, nonseparable problems is not fully satisfactory. However, the HGGWA algorithm still exhibits better global convergence than the algorithms CCPSO2, DEwSAcc, MLCC, and EPUS-PSO on most of the functions.


Table 6: Comparison of optimization performance of different k values for HGGWA.

| Function | k = 0.5 Mean | k = 0.5 Std | k = 1 Mean | k = 1 Std | k = 1.5 Mean | k = 1.5 Std | k = 2 Mean | k = 2 Std |
| $f_1$ | 8.43E-60 | 5.36E-60 | 2.25E-54 | 2.03E-54 | 5.58E-53 | 4.31E-53 | 2.31E-51 | 3.16E-51 |
| $f_2$ | 2.89E-34 | 2.31E-34 | 2.06E-33 | 1.89E-33 | 4.70E-32 | 3.65E-32 | 5.76E-30 | 4.22E-30 |
| $f_3$ | 1.11E+00 | 1.02E+00 | 3.14E+01 | 2.19E+01 | 4.03E-01 | 2.38E-01 | 3.60E+01 | 2.16E+01 |
| $f_4$ | 9.78E+01 | 8.36E+01 | 9.82E+01 | 7.43E+01 | 9.81E+01 | 7.34E+01 | 9.83E+01 | 4.61E+01 |
| $f_5$ | 2.18E-03 | 1.34E-03 | 8.87E-02 | 6.67E-02 | 7.81E+00 | 6.13E+00 | 1.74E-02 | 2.34E-02 |
| $f_6$ | 1.31E-03 | 1.13E-03 | 1.62E-03 | 2.43E-04 | 2.99E-03 | 1.04E-03 | 1.84E-03 | 1.34E-03 |
| $f_7$ | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 |
| $f_8$ | 1.58E-14 | 1.23E-15 | 2.93E-14 | 1.37E-14 | 2.22E-14 | 1.49E-14 | 2.51E-14 | 1.08E-14 |
| $f_9$ | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 | 0.00E+00 |
| $f_{10}$ | 2.99E-02 | 2.49E-02 | 3.38E-02 | 1.24E-03 | 8.42E-02 | 6.14E-02 | 7.29E-02 | 8.13E-03 |


(3) An analysis of the running times of the algorithms shows that the convergence time of the HGGWA algorithm has obvious advantages over recently proposed algorithms such as SSA and WOA. For the functions $f_1$, $f_2$, $f_6$, $f_7$, $f_8$, $f_9$, and $f_{10}$, the running time of the HGGWA algorithm stays between 1 s and 3 s under the specified convergence accuracy, which shows the excellent stability of the HGGWA.

(4) In the future, we plan to investigate more efficient population partitioning mechanisms to adapt to different nonseparable problems in the crossover operation. We are also interested in applying HGGWA to real-world problems to ascertain its true potential as a valuable optimization technique for large-scale optimization, such as the setup of large-scale multilayer sensor networks [73] in the Internet of Things.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Authors' Contributions

Qinghua Gu and Xuexian Li contributed equally to this work.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Grant Nos. 51774228 and 51404182), the Natural Science Foundation of Shaanxi Province (2017JM5043), and the Foundation of Shaanxi Educational Committee (17JK0425).

References

[1] R. Zhang and C. Wu, "A hybrid approach to large-scale job shop scheduling," Applied Intelligence, vol. 32, no. 1, pp. 47–59, 2010.
[2] M. Gendreau and C. D. Tarantilis, Solving Large-Scale Vehicle Routing Problems with Time Windows: The State-of-the-Art, CIRRELT, Montreal, 2010.
[3] M. B. Liu, S. K. Tso, and Y. Cheng, "An extended nonlinear primal-dual interior-point algorithm for reactive-power optimization of large-scale power systems with discrete control variables," IEEE Power Engineering Review, vol. 22, no. 9, p. 56, 2002.
[4] I. Fister, I. Fister Jr., J. Brest, and V. Zumer, "Memetic artificial bee colony algorithm for large-scale global optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '12), pp. 1–8, IEEE, Brisbane, Australia, June 2012.
[5] A. Zamuda, J. Brest, B. Bosovic, and V. Zumer, "Large scale global optimization using differential evolution with self-adaptation and cooperative co-evolution," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), IEEE, June 2008.
[6] T. Weise, R. Chiong, and K. Tang, "Evolutionary optimization: pitfalls and booby traps," Journal of Computer Science and Technology, vol. 27, no. 5, pp. 907–936, 2012.
[7] L. Lafleur, Descartes: Discourse on Method, Pearson Schweiz Ag, Zug, Switzerland, 1956.
[8] Y.-S. Ong, M. H. Lim, and X. Chen, "Memetic computation—past, present & future [research frontier]," IEEE Computational Intelligence Magazine, vol. 5, no. 2, pp. 24–31, 2010.
[9] N. Krasnogor and J. Smith, "A tutorial for competent memetic algorithms: model, taxonomy, and design issues," IEEE Transactions on Evolutionary Computation, vol. 9, no. 5, pp. 474–488, 2005.
[10] S. Jiang, M. Lian, C. Lu, Q. Gu, S. Ruan, and X. Xie, "Ensemble prediction algorithm of anomaly monitoring based on big data analysis platform of open-pit mine slope," Complexity, vol. 2018, Article ID 1048756, 13 pages, 2018.
[11] R. S. Rahnamayan, H. R. Tizhoosh, and M. M. A. Salama, "Opposition-based differential evolution," IEEE Transactions on Evolutionary Computation, vol. 12, no. 1, pp. 64–79, 2008.
[12] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.
[13] E. Daniel, J. Anitha, K. K. Kamaleshwaran, and I. Rani, "Optimum spectrum mask based medical image fusion using gray wolf optimization," Biomedical Signal Processing and Control, vol. 34, pp. 36–43, 2017.
[14] A. A. M. El-Gaafary, Y. S. Mohamed, A. M. Hemeida, and A.-A. A. Mohamed, "Grey wolf optimization for multi input multi output system," Universal Journal of Communications and Network, vol. 3, no. 1, pp. 1–6, 2015.
[15] G. M. Komaki and V. Kayvanfar, "Grey wolf optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109–120, 2015.
[16] S. Jiang, M. Lian, C. Lu et al., "SVM-DS fusion based soft fault detection and diagnosis in solar water heaters," Energy Exploration & Exploitation, 2018.
[17] Q. Gu, X. Li, C. Lu et al., "Hybrid genetic grey wolf algorithm for high dimensional complex function optimization," Control and Decision, pp. 1–8, 2019.
[18] M. A. Potter and K. A. De Jong, "A cooperative coevolutionary approach to function optimization," in Parallel Problem Solving from Nature—PPSN III, vol. 866 of Lecture Notes in Computer Science, pp. 249–257, Springer, Berlin, Germany, 1994.
[19] M. A. Potter, The Design and Analysis of a Computational Model of Cooperative Coevolution, George Mason University, Fairfax, Va, USA, 1997.
[20] F. van den Bergh and A. P. Engelbrecht, "A cooperative approach to particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225–239, 2004.
[21] M. El-Abd, "A cooperative approach to the artificial bee colony algorithm," in Proceedings of the 2010 IEEE World Congress on Computational Intelligence (WCCI 2010)—2010 IEEE Congress on Evolutionary Computation (CEC 2010), IEEE, Spain, July 2010.
[22] Y. Shi, H. Teng, and Z. Li, "Cooperative co-evolutionary differential evolution for function optimization," in Proceedings of the 1st International Conference on Natural Computation (ICNC '05), Lecture Notes in Computer Science, pp. 1080–1088, August 2005.
[23] Z. Yang, K. Tang, and X. Yao, "Large scale evolutionary optimization using cooperative coevolution," Information Sciences, vol. 178, no. 15, pp. 2985–2999, 2008.
[24] X. Li and X. Yao, "Tackling high dimensional nonseparable optimization problems by cooperatively coevolving particle swarms," in Proceedings of the Congress on Evolutionary Computation (CEC '09), pp. 1546–1553, IEEE, Trondheim, Norway, May 2009.
[25] M. N. Omidvar, X. Li, Z. Yang, and X. Yao, "Cooperative co-evolution for large scale optimization through more frequent random grouping," in Proceedings of the 2010 IEEE World Congress on Computational Intelligence (WCCI 2010)—2010 IEEE Congress on Evolutionary Computation (CEC 2010), Spain, July 2010.
[26] T. Ray and X. Yao, "A cooperative coevolutionary algorithm with correlation based adaptive variable partitioning," in Proceedings of the 2009 IEEE Congress on Evolutionary Computation (CEC), pp. 983–989, IEEE, Trondheim, Norway, May 2009.
[27] W. Chen, T. Weise, Z. Yang, and K. Tang, "Large-scale global optimization using cooperative coevolution with variable interaction learning," in Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 6239, pp. 300–309, 2010.
[28] M. N. Omidvar, X. Li, Y. Mei, and X. Yao, "Cooperative co-evolution with differential grouping for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 18, no. 3, pp. 378–393, 2014.
[29] M. N. Omidvar, X. Li, and X. Yao, "Smart use of computational resources based on contribution for cooperative co-evolutionary algorithms," in Proceedings of the 13th Annual Genetic and Evolutionary Computation Conference (GECCO '11), pp. 1115–1122, IEEE, Ireland, July 2011.
[30] R. Storn, "On the usage of differential evolution for function optimization," in Proceedings of the Biennial Conference of the North American Fuzzy Information Processing Society (NAFIPS '96), pp. 519–523, June 1996.
[31] X. Li and X. Yao, "Cooperatively coevolving particle swarms for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 16, no. 2, pp. 210–224, 2012.
[32] K. Weicker and N. Weicker, "On the improvement of coevolutionary optimizers by learning variable interdependencies," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC 1999), pp. 1627–1632, IEEE, July 1999.
[33] E. Sayed, D. Essam, and R. Sarker, "Using hybrid dependency identification with a memetic algorithm for large scale optimization problems," in Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 7673, pp. 168–177, 2012.
[34] Y. Liu, X. Yao, Q. Zhao, and T. Higuchi, "Scaling up fast evolutionary programming with cooperative coevolution," in Proceedings of the 2001 Congress on Evolutionary Computation, pp. 1101–1108, IEEE, Seoul, South Korea, 2001.
[35] E. Sayed, D. Essam, and R. Sarker, "Dependency identification technique for large scale optimization problems," in Proceedings of the 2012 IEEE Congress on Evolutionary Computation (CEC 2012), Australia, June 2012.
[36] Y. Ren and Y. Wu, "An efficient algorithm for high-dimensional function optimization," Soft Computing, vol. 17, no. 6, pp. 995–1004, 2013.
[37] J. Liu and K. Tang, "Scaling up covariance matrix adaptation evolution strategy using cooperative coevolution," in Intelligent Data Engineering and Automated Learning—IDEAL 2013, vol. 8206 of Lecture Notes in Computer Science, pp. 350–357, Springer, Berlin, Germany, 2013.
[38] Z. Yang, K. Tang, and X. Yao, "Multilevel cooperative coevolution for large scale optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), pp. 1663–1670, June 2008.
[39] M. N. Omidvar, Y. Mei, and X. Li, "Effective decomposition of large-scale separable continuous functions for cooperative co-evolutionary algorithms," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC 2014), pp. 1305–1312, China, July 2014.
[40] S. Mahdavi, E. M. Shiri, and S. Rahnamayan, "Cooperative co-evolution with a new decomposition method for large-scale optimization," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC), 2014.
[41] B. Kazimipour, M. N. Omidvar, X. Li, and A. K. Qin, "A sensitivity analysis of contribution-based cooperative co-evolutionary algorithms," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2015), pp. 417–424, Japan, May 2015.
[42] Z. Cao, L. Wang, Y. Shi et al., "An effective cooperative coevolution framework integrating global and local search for large scale optimization problems," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2015), pp. 1986–1993, Japan, May 2015.
[43] X. Peng, K. Liu, and Y. Jin, "A dynamic optimization approach to the design of cooperative co-evolutionary algorithms," Knowledge-Based Systems, vol. 109, pp. 174–186, 2016.
[44] H. K. Singh and T. Ray, "Divide and conquer in coevolution: a difficult balancing act," in Agent-Based Evolutionary Search, vol. 5 of Adaptation, Learning, and Optimization, pp. 117–138, Springer, Berlin, Germany, 2010.
[45] X. Peng and Y. Wu, "Large-scale cooperative co-evolution using niching-based multi-modal optimization and adaptive fast clustering," Swarm & Evolutionary Computation, vol. 35, 2017.
[46] M. Yang, M. N. Omidvar, C. Li et al., "Efficient resource allocation in cooperative co-evolution for large-scale global optimization," IEEE Transactions on Evolutionary Computation, vol. 21, no. 4, pp. 493–505, 2017.
[47] M. N. Omidvar, X. Li, and X. Yao, "Cooperative co-evolution with delta grouping for large scale non-separable function optimization," in Proceedings of the 2010 IEEE Congress on Evolutionary Computation (CEC 2010), pp. 1–8, Barcelona, Spain, July 2010.
[48] X. Peng and Y. Wu, "Enhancing cooperative coevolution with selective multiple populations for large-scale global optimization," Complexity, vol. 2018, Article ID 9267054, 15 pages, 2018.
[49] S.-T. Hsieh, T.-Y. Sun, C.-C. Liu, and S.-J. Tsai, "Solving large scale global optimization using improved particle swarm optimizer," in Proceedings of the 2008 IEEE Congress on Evolutionary Computation (CEC 2008), pp. 1777–1784, China, June 2008.
[50] T. Hatanaka, T. Korenaga, N. Kondo et al., Search Performance Improvement for PSO in High Dimensional Space, InTech, 2009.
[51] J. Fan, J. Wang, and M. Han, "Cooperative coevolution for large-scale optimization based on kernel fuzzy clustering and variable trust region methods," IEEE Transactions on Fuzzy Systems, vol. 22, no. 4, pp. 829–839, 2014.
[52] A.-R. Hedar and A. F. Ali, "Genetic algorithm with population partitioning and space reduction for high dimensional problems," in Proceedings of the 2009 International Conference on Computer Engineering and Systems (ICCES '09), pp. 151–156, Egypt, December 2009.
[53] J. Q. Zhang and A. C. Sanderson, "JADE: adaptive differential evolution with optional external archive," IEEE Transactions on Evolutionary Computation, vol. 13, no. 5, pp. 945–958, 2009.
[54] L. M. Hvattum and F. Glover, "Finding local optima of high-dimensional functions using direct search methods," European Journal of Operational Research, vol. 195, no. 1, pp. 31–45, 2009.
[55] C. Liu and B. Li, "Memetic algorithm with adaptive local search depth for large scale global optimization," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC 2014), pp. 82–88, China, July 2014.
[56] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257–1263, 2015.
[57] A. Zhu, C. Xu, Z. Li, J. Wu, and Z. Liu, "Hybridizing grey wolf optimization with differential evolution for global optimization and test scheduling for 3D stacked SoC," Journal of Systems Engineering and Electronics, vol. 26, no. 2, pp. 317–328, 2015.
[58] V. Chahar and D. Kumar, "An astrophysics-inspired grey wolf algorithm for numerical optimization and its application to engineering design problems," Advances in Engineering Software, vol. 112, pp. 231–254, 2017.
[59] D. Guha, P. K. Roy, and S. Banerjee, "Load frequency control of large scale power system using quasi-oppositional grey wolf optimization algorithm," Engineering Science and Technology, an International Journal, vol. 19, no. 4, pp. 1693–1713, 2016.
[60] C. Lu, S. Xiao, X. Li, and L. Gao, "An effective multi-objective discrete grey wolf optimizer for a real-world scheduling problem in welding production," Advances in Engineering Software, vol. 99, pp. 161–176, 2016.
[61] E. Emary, H. M. Zawbaa, and A. E. Hassanien, "Binary grey wolf optimization approaches for feature selection," Neurocomputing, vol. 172, pp. 371–381, 2016.
[62] W. Long, J. Jiao, X. Liang, and M. Tang, "Inspired grey wolf optimizer for solving large-scale function optimization problems," Applied Mathematical Modelling: Simulation and Computation for Engineering and Environmental Systems, vol. 60, pp. 112–126, 2018.
[63] S. Gupta and K. Deep, "Performance of grey wolf optimizer on large scale problems," in Proceedings of the Mathematical Sciences and Its Applications, American Institute of Physics Conference Series, Uttar Pradesh, India, 2017.
[64] R. L. Haupt and S. E. Haupt, "Practical genetic algorithms," Journal of the American Statistical Association, vol. 100, p. 253, 2005.
[65] H. R. Tizhoosh, "Opposition-based learning: a new scheme for machine intelligence," in Proceedings of the International Conference on Computational Intelligence for Modelling, Control and Automation (CIMCA) and International Conference on Intelligent Agents, Web Technologies and Internet Commerce (IAWTIC '05), pp. 695–701, Vienna, Austria, November 2005.
[66] H. Wang, Z. Wu, S. Rahnamayan, Y. Liu, and M. Ventresca, "Enhancing particle swarm optimization using generalized opposition-based learning," Information Sciences, vol. 181, no. 20, pp. 4699–4714, 2011.
[67] H. Wang, S. Rahnamayan, and Z. Wu, "Parallel differential evolution with self-adapting control parameters and generalized opposition-based learning for solving high-dimensional optimization problems," Journal of Parallel and Distributed Computing, vol. 73, no. 1, pp. 62–73, 2013.
[68] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.
[69] A. P. Engelbrecht, Fundamentals of Computational Swarm Intelligence, John Wiley & Sons, 2006.
[70] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.
[71] S. Mirjalili, A. H. Gandomi, S. Z. Mirjalili, S. Saremi, H. Faris, and S. M. Mirjalili, "Salp swarm algorithm: a bio-inspired optimizer for engineering design problems," Advances in Engineering Software, vol. 114, pp. 163–191, 2017.
[72] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80–98, 2015.
[73] Q. Gu, S. Jiang, M. Lian, and C. Lu, "Health and safety situation awareness model and emergency management based on multi-sensor signal fusion," IEEE Access, vol. 7, pp. 958–968, 2019.




Table 1: A summary of major variants of the CC method.

| Year | Author | Algorithm | Year | Author | Algorithm |
| 1994 | Potter and De Jong [18] | CCGA-1, CCGA-2 | 2011 | Omidvar et al. [29] | CBCC |
| 1996 | R. Storn [30] | CCDE | 2012 | Li and Yao [31] | CCPSO2 |
| 1999 | Weicker K. et al. [32] | CCLVI | 2012 | Sayed et al. [33] | HDIMA |
| 2001 | Liu et al. [34] | FEPCC | 2012 | Sayed et al. [35] | DIMA |
| 2004 | Van den Bergh et al. [20] | CPSO-SK, CPSO-HK | 2013 | Ren and Wu [36] | CCOABC |
| 2005 | Shi et al. [22] | CCDE | 2013 | Liu and Tang [37] | CC-CMA-ES |
| 2008 | Yang et al. [23] | DECC-G | 2014 | Omidvar et al. [28] | DECC-DG |
| 2008 | Yang et al. [38] | MLCC | 2014 | Omidvar et al. [39] | MLSoft |
| 2009 | Li et al. [24] | CCPSO | 2014 | Mahdavi et al. [40] | DM-HDMR |
| 2009 | Ray et al. [26] | CCEA-AVP | 2015 | Kazimipour et al. [41] | CBCC |
| 2010 | Omidvar et al. [25] | DECC-ML | 2015 | Cao et al. [42] | CC-GLS |
| 2010 | Chen et al. [27] | CCVIL | 2016 | Peng et al. [43] | mCCEA |
| 2010 | Singh et al. [44] | CCEA-AVP | 2017 | Peng et al. [45] | MMO-CC |
| 2010 | Mohammed El-Abd [21] | CABC-SC, ABC-H | 2017 | Yang et al. [46] | CCFR |
| 2011 | Omidvar et al. [47] | DECC-D | 2018 | Peng et al. [48] | CC-SMP |


a new cooperative coevolution framework that is capable of optimizing large-scale nonseparable problems. It uses a random grouping strategy to divide the problem solution space into multiple subspaces of different sizes. Li et al. [24] embedded a random grouping strategy and an adaptive weighting mechanism into the particle swarm optimization algorithm and proposed a cooperative coevolutionary particle swarm optimization (CCPSO) algorithm. The results of the random grouping method show effective performance in solving scalable nonseparable benchmark functions (up to 1000D); however, as the number of interacting variables increases, its performance degrades [25]. Therefore, some scholars have considered adding prior knowledge of the problem to the evolution process of the algorithm and have proposed many learning-based dynamic grouping strategies. Ray et al. [26] discussed the effects of problem size, the number of subpopulations, and the number of iterations on some separable and nonseparable problems in cooperative coevolutionary algorithms, and then proposed a Cooperative Coevolutionary Algorithm with Correlation based Adaptive Variable Partitioning (CCEA-AVP) to deal with scalable nonseparable problems. Chen et al. [27] proposed a CC method with Variable Interaction Learning (CCVIL), which is able to adaptively change group sizes. Considering that problems cannot be decomposed without prior knowledge, Omidvar et al. [28] proposed an automatic decomposition strategy that keeps the correlation between decision variables to a minimum.

2.2. Nondecomposition Strategy. The other class is nondecomposition methods. Different from the decomposition-based strategy, nondecomposition methods mainly study the algorithm itself and improve the corresponding operators in order to improve the performance of the algorithm in solving large-scale global optimization problems. Nondecomposition-based methods mainly include swarm intelligence, evolutionary computation, and local search-based approaches. Hsieh et al. [49] added variable particles and an information sharing mechanism to the basic particle swarm optimization algorithm and proposed a variation called the Efficient Population Utilization Strategy for Particle Swarm Optimizer (EPUS-PSO). A modified PSO algorithm with a new velocity updating method, the rotated particle swarm, was proposed in [50]; it transforms coordinates and uses information from other dimensions to maintain the diversity of each dimension. Fan et al. [51] proposed a novel particle swarm optimization approach with dynamic neighborhood, based on kernel fuzzy clustering and variable trust region methods (called FT-DNPSO), for large-scale optimization. In terms of evolutionary computation, Hedar et al. [52] modified the genetic algorithm with new strategies of population partitioning and space reduction for high-dimensional problems. In order to improve the performance of the differential evolution algorithm, Zhang et al. [53] proposed an adaptive differential evolution algorithm (called JADE) using a new mutation and adaptive control parameter strategy. Local search-based approaches have been recognized as an effective algorithmic framework for solving optimization problems. Hvattum et al. [54] introduced a new direct search method based on Scatter Search, designed to remedy the lack of a good derivative-free method for solving problems of high dimensions. Liu et al. [55] improved the local search depth parameter in the memetic algorithm and proposed an adaptive local search depth (ALSD) strategy, which can arrange the local search computing resources dynamically according to performance.

2.3. Related Work of GWO. Grey Wolf Optimizer (GWO) is a recently proposed swarm intelligence algorithm inspired by the social hierarchy and hunting behavior of wolves. It has the advantages of few control parameters, high solution accuracy, and fast convergence speed. Compared with other classical metaheuristic algorithms such as the genetic algorithm (GA), particle swarm optimization (PSO), and the gravitational search algorithm (GSA), the GWO shows powerful exploration capability and better convergence characteristics. Owing to its simplicity and ease of implementation, GWO has gained significant attention and has been applied to many practical optimization problems since its invention.

Recently, many scholars have applied the GWO to different optimization problems and have also proposed many variants in order to improve its performance. Saremi et al. [56] introduced a dynamic population updating mechanism into the GWO, removing individuals with poor fitness from the population and regenerating new individuals to replace them, in order to enhance the exploration ability of GWO. However, the diversity of the grey wolf population was reduced by this strategy, which leads the algorithm into local optima. Zhu et al. [57] integrated the idea of differential evolution into the GWO, which prevented the early stagnation of the GWO algorithm and accelerated its convergence; however, it fails to converge on high-dimensional problems. Kumar et al. [58] designed a novel variant of GWO for numerical optimization and engineering design problems; they utilized the prey weight and an astrophysics-based learning concept to enhance the exploration ability, but this method also does not perform well on high-dimensional function problems. Regarding applications of the GWO algorithm, Guha et al. [59] applied the grey wolf algorithm to the load control problem in large-scale power systems. In literature [60], a multiobjective discrete grey wolf algorithm was proposed for the welding job scheduling problem, and the results show that the algorithm can solve scheduling problems in practical engineering. Emary et al. [61] proposed a new binary grey wolf optimization algorithm for the selection of the best feature subset.

The swarm intelligence optimization algorithm has natural parallelism, so it has advantages in solving large-scale optimization problems. In order to study the ability of the grey wolf optimization algorithm to solve high-dimensional functions, Long et al. [62] proposed a nonlinear adjustment strategy for the convergence factor, which balanced well the exploration and exploitation of the algorithm; but this strategy still does not solve the high-dimensional function optimization problem well. In 2017, Gupta et al. [63] studied the performance of the GWO algorithm on 100- to 1000-dimensional function optimization problems. The results indicate that GWO is a powerful nature-inspired optimization algorithm for large-scale problems, except on the Rosenbrock function.

In summary, the decomposition-based CC strategy has a good effect in solving nonseparable problems, but it cannot handle the imbalance problem well, and the accuracy of its solutions still needs improvement. In addition, among the methods based on the nondecomposition strategy, although the performance of many algorithms has been greatly improved, research on hybrid algorithms is still rare. Therefore, this paper combines the three genetic operators of selection, crossover, and mutation with the GWO algorithm and proposes a Hybrid Genetic Grey Wolf Algorithm (HGGWA). The main improvement strategies of the algorithm are as follows: (1) initialize the population using the Opposition-Based Learning strategy; (2) nonlinearly adjust the parameters to effectively balance exploration and exploitation capabilities while accelerating the convergence speed of the algorithm; (3) select the population through a combination of the elite retention strategy and roulette wheel selection; (4) based on dimensionality reduction and population division, divide the whole population into multiple subpopulations for crossover operations to increase the diversity of the population; (5) perform mutation operations on elite individuals in the population to prevent the algorithm from falling into local optima.

3. Grey Wolf Optimization Algorithm

3.1. Mathematical Models. In the GWO algorithm [12], the positions of the α wolf, β wolf, and δ wolf are the fittest solution, the second-best solution, and the third-best solution, respectively, and the positions of the ω wolves are the remaining candidate solutions. In the whole process of searching for prey in the space, the ω wolves gradually update their positions according to the optimal positions of the α, β, and δ wolves, so they can gradually approach and capture the prey, that is, complete the process of searching for the optimal solution. The position updating in GWO is illustrated in Figure 1.

The grey wolves' position update formula in this process is as follows:

$$\vec{D} = |\vec{C} \cdot \vec{X}_p(t) - \vec{X}(t)| \quad (1)$$

$$\vec{X}(t+1) = \vec{X}_p(t) - \vec{A} \cdot \vec{D} \quad (2)$$

where $\vec{D}$ indicates the distance between the grey wolf individual and the prey, $t$ indicates the current iteration, and $\vec{A}$ and $\vec{C}$ are coefficient vectors: $\vec{A}$ is a convergence coefficient used to balance exploration and exploitation, and $\vec{C}$ is used to simulate the effects of nature. $\vec{X}_p$ is the position vector of the prey, and $\vec{X}$ indicates the position vector of a grey wolf.

The coefficient vectors $\vec{A}$ and $\vec{C}$ are calculated as follows:

$$\vec{A} = 2\vec{a} \cdot \vec{r}_1 - \vec{a} \quad (3)$$

$$\vec{C} = 2\vec{r}_2 \quad (4)$$

where the components of $\vec{a}$ are linearly decreased from 2 to 0 over the course of iterations, and $\vec{r}_1$ and $\vec{r}_2$ are random vectors in $[0, 1]$. The fluctuation range of the coefficient vector $\vec{A}$ gradually decreases as $\vec{a}$ decreases.

Thus, according to (1) and (2), the grey wolf individual randomly moves within the search space to gradually approach the optimal solution. The specific position update formulas are as follows:

$$\vec{D}_\alpha = |\vec{C}_1 \cdot \vec{X}_\alpha - \vec{X}| \quad (5)$$
$$\vec{D}_\beta = |\vec{C}_2 \cdot \vec{X}_\beta - \vec{X}| \quad (6)$$
$$\vec{D}_\delta = |\vec{C}_3 \cdot \vec{X}_\delta - \vec{X}| \quad (7)$$
$$\vec{X}_1 = \vec{X}_\alpha - \vec{A}_1 \cdot \vec{D}_\alpha \quad (8)$$
$$\vec{X}_2 = \vec{X}_\beta - \vec{A}_2 \cdot \vec{D}_\beta \quad (9)$$
$$\vec{X}_3 = \vec{X}_\delta - \vec{A}_3 \cdot \vec{D}_\delta \quad (10)$$
$$\vec{X}(t+1) = \frac{\vec{X}_1 + \vec{X}_2 + \vec{X}_3}{3} \quad (11)$$

where $\vec{D}_\alpha$, $\vec{D}_\beta$, and $\vec{D}_\delta$ indicate the distances between the α, β, and δ wolves and the prey, respectively; $\vec{X}_1$, $\vec{X}_2$, and $\vec{X}_3$ represent the positional components of the remaining grey wolves based on the positions of the α, β, and δ wolves, respectively; and $\vec{X}(t+1)$ represents the position vector of the grey wolf after the update.

3.2. Basic Process of GWO

Step 1. Randomly generate an initial population and initialize parameters $\vec{a}$, $\vec{A}$, and $\vec{C}$.

Step 2. Calculate the fitness of each grey wolf individual and save the positions of the three individuals with the highest fitness as $\vec{X}_\alpha$, $\vec{X}_\beta$, and $\vec{X}_\delta$.

Step 3. Update the position of each grey wolf according to (5) to (11), thereby obtaining the next generation population, and then update the values of parameters $\vec{a}$, $\vec{A}$, and $\vec{C}$.

Step 4. Calculate the fitness of the individuals in the new population and update $\vec{X}_\alpha$, $\vec{X}_\beta$, and $\vec{X}_\delta$.

Step 5. Repeat Steps 2-4 until the optimal solution is obtained or the maximum number of iterations is reached.
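For readers who prefer code to pseudocode, the following is a minimal NumPy sketch of the standard GWO loop assembled from equations (1)-(11) and Steps 1-5. The function name, default bounds, and parameter values are illustrative assumptions of this sketch, not part of the original algorithm description.

```python
import numpy as np

def gwo_minimize(f, dim, n_agents=50, max_iter=200, lb=-100.0, ub=100.0):
    """Minimal sketch of the standard GWO loop, equations (1)-(11)."""
    X = np.random.uniform(lb, ub, (n_agents, dim))       # Step 1: random pack
    for t in range(max_iter):
        fitness = np.array([f(x) for x in X])            # Steps 2/4: evaluate
        leaders = X[np.argsort(fitness)[:3]].copy()      # alpha, beta, delta
        a = 2.0 * (1 - t / max_iter)                     # linear decrease 2 -> 0
        for i in range(n_agents):                        # Step 3: move each wolf
            x_new = np.zeros(dim)
            for leader in leaders:
                r1, r2 = np.random.rand(dim), np.random.rand(dim)
                A = 2.0 * a * r1 - a                     # eq. (3)
                C = 2.0 * r2                             # eq. (4)
                D = np.abs(C * leader - X[i])            # eqs. (5)-(7)
                x_new += leader - A * D                  # eqs. (8)-(10)
            X[i] = np.clip(x_new / 3.0, lb, ub)          # eq. (11)
    fitness = np.array([f(x) for x in X])
    return X[np.argmin(fitness)], fitness.min()

# usage: 100-dimensional Sphere function
best_x, best_f = gwo_minimize(lambda x: np.sum(x**2), dim=100)
```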

4. Hybrid Genetic Grey Wolf Algorithm

This section describes the details of HGGWA, the hybrid algorithm proposed in this paper. Firstly, the initial population strategy based on Opposition-Based Learning is


[Figure 1: Position updating in GWO. The ω wolves move toward the estimated position of the prey given by the α, β, and δ wolves.]

described in detail in Section 4.1. Secondly, Section 4.2 introduces the nonlinear adjustment strategy for parameter $a$. Then the three operations of selection, crossover, and mutation are described in Sections 4.3-4.5, respectively. Finally, the time complexity of HGGWA is analyzed in Section 4.6.

4.1. Initial Population Strategy Based on Opposition-Based Learning. For swarm intelligence optimization algorithms based on population iteration, the diversity of the initial population lays the foundation for the efficiency of the algorithm: better population diversity reduces computation time and improves the global convergence of the algorithm [64]. However, like other algorithms, the GWO algorithm uses random initialization when generating populations, which can impair the search efficiency of the algorithm. Opposition-Based Learning is a strategy proposed by Tizhoosh [65] to improve the efficiency of algorithmic search. It uses the opposite points of known individual positions to generate new individual positions, thereby increasing the diversity of the search population. The Opposition-Based Learning strategy has already been successfully applied to multiple swarm optimization algorithms [66, 67]. The opposite point is defined as follows.

Assume that there is an individual $X = (x_1, x_2, \ldots, x_D)$ in the population P; then the opposite point of the element $x_i$ $(i = 1, 2, \ldots, D)$ on each dimension is $x'_i = l_i + u_i - x_i$, where $l_i$ and $u_i$ are the lower bound and the upper bound of the $i$th dimension, respectively. According to this definition, the Opposition-Based Learning strategy initializes the population as follows.

(a) Randomly initialize the population P, calculate the opposite point $x'_i$ of each individual position $x_i$, and let all the opposite points constitute the opposition population P′.

(b) Calculate the fitness of each individual in the initial population P and the opposition population P′ and arrange them in descending order.

(c) Select the top N grey wolf individuals with the highest fitness as the final initial population.
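The three steps above translate directly into a few lines of NumPy. The sketch below is illustrative: the function name `obl_initialize` is ours, and since the benchmarks considered here are minimization problems, "highest fitness" is interpreted as the smallest objective value.

```python
import numpy as np

def obl_initialize(f, n, dim, lb, ub):
    """Opposition-Based Learning initialization, steps (a)-(c)."""
    P = np.random.uniform(lb, ub, (n, dim))   # (a) random population P
    P_opp = lb + ub - P                       #     opposite points x' = l + u - x
    union = np.vstack([P, P_opp])             # (b) evaluate P and P' together
    values = np.array([f(x) for x in union])
    keep = np.argsort(values)[:n]             # (c) keep the N fittest (minimization)
    return union[keep]
```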

4.2. Nonlinear Parameter Adjustment Strategy. How to balance exploration and exploitation is significantly important for swarm intelligence algorithms. In the early iterations, powerful exploration capability helps the algorithm expand the search range so that it can find the optimal solution with greater probability. In the later stage of the iteration, effective exploitation ability can speed up the optimization process and improve the convergence accuracy of the final solution. Therefore, only when a swarm optimization algorithm can properly coordinate its global exploration and local exploitation capabilities can it achieve strong robustness and fast convergence.

It can be seen from (2) and (3) that the dynamic change of the parameter $a$ plays a crucial role in the algorithm. In the basic GWO, the linear decrement strategy for parameter $a$ does not reflect the actual convergence process of the algorithm; therefore, the algorithm does not balance the exploration and exploitation capabilities well, especially when solving large-scale multimodal function problems. In order to dynamically adjust the global exploration and local exploitation processes of the algorithm, this paper proposes a nonlinear adjustment strategy for the parameter $a$, which helps HGGWA search for the optimal solution. The improved nonlinear parameter adjustment equation is as follows:

$$a = a_{max} + (a_{min} - a_{max}) \cdot \left(\frac{1}{1 + e^{t/Max_{iter}}}\right)^k \quad (12)$$

where $a_{max}$ and $a_{min}$ are the initial and final values of the parameter $a$, $t$ is the current iteration index, $Max_{iter}$ is the maximum number of iterations, and $k$ is the nonlinear adjustment coefficient.
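A direct transcription of equation (12) as reconstructed above; the exact sign of the exponent is hard to recover from the source text, so treat this sketch as an assumption rather than the definitive schedule.

```python
import numpy as np

def nonlinear_a(t, max_iter, a_max=2.0, a_min=0.0, k=0.5):
    """Nonlinear adjustment of parameter a, equation (12).

    a_max/a_min are assumed to be 2 and 0 as in the basic GWO;
    k = 0.5 is the value recommended in Section 5.4.
    """
    return a_max + (a_min - a_max) * (1.0 / (1.0 + np.exp(t / max_iter))) ** k
```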

Thus, the variation range of the convergence factor $\vec{A}$ is controlled by the nonlinear adjustment of the parameter $a$. When $|A| > 1$, the grey wolf population expands the search range to find better prey, which corresponds to the global exploration of the algorithm; when $|A| < 1$, the whole population shrinks the search range, and an encirclement is formed around the prey to complete the final attack, which corresponds to the local exploitation process of the algorithm. The whole process has a positive effect on balancing global search and local search, which helps improve the accuracy of the solution and further accelerates the convergence of the algorithm.

4.3. Selection Operation Based on Optimal Retention. For intelligent optimization algorithms based on population evolution, the evolution of each generation directly affects the optimization effect of the algorithm. In order to pass the superior individuals of the parent population to the next generation without being destroyed, it is necessary to preserve the good individuals directly. The optimal retention selection strategy is an effective way to preserve good individuals in genetic algorithms, and this paper integrates it into the GWO to improve the efficiency of the algorithm. Assume that the current population is $P(X_1, X_2, \ldots, X_D)$ and the fitness of individual $X_i$ is $F_i$. The specific operation is as follows.

(a) Calculate the fitness $F_i$ $(i = 1, 2, \ldots, D)$ of each individual and arrange them in descending order.

(b) Select the individual with the highest fitness and copy it directly into the next generation population.

(c) Calculate the total fitness S of the remaining individuals and the probability $p_i$ that each individual is selected:

$$S = \sum_{i=1}^{D-1} F_i \quad (13)$$

$$p_i = \frac{F_i}{\sum_{i=1}^{D-1} F_i} \quad (i = 1, 2, \ldots, D-1) \quad (14)$$

(d) Calculate the cumulative fitness value $s_i$ for each individual, and then perform the selection operation by roulette wheel until the number of individuals in the child population matches that of the parent population:

$$s_i = \frac{\sum_{j=1}^{i} F_j}{S} \quad (i = 1, 2, \ldots, D-1) \quad (15)$$
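Steps (a)-(d) can be sketched as follows. The helper name `select` and the use of a positive, maximization-style fitness array are assumptions of this sketch.

```python
import numpy as np

def select(population, fitness, rng=np.random.default_rng()):
    """Optimal-retention selection, steps (a)-(d) with eqs. (13)-(15).

    `fitness` is assumed positive with larger values better, matching
    the fitness F_i of the text.
    """
    order = np.argsort(fitness)[::-1]          # (a) sort by descending fitness
    elite = population[order[0]]               # (b) copy the best directly
    rest, rest_fit = population[order[1:]], fitness[order[1:]]
    p = rest_fit / rest_fit.sum()              # (c) selection probabilities, eq. (14)
    wheel = np.cumsum(p)                       # (d) cumulative values s_i, eq. (15)
    picks = np.searchsorted(wheel, rng.random(len(rest)))
    picks = np.minimum(picks, len(rest) - 1)   # guard against float round-off
    return np.vstack([elite, rest[picks]])     # child population, same size as parent
```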

4.4. Crossover Operation Based on a Population Partitioning Mechanism. Due to the decrease of population diversity in the late stage of evolution, the GWO algorithm easily falls into local optima when solving large-scale, high-dimensional optimization problems. In order to overcome the difficulties caused by large scale and complexity and to ensure that the algorithm searches the whole solution space, a crossover operation based on population partitioning is introduced.

In the HGGWA algorithm, the whole population P is divided into $\nu \times \eta$ subpopulations $P_{ij}$ $(i = 1, \ldots, \nu;\ j = 1, \ldots, \eta)$. The optimal size of each subpopulation $P_{ij}$ was experimentally determined to be $5 \times 5$. The individuals of each subpopulation are then

cross-operated to increase the diversity of the population. The specific division method is shown below:

The initial population P:

$$P = \begin{bmatrix} x_{11} & x_{12} & \cdots & x_{1d} \\ x_{21} & x_{22} & \cdots & x_{2d} \\ \vdots & \vdots & \ddots & \vdots \\ x_{P1} & x_{P2} & \cdots & x_{Pd} \end{bmatrix}$$

Subpopulations $P_{ij}$:

$$P_{11} = \begin{bmatrix} x_{11} & \cdots & x_{15} \\ \vdots & \ddots & \vdots \\ x_{51} & \cdots & x_{55} \end{bmatrix}, \; \ldots, \; P_{1\eta} = \begin{bmatrix} x_{1(d-4)} & \cdots & x_{1d} \\ \vdots & \ddots & \vdots \\ x_{5(d-4)} & \cdots & x_{5d} \end{bmatrix}$$

$$P_{\nu 1} = \begin{bmatrix} x_{(P-4)1} & \cdots & x_{(P-4)5} \\ \vdots & \ddots & \vdots \\ x_{P1} & \cdots & x_{P5} \end{bmatrix}, \; \ldots, \; P_{\nu\eta} = \begin{bmatrix} x_{(P-4)(d-4)} & \cdots & x_{(P-4)d} \\ \vdots & \ddots & \vdots \\ x_{P(d-4)} & \cdots & x_{Pd} \end{bmatrix} \quad (16)$$

In genetic algorithms, the crossover operation plays a very important role and is the main way to generate new individuals. In the improved algorithm of this paper, each individual in a subpopulation is crossed in a linear crossover manner. A random number $r_i \in (0, 1)$ is generated for each individual $x_i$ in the subpopulation; when $r_i$ is less than the crossover probability $P_c$, the corresponding individual $x_i$ is paired for crossover.

An example is as follows: Crossover($p_1$, $p_2$):

(a) Generate a random number $\lambda \in (0, 1)$.

(b) Two children $c_1 = (c_{11}, \ldots, c_{1D})$ and $c_2 = (c_{21}, \ldots, c_{2D})$ are generated from the two parents $p_1 = (p_{11}, \ldots, p_{1D})$ and $p_2 = (p_{21}, \ldots, p_{2D})$:

$$c_{1i} = \lambda p_{1i} + (1 - \lambda) p_{2i}, \quad i = 1, 2, \ldots, D \quad (17)$$

$$c_{2i} = \lambda p_{2i} + (1 - \lambda) p_{1i}, \quad i = 1, 2, \ldots, D \quad (18)$$
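Combining the 5×5 partitioning of (16) with the linear crossover of (17)-(18) might look as follows in NumPy. The in-place update and the pairing of selected rows are implementation choices of this sketch, which assumes N and d are divisible by 5.

```python
import numpy as np

def partition_crossover(P, pc=0.8, rng=np.random.default_rng()):
    """Sketch of the 5x5 population-partitioning crossover (Section 4.4)."""
    N, d = P.shape
    for i0 in range(0, N, 5):                  # subpopulation row blocks
        for j0 in range(0, d, 5):              # subpopulation column blocks
            block = P[i0:i0 + 5, j0:j0 + 5]    # one subpopulation P_ij (a view)
            chosen = np.where(rng.random(5) < pc)[0]      # rows selected with prob. pc
            for a, b in zip(chosen[::2], chosen[1::2]):   # pair up selected rows
                lam = rng.random()
                p1, p2 = block[a].copy(), block[b].copy()
                block[a] = lam * p1 + (1 - lam) * p2      # child c1, eq. (17)
                block[b] = lam * p2 + (1 - lam) * p1      # child c2, eq. (18)
    return P
```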

4.5. Mutation Operation for Elite Individuals. Due to the selection operation based on the optimal preservation strategy, the grey wolf individuals of the whole population concentrate in a small optimal region in the later stage of the iterative process, which easily leads to a loss of population diversity. If the current optimal individual is a locally optimal solution, the algorithm is then prone to fall into a local optimum, especially when solving high-dimensional multimodal functions. To this end, this paper introduces a mutation operator into the HGGWA algorithm to perform mutation operations on the elite individuals of the population. The specific operations are as follows.

Assume that the optimal individual is $x_i = (x_1, x_2, \ldots, x_d)$ and the mutation operation is performed on $x_i$ with mutation probability $P_m$. That is, a gene $x_k$ is selected from the optimal individual with probability $P_m$ and replaced with a random number between the upper and lower bounds, generating a new individual $x'_i = (x'_1, x'_2, \ldots, x'_d)$ as shown in equation (19) below.


(1) Initialize parameters a, A, C, population size N, P_c, and P_m
(2) Initialize population P using the OBL strategy
(3) t = 0
(4) while t < Max_iter
(5)   Calculate the fitness of all search agents
(6)   X_α = the best search agent
(7)   X_β = the second best search agent
(8)   X_δ = the third best search agent
(9)   for i = 1 : N
(10)    Update the position of each search agent by equation (11)
(11)  end for
(12)  newP1 ← P except X_α
(13)  for i = 1 : N-1
(14)    Generate newP2 by roulette wheel selection on newP1
(15)  end for
(16)  P ← newP2 and X_α
(17)  Generate subpopulations subPs by the population partitioning mechanism on P
(18)  Select individuals by crossover probability P_c
(19)  for i = 1 : N·P_c
(20)    Crossover each selected search agent by equations (17) and (18)
(21)  end for
(22)  Generate X′_α, X′_β, and X′_δ by equation (19) with mutation probability P_m
(23)  t = t + 1
(24) end while
(25) Return X_α

Algorithm 1: Pseudocode of the HGGWA.

The specific operation is:

$$x'_i = \begin{cases} l + \lambda (u - l), & i = k \\ x_i, & i \neq k \end{cases} \quad (19)$$

where $\lambda$ is a random number in $[0, 1]$, and $l$ and $u$ are the lower and upper bounds of the individual $x_i$, respectively.
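Equation (19) amounts to a uniform reset of randomly chosen genes; a small sketch, assuming scalar bounds:

```python
import numpy as np

def mutate_elite(x, lb, ub, pm=0.01, rng=np.random.default_rng()):
    """Elite mutation of equation (19): each gene x_k is replaced,
    with probability pm, by l + lambda*(u - l), lambda ~ U[0, 1]."""
    x = np.array(x, dtype=float)
    mask = rng.random(x.size) < pm                 # genes picked with probability P_m
    x[mask] = lb + rng.random(mask.sum()) * (ub - lb)
    return x
```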

4.6. Pseudocode of the HGGWA and Time Complexity Analysis. This section describes how to calculate an upper bound on the total number of fitness evaluations (FEs) required by HGGWA. As shown in Algorithm 1, the computational complexity of HGGWA in one generation is mainly dominated by the position updating of all search agents in line (10). The position updating requires a time complexity of $O(N \times dim \times Max_{iter})$ (N is the population size, dim is the dimension of each benchmark function, and $Max_{iter}$ is the maximum number of iterations of HGGWA) to obtain the needed parameters and to finish the whole position updating process of all search agents. In addition, the crossover operation in line (20) is another time-consuming step. It needs a time complexity of $O(2 \times \nu \times \eta \times Max_{iter})$ ($\nu \times \eta$ is the total number of subpopulations, $\nu = N/5$, $\eta = dim/5$) to complete the crossing of the individuals in each subpopulation according to the crossover probability $P_c$. It should be noted that this is only the worst-case computational complexity: if only two individuals in each subpopulation need to be crossed, the minimum amount of computation is $O(\nu \times \eta \times Max_{iter})$.

Therefore, the maximum computational complexity caused by the two dominant processes in the algorithm is $\max\{O(N \times dim \times Max_{iter}),\ O(2 \times \nu \times \eta \times Max_{iter})\}$ ($\nu = N/5$, $\eta = dim/5$) in one generation, i.e., $O(N \times dim \times Max_{iter})$.

5. Numerical Experiments and Analysis

In this section, the proposed HGGWA algorithm is evaluated on both classical benchmark functions [68] and the suite of test functions provided by the CEC'2008 Special Session on large-scale global optimization. The algorithms used for comparison include not only conventional EAs but also other CC optimization algorithms. Experimental results are provided to analyze the performance of HGGWA in the context of large-scale optimization problems. In addition, the sensitivity analysis of the nonlinear adjustment coefficient and the global convergence of HGGWA are discussed in Sections 5.4 and 5.5, respectively.

5.1. Benchmark Functions and Performance Measures. In order to verify the ability of the HGGWA algorithm to solve high-dimensional complex functions, 10 general high-dimensional benchmark functions (100D, 500D, 1000D) were selected for the optimization tests. Among them, $f_1 \sim f_6$ are unimodal functions, used to test the local search ability of the algorithm; $f_7 \sim f_{10}$ are multimodal functions, used to test the global search ability of the algorithm. These test functions have multiple unevenly distributed local optima, nonconvexity, and strong oscillation, which makes convergence to the global optimal solution very difficult, especially in the high-dimensional case. The specific characteristics of the 10 benchmark functions are shown in Table 2.

Table 2: 10 high-dimensional benchmark functions.

Sphere: $f_1(x) = \sum_{i=1}^{d} x_i^2$; range $[-100, 100]$; $f_{min} = 0$; accuracy $1 \times 10^{-8}$.
Schwefel's 2.22: $f_2(x) = \sum_{i=1}^{d} |x_i| + \prod_{i=1}^{d} |x_i|$; range $[-10, 10]$; $f_{min} = 0$; accuracy $1 \times 10^{-8}$.
Schwefel's 2.21: $f_3(x) = \max_i \{|x_i|,\ 1 \le i \le d\}$; range $[-100, 100]$; $f_{min} = 0$; accuracy $1 \times 10^{-2}$.
Rosenbrock: $f_4(x) = \sum_{i=1}^{d-1} [100(x_{i+1} - x_i^2)^2 + (x_i - 1)^2]$; range $[-30, 30]$; $f_{min} = 0$; accuracy $1 \times 10^{0}$.
Schwefel's 1.2: $f_5(x) = \sum_{i=1}^{d} (\sum_{j=1}^{i} x_j)^2$; range $[-100, 100]$; $f_{min} = 0$; accuracy $1 \times 10^{-3}$.
Quartic: $f_6(x) = \sum_{i=1}^{d} i x_i^4 + random[0, 1)$; range $[-1.28, 1.28]$; $f_{min} = 0$; accuracy $1 \times 10^{-4}$.
Rastrigin: $f_7(x) = \sum_{i=1}^{d} [x_i^2 - 10\cos(2\pi x_i) + 10]$; range $[-5.12, 5.12]$; $f_{min} = 0$; accuracy $1 \times 10^{-8}$.
Ackley: $f_8 = -20\exp(-0.2\sqrt{\frac{1}{d}\sum_{i=1}^{d} x_i^2}) - \exp(\frac{1}{d}\sum_{i=1}^{d} \cos(2\pi x_i)) + 20 + e$; range $[-32, 32]$; $f_{min} = 0$; accuracy $1 \times 10^{-5}$.
Griewank: $f_9 = \frac{1}{4000}\sum_{i=1}^{d} x_i^2 - \prod_{i=1}^{d} \cos(\frac{x_i}{\sqrt{i}}) + 1$; range $[-600, 600]$; $f_{min} = 0$; accuracy $1 \times 10^{-5}$.
Penalized 1: $f_{10} = \frac{\pi}{d}\{10\sin^2(\pi y_1) + \sum_{i=1}^{d-1} (y_i - 1)^2 [1 + \sin^2(\pi y_{i+1})] + (y_n - 1)^2\} + \sum_{i=1}^{d} u(x_i, 10, 100, 4)$, where $y_i = 1 + \frac{x_i + 1}{4}$ and $u(x_i, a, k, m) = k(x_i - a)^m$ if $x_i > a$, $0$ if $-a \le x_i \le a$, $k(-x_i - a)^m$ if $x_i < -a$; range $[-50, 50]$; $f_{min} = 0$; accuracy $1 \times 10^{-2}$.

In order to assess the optimization effect of the algorithms more clearly, this paper selects two indicators to evaluate performance [69]. The first is the solution accuracy (AC), which reflects the difference between the optimal result of the algorithm and the theoretical optimal value. In an experiment, if the final convergence result of the algorithm is below the AC threshold, the optimization is considered successful. Assuming that the optimal value of a certain run is $X_{opt}$ and the theoretical optimal value is $X_{min}$, the AC is calculated as follows:

$$AC = |X_{opt} - X_{min}| \quad (20)$$

The second is the success ratio (SR), which reflects the proportion of successful optimizations to the total number of runs, given a required accuracy. Assume that in $T$ runs of the algorithm the number of successful optimizations is $t$; then the SR is calculated as follows:

$$SR = \frac{t}{T} \times 100\% \quad (21)$$
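Both measures are one-liners in code; the sketch below assumes `best_values` holds the best objective value from each of the T independent runs.

```python
import numpy as np

def ac_sr(best_values, x_min, threshold):
    """Solution accuracy AC (eq. (20)) and success ratio SR (eq. (21)).

    A run counts as successful when its AC falls below the accuracy
    threshold listed in Table 2 for that benchmark.
    """
    ac = np.abs(np.asarray(best_values) - x_min)   # eq. (20), one AC per run
    sr = 100.0 * np.mean(ac < threshold)           # eq. (21), in percent
    return ac, sr

# e.g., SR over 30 runs on Sphere (theoretical optimum 0, accuracy 1e-8):
# ac, sr = ac_sr(run_results, x_min=0.0, threshold=1e-8)
```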

5.2. Results on Classical Benchmark Functions. In this section, a set of simulation tests was performed in order to verify the effectiveness of HGGWA. Firstly, the 10 high-dimensional benchmark functions in Table 2 are optimized by the HGGWA algorithm proposed in this paper, and the results are compared with WOA, SSA, and ALO. In addition, the running times of the algorithms are compared under a given convergence accuracy.

5.2.1. Parameter Settings for the Compared Algorithms. In all experiments, the values of the common parameters, such as the maximum number of iterations ($Max_{iter}$), the dimension of the functions (D), and the population size (N), were chosen to be the same. For all test problems, we focus on investigating the optimization performance of the proposed method on problems with D = 100, 500, and 1000. The maximum number of iterations is set to 1000 and the population size is set to 50. The parameter settings of WOA, SSA, and ALO are taken from the original literature [70-72]. The optimal parameter settings of the HGGWA algorithm proposed in this paper are shown in Table 3.

For each experiment of an algorithm on a benchmark function, 30 independent runs are performed to obtain a fair comparison among the different algorithms.


Table 3: Parameter settings of the HGGWA.

N (population size): 50
$P_c$ (crossover probability): 0.8
$P_m$ (mutation probability): 0.01
$Max_{iter}$ (maximum number of iterations): 1000
$k$ (nonlinear adjustment coefficient): 0.5

The optimal value, the worst value, the average value, and the standard deviation of each algorithm's results are recorded, and the optimization success ratio (SR) is then calculated. All programs were coded in Matlab 2017b (Win64) and executed on a Lenovo computer with an Intel(R) Core i5-6300HQ CPU at 2.30 GHz and 8 GB of RAM under the Windows 10 operating system.

5.2.2. Comparison with State-of-the-Art Metaheuristic Algorithms. In order to verify the efficiency of the HGGWA algorithm, several recently proposed state-of-the-art metaheuristic algorithms are compared in terms of their optimization results. These algorithms are the Whale Optimization Algorithm (WOA), the Salp Swarm Algorithm (SSA), and the Ant Lion Optimization Algorithm (ALO). The tests were performed using the same 10 high-dimensional functions in Table 2, and the results of the comparison are shown in Table 4.

From Table 4 it can be seen that the convergence accuracy and success ratio of the HGGWA algorithm are significantly higher than those of the other three algorithms on most of the test functions. In the 100-dimensional case, the HGGWA algorithm converges to the global optimal solution on all test functions except $f_3$, and it achieves a success ratio of 100% on those nine functions under the given convergence accuracy. The convergence results of the WOA algorithm are better than those of the other three algorithms on functions $f_1$ and $f_2$, and its robustness is excellent. As the function dimension increases, the convergence precision of all algorithms decreases slightly, but the optimization results of the HGGWA algorithm remain better than those of WOA, SSA, and ALO. In general, the HGGWA algorithm exhibits better optimization performance than the WOA, SSA, and ALO algorithms on the 100-, 500-, and 1000-dimensional test functions, which demonstrates the effectiveness of the HGGWA algorithm for solving high-dimensional complex functions.

Generally speaking, the HGGWA algorithm performs better than the other algorithms on most test functions. In order to compare the convergence performance of the four algorithms more clearly, the convergence curves of the four algorithms on the Sphere, Schwefel's 2.22, Rastrigin, Griewank, Quartic, and Ackley functions with D = 100 are shown in Figure 2. Sphere and Schwefel's 2.22 represent the two most basic unimodal functions; Rastrigin and Griewank represent the two most basic multimodal functions. It can be seen that HGGWA achieves higher precision and faster convergence than the other algorithms.

5.2.3. Comparative Analysis of Running Time. To evaluate the actual runtime of the compared algorithms, including SSA, WOA, GWO, and HGGWA, their average running times (in seconds, s) over 30 runs on functions $f_1$, $f_2$, $f_6$, $f_7$, $f_8$, $f_9$, and $f_{10}$ are plotted in Figure 3. The convergence accuracy for each test function is listed in Table 2.

Figure 3 shows the convergence behavior of the different algorithms; each point on the plot was obtained by averaging 30 independent runs. As can be seen from Figure 3, the algorithms behave differently under the same accuracy requirement. For the functions $f_1$, $f_2$, $f_7$, $f_8$, and $f_9$, the GWO algorithm has the fastest convergence rate, owing to the advantages of the GWO algorithm itself. However, for the test functions $f_6$ and $f_{10}$, the running time of the HGGWA algorithm is the shortest. Further analysis shows that HGGWA adds a little computing time compared to GWO because of its improved strategies of selection, crossover, and mutation; nevertheless, the convergence speed of the HGGWA algorithm is still better than that of the SSA and WOA algorithms. Overall, the convergence speed of the HGGWA algorithm is among the best of the compared algorithms. Moreover, its running time varies little across the different test functions, staying between 1 s and 3 s, which shows that the HGGWA algorithm has excellent stability.

5.3. Results on CEC'2008 Benchmark Functions. This section analyzes the effectiveness of HGGWA on the CEC'2008 large-scale benchmark functions and compares it with four powerful algorithms that have been proven effective in solving large-scale optimization problems. Experimental results are provided to analyze the performance of HGGWA on large-scale optimization problems.

We tested the proposed algorithm on the CEC'2008 functions for 100, 500, and 1000 dimensions, and the mean of the best fitness values over 50 runs was recorded. While F1 (Shifted Sphere), F4 (Shifted Rastrigin), and F6 (Shifted Ackley) are separable functions, F2 (Schwefel Problem), F3 (Shifted Rosenbrock), F5 (Shifted Griewank), and F7 (Fast Fractal) are nonseparable, presenting a greater challenge to any algorithm that is prone to variable interactions. The performance of HGGWA is compared with that of CCPSO2 [31], DEwSAcc [5], MLCC [38], and EPUS-PSO [49] for 100, 500, and 1000 dimensions, respectively. The maximum number of fitness evaluations (FEs) was calculated by the formula FEs = 5000 × D, where D is the number of dimensions. To ensure a fair comparison, the optimization results of the other four algorithms are taken directly from the original literature. The specific comparison results are shown in Table 5.

Table 5 shows the experimental results; the entries shown in bold are significantly better than those of the other algorithms. A general trend that can be seen is that the HGGWA algorithm has a promising optimization effect on most of the 7 functions.

Generally speaking, HGGWA outperforms CCPSO2 on 6 out of 7 functions with 100 and 500 dimensions, respectively. Among them, the result of HGGWA is slightly worse than that of


[Table 4: Comparison results of 10 high-dimensional benchmark functions by WOA, SSA, ALO, and HGGWA, reporting the mean, standard deviation, and success ratio (SR) of each algorithm on $f_1$-$f_{10}$ at D = 100, 500, and 1000. The numeric entries are not reliably recoverable from the extracted text.]


[Figure 2: Convergence curves (objective function value versus number of iterations) of the WOA, SSA, ALO, and HGGWA algorithms on (a) Sphere, (b) Schwefel 2.22, (c) Rastrigin, (d) Griewank, (e) Quartic, and (f) Ackley functions.]

[Figure 3: The average running times (a) and iterations (b) of SSA, WOA, GWO, and HGGWA on benchmark functions $f_1$, $f_2$, $f_6$, $f_7$, $f_8$, $f_9$, and $f_{10}$ (D = 100).]


[Table 5: Comparison results of 7 CEC'2008 benchmark functions (F1-F7) by five algorithms (CCPSO2, DEwSAcc, MLCC, EPUS-PSO, and HGGWA), reporting mean and standard deviation at D = 100, 500, and 1000. The numeric entries are not reliably recoverable from the extracted text.]


CCPSO2 on function F7 in the 100-dimensional case, and the two algorithms obtain similar optimization results on function F6 in the 500-dimensional case. In addition, at first glance it might seem that HGGWA and MLCC have similar performance; however, HGGWA achieves better convergence accuracy than MLCC under the same number of iterations. In particular, the HGGWA algorithm's optimization results are superior to those of DEwSAcc and EPUS-PSO in all dimensions. The results of HGGWA scaled very well from 100-D to 1000-D on F1, F4, F5, and F6, outperforming DEwSAcc and EPUS-PSO in all cases.

5.4. Sensitivity Analysis of the Nonlinear Adjustment Coefficient k. This section discusses the effect of the nonlinear adjustment coefficient on the convergence performance of the algorithm. In the HGGWA algorithm, the role of parameter $a$ is to balance the global exploration and local exploitation capabilities. In (12), the nonlinear adjustment coefficient $k$ is a key parameter, mainly used to control the range of variation of the convergence factor. Therefore, we selected four different values, $k$ = 0.5, 1, 1.5, and 2, for numerical experiments to analyze the effect of the nonlinear adjustment coefficient on the performance of the algorithm. The results of the comparison are shown in Table 6, where bold entries represent the best results.

In general, the value of the nonlinear adjustment coefficient $k$ has no significant effect on the performance of the HGGWA algorithm. On closer inspection, one can see that, except for function $f_3$, the optimization performance of the algorithm is best when the nonlinear adjustment coefficient $k$ is 0.5; for function $f_3$, $k$ = 1.5 is the best choice. For functions $f_7$ and $f_9$, the HGGWA algorithm obtains the optimal value 0 for all tested values of $k$. Therefore, for most test functions, the optimal value of the nonlinear adjustment coefficient $k$ is 0.5.

5.5. Global Convergence Analysis of HGGWA. In the HGGWA algorithm, the following four improved strategies are used to ensure the global convergence of the algorithm: (1) the initial population strategy based on Opposition-Based Learning; (2) the selection operation based on optimal retention; (3) the crossover operation based on the population partitioning mechanism; and (4) the mutation operation for elite individuals.

The idea of solving high-dimensional function optimization problems with the HGGWA algorithm is as follows. Firstly, regarding population initialization, the basic GWO generates the initial population in a random manner, which can greatly impair the search efficiency on high-dimensional function optimization problems: the algorithm cannot effectively search the entire solution space if the initial grey wolf population is clustered in a small range. Therefore, the Opposition-Based Learning strategy is used to initialize the population, which makes the initial population evenly distributed in the solution space, laying a good foundation for the global search of the algorithm. Secondly, the optimal retention strategy is used to preserve the optimal individual of the parent population

during population evolution, so that the next generation can evolve in the optimal direction. In addition, individuals with higher fitness are selected by the roulette wheel operation, which maintains the dominance relationships between individuals in the population; this dominance relationship gives the algorithm good global convergence. Thirdly, the crossover operation is performed under population partitioning in order to reduce the dimensionality and maintain the diversity of the population. Finally, the search direction of the grey wolf population is guided by the mutation of elite individuals, which effectively prevents the algorithm from falling into locally optimal solutions. All in all, through these four improved strategies, the HGGWA algorithm shows good global convergence in solving high-dimensional function optimization problems.

6. Conclusion

In order to overcome the shortcoming of the GWO algorithm of easily falling into local optima when solving large-scale global optimization problems, this paper proposes a Hybrid Genetic Grey Wolf Algorithm (HGGWA) by integrating three genetic operators into the algorithm. The improved algorithm initializes the population based on the Opposition-Based Learning strategy to improve search efficiency; the strategy of population division based on cooperative coevolution reduces the scale of the problem; and the mutation operation on elite individuals effectively prevents the algorithm from falling into local optima. The performance of HGGWA has been evaluated using 10 classical benchmark functions and 7 CEC'2008 high-dimensional functions. From our experimental results, several conclusions can be drawn.

The results have shown that it is capable of grouping interacting variables with great accuracy for the majority of the benchmark functions. A comparative study between HGGWA and other state-of-the-art algorithms was conducted, and the experimental results showed that the HGGWA algorithm exhibits better global convergence, whether solving separable or nonseparable functions. (1) On the 10 classical benchmark functions, compared with the results of WOA, SSA, and ALO, the HGGWA algorithm greatly improves the convergence accuracy. Except for the Schwefel problems and the Rosenbrock function, the HGGWA algorithm achieves an 80%-100% success ratio on all benchmark functions with 100 dimensions, which proves the effectiveness of HGGWA for solving high-dimensional functions. (2) On the seven CEC'2008 large-scale global optimization problems, the results show that the HGGWA algorithm achieves the global optimal value on the separable functions F1, F4, and F6, while its performance on the other, nonseparable problems is less satisfactory. Nevertheless, HGGWA still exhibits better global convergence than the other algorithms, CCPSO2, DEwSAcc, MLCC, and EPUS-PSO, on most of the functions.


[Table 6: Comparison of the optimization performance of different k values (0.5, 1, 1.5, 2) for HGGWA on $f_1$-$f_{10}$, reporting mean and standard deviation. The numeric entries are not reliably recoverable from the extracted text.]


(3) By analyzing the running times of the several algorithms, the results show that the convergence time of the HGGWA algorithm has obvious advantages compared with recently proposed algorithms such as SSA and WOA. For the functions $f_1$, $f_2$, $f_6$, $f_7$, $f_8$, $f_9$, and $f_{10}$, the running time of the HGGWA algorithm stays between 1 s and 3 s at the specified convergence accuracy, which shows the excellent stability of HGGWA. (4) In the future, we plan to investigate more efficient population partitioning mechanisms to adapt to different nonseparable problems in the crossover operation. We are also interested in applying HGGWA to real-world problems to ascertain its true potential as a valuable optimization technique for large-scale optimization, such as the setup of large-scale multilayer sensor networks [73] in the Internet of Things.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Authors' Contributions

Qinghua Gu and Xuexian Li contributed equally to this work.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Grant Nos. 51774228 and 51404182), the Natural Science Foundation of Shaanxi Province (2017JM5043), and the Foundation of Shaanxi Educational Committee (17JK0425).

References

[1] R Zhang and CWu ldquoA hybrid approach to large-scale job shopschedulingrdquo Applied Intelligence vol 32 no 1 pp 47ndash59 2010

[2] M Gendreau and C D Tarantilis Solving large-scale vehiclerouting problems with time windows The state-of-the-art[M]Montreal Cirrelt 2010

[3] M B Liu S K Tso and Y Cheng ldquoAn extended nonlinearprimal-dual interior-point algorithm for reactive-power opti-mization of large-scale power systems with discrete controlvariablesrdquo IEEE Power Engineering Review vol 22 no 9 p 562002

[4] I Fister I Fister Jr J Brest and V Zumer ldquoMemetic artificialbee colony algorithm for large-scale global optimizationrdquo inProceedings of the IEEE Congress on Evolutionary Computation(CEC rsquo12) pp 1ndash8 IEEE Brisbane Australia June 2012

[5] A Zamuda J Brest B Bosovic and V Zumer ldquoLarge scaleglobal optimization using differential evolution with self-adap-tation and cooperative co-evolutionrdquo in Proceedings of the IEEECongress on Evolutionary Computation (CEC rsquo08) IEEE June2008

[6] T Weise R Chiong and K Tang ldquoEvolutionary optimizationpitfalls and booby trapsrdquo Journal of Computer Science and Tech-nology vol 27 no 5 pp 907ndash936 2012

[7] L LafleurDescartes DiscourseOnMethod Pearson SchweizAgZug Switzerland 1956

[8] Y-S Ong M H Lim and X Chen ldquoMemetic Computa-tionmdashPast Present amp Future [Research Frontier]rdquo IEEE Com-putational Intelligence Magazine vol 5 no 2 pp 24ndash31 2010

[9] N Krasnogor and J Smith ldquoA tutorial for competent memeticalgorithms model taxonomy and design issuesrdquo IEEE Trans-actions on Evolutionary Computation vol 9 no 5 pp 474ndash4882005

[10] S Jiang M Lian C Lu Q Gu S Ruan and X Xie ldquoEnsembleprediction algorithm of anomaly monitoring based on big dataanalysis platform of open-pitmine sloperdquoComplexity vol 2018Article ID 1048756 13 pages 2018

[11] R S Rahnamayan H R Tizhoosh and M M A SalamaldquoOpposition-based differential evolutionrdquo IEEETransactions onEvolutionary Computation vol 12 no 1 pp 64ndash79 2008

[12] S Mirjalili S M Mirjalili and A Lewis ldquoGrey wolf optimizerrdquoAdvances in Engineering Software vol 69 pp 46ndash61 2014

[13] E Daniel J Anitha K K Kamaleshwaran and I Rani ldquoOpti-mum spectrum mask based medical image fusion using graywolf optimizationrdquo Biomedical Signal Processing and Controlvol 34 pp 36ndash43 2017

[14] A A M El-Gaafary Y S Mohamed A M Hemeida and A-A A Mohamed ldquoGrey wolf optimization for multi input multioutput systemrdquo Universal Journal of Communications and Net-work vol 3 no 1 pp 1ndash6 2015

[15] G M Komaki and V Kayvanfar ldquoGrey wolf optimizer algo-rithm for the two-stage assembly flow shop scheduling problemwith release timerdquo Journal of Computational Science vol 8 pp109ndash120 2015

[16] S Jiang M Lian C Lu et al ldquoSVM-DS fusion based softfault detection and diagnosis in solar water heatersrdquo EnergyExploration amp Exploitation 2018

[17] Q Gu X Li C Lu et al ldquoHybrid genetic grey wolf algorithmfor high dimensional complex function optimizationrdquo Controland Decision pp 1ndash8 2019

[18] M A Potter and K A D Jong ldquoA cooperative coevolutionaryapproach to function optimizationrdquo in Parallel Problem Solvingfrom NaturemdashPPSN III vol 866 of Lecture Notes in ComputerScience pp 249ndash257 Springer Berlin Germany 1994

[19] M A PotterTheDesign and Analysis of a Computational Modelof Cooperative Coevolution George Mason University FairfaxVa USA 1997

[20] F van denBergh andA P Engelbrecht ldquoA cooperative approachto participle swam optimizationrdquo IEEE Transactions on Evolu-tionary Computation vol 8 no 3 pp 225ndash239 2004

[21] M El-Abd ldquoA cooperative approach to the artificial bee colonyalgorithmrdquo in Proceedings of the 2010 6th IEEE World Congresson Computational Intelligence WCCI 2010 - 2010 IEEE Congresson Evolutionary Computation CEC 2010 IEEE Spain July 2010

[22] Y Shi H Teng and Z Li ldquoCooperative co-evolutionary dif-ferential evolution for function optimizationrdquo in Proceedings ofthe 1st International Conference on Natural Computation (ICNCrsquo05) LectureNotes in Computer Science pp 1080ndash1088 August2005

[23] Z Yang K Tang and X Yao ldquoLarge scale evolutionary opti-mization using cooperative coevolutionrdquo Information Sciencesvol 178 no 15 pp 2985ndash2999 2008

Complexity 17

[24] X. Li and X. Yao, "Tackling high dimensional nonseparable optimization problems by cooperatively coevolving particle swarms," in Proceedings of the Congress on Evolutionary Computation (CEC '09), pp. 1546–1553, IEEE, Trondheim, Norway, May 2009.
[25] M. N. Omidvar, X. Li, Z. Yang, and X. Yao, "Cooperative co-evolution for large scale optimization through more frequent random grouping," in Proceedings of the 2010 IEEE Congress on Evolutionary Computation (CEC 2010), Spain, July 2010.
[26] T. Ray and X. Yao, "A cooperative coevolutionary algorithm with correlation based adaptive variable partitioning," in Proceedings of the 2009 IEEE Congress on Evolutionary Computation (CEC), pp. 983–989, IEEE, Trondheim, Norway, May 2009.
[27] W. Chen, T. Weise, Z. Yang, and K. Tang, "Large-scale global optimization using cooperative coevolution with variable interaction learning," in Lecture Notes in Computer Science, vol. 6239, pp. 300–309, 2010.
[28] M. N. Omidvar, X. Li, Y. Mei, and X. Yao, "Cooperative co-evolution with differential grouping for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 18, no. 3, pp. 378–393, 2014.
[29] M. N. Omidvar, X. Li, and X. Yao, "Smart use of computational resources based on contribution for cooperative co-evolutionary algorithms," in Proceedings of the 13th Annual Genetic and Evolutionary Computation Conference (GECCO '11), pp. 1115–1122, IEEE, Ireland, July 2011.
[30] R. Storn, "On the usage of differential evolution for function optimization," in Proceedings of the Biennial Conference of the North American Fuzzy Information Processing Society (NAFIPS '96), pp. 519–523, June 1996.
[31] X. Li and X. Yao, "Cooperatively coevolving particle swarms for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 16, no. 2, pp. 210–224, 2012.
[32] K. Weicker and N. Weicker, "On the improvement of coevolutionary optimizers by learning variable interdependencies," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC 1999), pp. 1627–1632, IEEE, July 1999.
[33] E. Sayed, D. Essam, and R. Sarker, "Using hybrid dependency identification with a memetic algorithm for large scale optimization problems," in Lecture Notes in Computer Science, vol. 7673, pp. 168–177, 2012.
[34] Y. Liu, X. Yao, Q. Zhao, and T. Higuchi, "Scaling up fast evolutionary programming with cooperative coevolution," in Proceedings of the 2001 Congress on Evolutionary Computation, pp. 1101–1108, IEEE, Seoul, South Korea, 2001.
[35] E. Sayed, D. Essam, and R. Sarker, "Dependency identification technique for large scale optimization problems," in Proceedings of the 2012 IEEE Congress on Evolutionary Computation (CEC 2012), Australia, June 2012.
[36] Y. Ren and Y. Wu, "An efficient algorithm for high-dimensional function optimization," Soft Computing, vol. 17, no. 6, pp. 995–1004, 2013.
[37] J. Liu and K. Tang, "Scaling up covariance matrix adaptation evolution strategy using cooperative coevolution," in Proceedings of the International Conference on Intelligent Data Engineering and Automated Learning (IDEAL 2013), vol. 8206 of Lecture Notes in Computer Science, pp. 350–357, Springer, Berlin, Germany, 2013.
[38] Z. Yang, K. Tang, and X. Yao, "Multilevel cooperative coevolution for large scale optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), pp. 1663–1670, June 2008.
[39] M. N. Omidvar, Y. Mei, and X. Li, "Effective decomposition of large-scale separable continuous functions for cooperative co-evolutionary algorithms," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC 2014), pp. 1305–1312, China, July 2014.
[40] S. Mahdavi, E. M. Shiri, and S. Rahnamayan, "Cooperative co-evolution with a new decomposition method for large-scale optimization," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC), 2014.
[41] B. Kazimipour, M. N. Omidvar, X. Li, and A. K. Qin, "A sensitivity analysis of contribution-based cooperative co-evolutionary algorithms," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2015), pp. 417–424, Japan, May 2015.
[42] Z. Cao, L. Wang, Y. Shi et al., "An effective cooperative coevolution framework integrating global and local search for large scale optimization problems," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2015), pp. 1986–1993, Japan, May 2015.
[43] X. Peng, K. Liu, and Y. Jin, "A dynamic optimization approach to the design of cooperative co-evolutionary algorithms," Knowledge-Based Systems, vol. 109, pp. 174–186, 2016.
[44] H. K. Singh and T. Ray, "Divide and conquer in coevolution: a difficult balancing act," in Agent-Based Evolutionary Search, vol. 5 of Adaptation, Learning, and Optimization, pp. 117–138, Springer, Berlin, Germany, 2010.
[45] X. Peng and Y. Wu, "Large-scale cooperative co-evolution using niching-based multi-modal optimization and adaptive fast clustering," Swarm & Evolutionary Computation, vol. 35, 2017.
[46] M. Yang, M. N. Omidvar, C. Li et al., "Efficient resource allocation in cooperative co-evolution for large-scale global optimization," IEEE Transactions on Evolutionary Computation, vol. 21, no. 4, pp. 493–505, 2017.
[47] M. N. Omidvar, X. Li, and X. Yao, "Cooperative co-evolution with delta grouping for large scale non-separable function optimization," in Proceedings of the 2010 IEEE Congress on Evolutionary Computation (CEC 2010), pp. 1–8, Barcelona, Spain, July 2010.
[48] X. Peng and Y. Wu, "Enhancing cooperative coevolution with selective multiple populations for large-scale global optimization," Complexity, vol. 2018, Article ID 9267054, 15 pages, 2018.
[49] S.-T. Hsieh, T.-Y. Sun, C.-C. Liu, and S.-J. Tsai, "Solving large scale global optimization using improved particle swarm optimizer," in Proceedings of the 2008 IEEE Congress on Evolutionary Computation (CEC 2008), pp. 1777–1784, China, June 2008.
[50] T. Hatanaka, T. Korenaga, N. Kondo et al., Search Performance Improvement for PSO in High Dimensional Space, InTech, 2009.
[51] J. Fan, J. Wang, and M. Han, "Cooperative coevolution for large-scale optimization based on kernel fuzzy clustering and variable trust region methods," IEEE Transactions on Fuzzy Systems, vol. 22, no. 4, pp. 829–839, 2014.
[52] A.-R. Hedar and A. F. Ali, "Genetic algorithm with population partitioning and space reduction for high dimensional problems," in Proceedings of the 2009 International Conference on Computer Engineering and Systems (ICCES '09), pp. 151–156, Egypt, December 2009.


[53] J. Q. Zhang and A. C. Sanderson, "JADE: adaptive differential evolution with optional external archive," IEEE Transactions on Evolutionary Computation, vol. 13, no. 5, pp. 945–958, 2009.
[54] L. M. Hvattum and F. Glover, "Finding local optima of high-dimensional functions using direct search methods," European Journal of Operational Research, vol. 195, no. 1, pp. 31–45, 2009.
[55] C. Liu and B. Li, "Memetic algorithm with adaptive local search depth for large scale global optimization," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC 2014), pp. 82–88, China, July 2014.
[56] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257–1263, 2015.
[57] A. Zhu, C. Xu, Z. Li, J. Wu, and Z. Liu, "Hybridizing grey wolf optimization with differential evolution for global optimization and test scheduling for 3D stacked SoC," Journal of Systems Engineering and Electronics, vol. 26, no. 2, pp. 317–328, 2015.
[58] V. Chahar and D. Kumar, "An astrophysics-inspired grey wolf algorithm for numerical optimization and its application to engineering design problems," Advances in Engineering Software, vol. 112, pp. 231–254, 2017.
[59] D. Guha, P. K. Roy, and S. Banerjee, "Load frequency control of large scale power system using quasi-oppositional grey wolf optimization algorithm," Engineering Science and Technology, an International Journal, vol. 19, no. 4, pp. 1693–1713, 2016.
[60] C. Lu, S. Xiao, X. Li, and L. Gao, "An effective multi-objective discrete grey wolf optimizer for a real-world scheduling problem in welding production," Advances in Engineering Software, vol. 99, pp. 161–176, 2016.
[61] E. Emary, H. M. Zawbaa, and A. E. Hassanien, "Binary grey wolf optimization approaches for feature selection," Neurocomputing, vol. 172, pp. 371–381, 2016.
[62] W. Long, J. Jiao, X. Liang, and M. Tang, "Inspired grey wolf optimizer for solving large-scale function optimization problems," Applied Mathematical Modelling, vol. 60, pp. 112–126, 2018.
[63] S. Gupta and K. Deep, "Performance of grey wolf optimizer on large scale problems," in Proceedings of Mathematical Sciences and Its Applications, American Institute of Physics Conference Series, Uttar Pradesh, India, 2017.
[64] R. L. Haupt and S. E. Haupt, "Practical genetic algorithms," Journal of the American Statistical Association, vol. 100, no. 100, p. 253, 2005.
[65] H. R. Tizhoosh, "Opposition-based learning: a new scheme for machine intelligence," in Proceedings of the International Conference on Computational Intelligence for Modelling, Control and Automation (CIMCA) and International Conference on Intelligent Agents, Web Technologies and Internet Commerce (IAWTIC '05), pp. 695–701, Vienna, Austria, November 2005.
[66] H. Wang, Z. Wu, S. Rahnamayan, Y. Liu, and M. Ventresca, "Enhancing particle swarm optimization using generalized opposition-based learning," Information Sciences, vol. 181, no. 20, pp. 4699–4714, 2011.
[67] H. Wang, S. Rahnamayan, and Z. Wu, "Parallel differential evolution with self-adapting control parameters and generalized opposition-based learning for solving high-dimensional optimization problems," Journal of Parallel and Distributed Computing, vol. 73, no. 1, pp. 62–73, 2013.
[68] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.
[69] A. P. Engelbrecht, Fundamentals of Computational Swarm Intelligence, John Wiley & Sons, 2006.
[70] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.
[71] S. Mirjalili, A. H. Gandomi, S. Z. Mirjalili, S. Saremi, H. Faris, and S. M. Mirjalili, "Salp Swarm Algorithm: a bio-inspired optimizer for engineering design problems," Advances in Engineering Software, vol. 114, pp. 163–191, 2017.
[72] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80–98, 2015.
[73] Q. Gu, S. Jiang, M. Lian, and C. Lu, "Health and safety situation awareness model and emergency management based on multi-sensor signal fusion," IEEE Access, vol. 7, pp. 958–968, 2019.


a new cooperative coevolution framework that is capable of optimizing large-scale nonseparable problems. It uses a random grouping strategy to divide the problem solution space into multiple subspaces of different sizes. Li et al. [24] embedded a random grouping strategy and an adaptive weighting mechanism into the particle swarm optimization algorithm and proposed a cooperative coevolutionary particle swarm optimization (CCPSO) algorithm. The results of the random grouping method show effective performance in solving scalable nonseparable benchmark functions (up to 1000D); however, as the number of interacting variables increases, its performance degrades [25]. Therefore, some scholars consider adding a priori knowledge of the problem to the evolutionary process and have proposed many learning-based dynamic grouping strategies. Ray et al. [26] discussed the effects of problem size, the number of subpopulations, and the number of iterations on some separable and nonseparable problems in cooperative coevolutionary algorithms, and then proposed a Cooperative Coevolutionary Algorithm with Correlation-based Adaptive Variable Partitioning (CCEA-AVP) to deal with scalable nonseparable problems. Chen et al. [27] proposed a CC method with Variable Interaction Learning (CCVIL), which is able to adaptively change group sizes. For the case where no prior knowledge is available to decompose the problem, Omidvar et al. [28] proposed an automatic decomposition strategy which keeps the correlation between decision variables at a minimum.

2.2. Nondecomposition Strategy. The other class is nondecomposition methods. Different from the decomposition-based strategy, nondecomposition methods mainly study the algorithm itself and improve the corresponding operators in order to improve the performance of the algorithm on large-scale global optimization problems. Nondecomposition-based methods mainly include swarm intelligence, evolutionary computation, and local search-based approaches. Hsieh et al. [49] added variable particles and an information-sharing mechanism to the basic particle swarm optimization algorithm and proposed a variation called the Efficient Population Utilization Strategy for Particle Swarm Optimizer (EPUS-PSO). A modified PSO algorithm with a new velocity updating method, the rotated particle swarm, was proposed in [50]; it transforms coordinates and uses information from other dimensions to maintain the diversity of each dimension. Fan et al. [51] proposed a novel particle swarm optimization approach with dynamic neighborhood based on kernel fuzzy clustering and variable trust region methods (called FT-DNPSO) for large-scale optimization. In terms of evolutionary computation, Hedar et al. [52] modified the genetic algorithm with new strategies of population partitioning and space reduction for high-dimensional problems. In order to improve the performance of the differential evolution algorithm, Zhang et al. [53] proposed an adaptive differential evolution algorithm (called JADE) using a new mutation and adaptive control parameter strategy. Local search-based approaches have been recognized as an effective algorithmic framework for solving optimization problems. Hvattum et al. [54] introduced a new

direct search method based on Scatter Search, designed to remedy the lack of a good derivative-free method for solving high-dimensional problems. Liu et al. [55] improved the local search depth parameter in the memetic algorithm and proposed an adaptive local search depth (ALSD) strategy, which can dynamically allocate the local search computing resources according to performance.

2.3. Related Work of GWO. Grey Wolf Optimizer (GWO) is a recently proposed swarm intelligence algorithm inspired by the social hierarchy and hunting behavior of wolves. It has the advantages of few control parameters, high solution accuracy, and fast convergence speed. Compared with other classical metaheuristic algorithms, such as the genetic algorithm (GA), particle swarm optimization (PSO), and the gravitational search algorithm (GSA), GWO shows powerful exploration capability and better convergence characteristics. Owing to its simplicity and ease of implementation, GWO has gained significant attention and has been applied to many practical optimization problems since its invention.

Recently, many scholars have applied GWO to different optimization problems and have also proposed many variants to improve its performance. Saremi et al. [56] introduced a dynamic population updating mechanism into GWO, removing individuals with poor fitness from the population and generating new individuals to replace them, in order to enhance the exploration ability of GWO; however, this strategy reduces the diversity of the grey wolf population, which makes the algorithm prone to local optima. Zhu et al. [57] integrated the idea of differential evolution into GWO, which prevented the early stagnation of the GWO algorithm and accelerated its convergence; however, it fails to converge on high-dimensional problems. Kumar et al. [58] designed a novel variant of GWO for numerical optimization and engineering design problems, utilizing a prey-weight and astrophysics-based learning concept to enhance the exploration ability; however, this method also does not perform well on high-dimensional function problems. Regarding applications of the GWO algorithm, Guha et al. [59] applied the grey wolf algorithm to the load frequency control problem in large-scale power systems. In [60], a multiobjective discrete grey wolf algorithm was proposed for a welding job scheduling problem; the results show that the algorithm can solve scheduling problems in practical engineering. Emary et al. [61] proposed a new binary grey wolf optimization algorithm for selecting the best feature subset.

Swarm intelligence optimization algorithms have natural parallelism, so they have advantages in solving large-scale optimization problems. In order to study the ability of the grey wolf optimization algorithm to solve high-dimensional functions, Long et al. [62] proposed a nonlinear adjustment strategy for the convergence factor, which balances the exploration and exploitation of the algorithm well; but this strategy still does not solve the high-dimensional function optimization problem well. In 2017, Gupta et al. [63] studied the performance of the GWO algorithm in solving 100- to 1000-dimensional function optimization problems. The results indicate that


GWO is a powerful nature-inspired optimization algorithm for large-scale problems, except for the Rosenbrock function.

In summary, the decomposition-based CC strategy has a good effect in solving nonseparable problems, but it cannot handle the imbalance problem well, and the accuracy of its solutions still needs to be improved. In addition, among the methods based on the nondecomposition strategy, although the performance of many algorithms has been greatly improved, research on hybrid algorithms is still rare. Therefore, this paper combines the three genetic operators of selection, crossover, and mutation with the GWO algorithm and proposes a Hybrid Genetic Grey Wolf Algorithm (HGGWA). The main improvement strategies of the algorithm are as follows: (1) the population is initialized based on the Opposition-Based Learning strategy; (2) the nonlinear adjustment of parameters effectively balances the exploration and exploitation capabilities while accelerating the convergence of the algorithm; (3) the population is selected through a combination of the elite-retention strategy and roulette wheel selection; (4) based on dimensionality reduction and population division, the whole population is divided into multiple subpopulations for the crossover operation, increasing the diversity of the population; (5) mutation operations are performed on the elite individuals to prevent the algorithm from falling into local optima.

3. Grey Wolf Optimization Algorithm

3.1. Mathematical Models. In the GWO algorithm [12], the positions of the α wolf, β wolf, and δ wolf are the fittest solution, the second-best solution, and the third-best solution, respectively, and the positions of the ω wolves are the remaining candidate solutions. In the whole process of searching for prey in the space, the ω wolves gradually update their positions according to the optimal positions of the α, β, and δ wolves, so they can gradually approach and capture the prey, that is, complete the process of searching for the optimal solution. The position updating in GWO is illustrated in Figure 1.

The grey wolves' position update formulas in this process are as follows:

$$\vec{D} = \left| \vec{C} \cdot \vec{X}_p(t) - \vec{X}(t) \right| \tag{1}$$

$$\vec{X}(t+1) = \vec{X}_p(t) - \vec{A} \cdot \vec{D} \tag{2}$$

where $\vec{D}$ indicates the distance between the grey wolf individual and the prey, $t$ indicates the current iteration, $\vec{A}$ and $\vec{C}$ are coefficient vectors ($\vec{A}$ is a convergence coefficient used to balance exploration and exploitation; $\vec{C}$ is used to simulate the effects of nature), $\vec{X}_p$ is the position vector of the prey, and $\vec{X}$ indicates the position vector of a grey wolf.

The coefficient vectors $\vec{A}$ and $\vec{C}$ are calculated as follows:

$$\vec{A} = 2\vec{a} \cdot \vec{r}_1 - \vec{a} \tag{3}$$

$$\vec{C} = 2 \cdot \vec{r}_2 \tag{4}$$

where the components of $\vec{a}$ are linearly decreased from 2 to 0 over the course of iterations, and $\vec{r}_1$ and $\vec{r}_2$ are random vectors in $[0, 1]$. At the same time, the fluctuation range of the coefficient vector $\vec{A}$ gradually decreases as $\vec{a}$ decreases.

Thus, according to (1) and (2), the grey wolf individual randomly moves within the search space to gradually approach the optimal solution. The specific position update formulas are as follows:

$$\vec{D}_\alpha = \left| \vec{C}_1 \cdot \vec{X}_\alpha - \vec{X} \right|, \quad \vec{D}_\beta = \left| \vec{C}_2 \cdot \vec{X}_\beta - \vec{X} \right|, \quad \vec{D}_\delta = \left| \vec{C}_3 \cdot \vec{X}_\delta - \vec{X} \right| \tag{5-7}$$

$$\vec{X}_1 = \vec{X}_\alpha - \vec{A}_1 \cdot \vec{D}_\alpha, \quad \vec{X}_2 = \vec{X}_\beta - \vec{A}_2 \cdot \vec{D}_\beta, \quad \vec{X}_3 = \vec{X}_\delta - \vec{A}_3 \cdot \vec{D}_\delta \tag{8-10}$$

$$\vec{X}(t+1) = \frac{\vec{X}_1 + \vec{X}_2 + \vec{X}_3}{3} \tag{11}$$

where $\vec{D}_\alpha$, $\vec{D}_\beta$, and $\vec{D}_\delta$ indicate the distances between the α, β, and δ wolves and the prey, respectively; $\vec{X}_1$, $\vec{X}_2$, and $\vec{X}_3$ represent the positional components of the remaining grey wolves based on the positions of the α, β, and δ wolves, respectively; and $\vec{X}(t+1)$ represents the position vector after the grey wolf is updated.

3.2. Basic Process of GWO

Step 1. Randomly generate an initial population and initialize the parameters $\vec{a}$, $\vec{A}$, and $\vec{C}$.

Step 2. Calculate the fitness of each grey wolf individual and save the positions of the first three individuals with the highest fitness as $\vec{X}_\alpha$, $\vec{X}_\beta$, and $\vec{X}_\delta$.

Step 3. Update the position of each grey wolf according to (5)–(11), thereby obtaining the next-generation population, and then update the values of the parameters $\vec{a}$, $\vec{A}$, and $\vec{C}$.

Step 4. Calculate the fitness of the individuals in the new population and update $\vec{X}_\alpha$, $\vec{X}_\beta$, and $\vec{X}_\delta$.

Step 5. Repeat Steps 2–4 until the optimal solution is obtained or the maximum number of iterations is reached.
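To make Steps 2–4 concrete, here is a minimal NumPy sketch of one GWO position update according to (3)–(11). It assumes a minimization objective (for fitness maximization, invert the sort), and the names (`gwo_step`, `fitness`) are illustrative rather than taken from the paper's own Matlab implementation.

```python
import numpy as np

def gwo_step(X, fitness, a):
    """One GWO iteration. X: (N, d) wolf positions; a: current convergence factor."""
    order = np.argsort([fitness(x) for x in X])          # best-first for minimization
    alpha, beta, delta = X[order[0]], X[order[1]], X[order[2]]
    new_X = np.empty_like(X)
    for i, x in enumerate(X):
        parts = []
        for leader in (alpha, beta, delta):
            A = 2 * a * np.random.rand(x.size) - a       # Eq. (3)
            C = 2 * np.random.rand(x.size)               # Eq. (4)
            D = np.abs(C * leader - x)                   # Eqs. (5)-(7)
            parts.append(leader - A * D)                 # Eqs. (8)-(10)
        new_X[i] = sum(parts) / 3.0                      # Eq. (11)
    return new_X
```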

4. Hybrid Genetic Grey Wolf Algorithm

This section describes the details of HGGWA, the hybrid algorithm proposed in this paper. Firstly, the initial population strategy based on Opposition-Based Learning is described in detail in Section 4.1. Secondly, Section 4.2 introduces the nonlinear adjustment strategy for the parameter $a$. Then, the three operations of selection, crossover, and mutation are described in Sections 4.3–4.5, respectively. Finally, the time complexity of HGGWA is analyzed in Section 4.6.

Figure 1: Position updating in GWO.

4.1. Initial Population Strategy Based on Opposition-Based Learning. For a swarm intelligence optimization algorithm based on population iteration, the diversity of the initial population lays the foundation for the efficiency of the algorithm: better population diversity reduces the computation time and improves the global convergence of the algorithm [64]. However, like other algorithms, GWO uses random initialization when generating the population, which can harm the search efficiency of the algorithm. Opposition-Based Learning is a strategy proposed by Tizhoosh [65] to improve search efficiency; it uses the opposite points of known individual positions to generate new individual positions, thereby increasing the diversity of the search population. Currently, the Opposition-Based Learning strategy has been successfully applied to multiple swarm optimization algorithms [66, 67]. The opposite point is defined as follows.

Assume that there is an individual $X = (x_1, x_2, \ldots, x_D)$ in the population P; then the opposite point of the element $x_i$ ($i = 1, 2, \ldots, D$) on each dimension is $x'_i = l_i + u_i - x_i$, where $l_i$ and $u_i$ are the lower and upper bounds of the $i$th dimension, respectively. According to this definition, the Opposition-Based Learning strategy initializes the population as follows:

(a) Randomly initialize the population P; calculate the opposite point $x'_i$ of each individual position $x_i$; all the opposite points constitute the opposition population P'.

(b) Calculate the fitness of each individual in the initial population P and the opposition population P' and arrange them in descending order.

(c) Select the top N grey wolf individuals with the highest fitness as the final initial population.
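Steps (a)–(c) can be realized in a few lines. The sketch below assumes a higher-is-better fitness function and box bounds `lower`/`upper` (scalars or per-dimension arrays); all names are illustrative.

```python
import numpy as np

def obl_initialize(fitness, N, d, lower, upper):
    """Opposition-Based Learning initialization: keep the N fittest of P and P'."""
    P = lower + (upper - lower) * np.random.rand(N, d)   # (a) random population P
    P_opp = lower + upper - P                            # opposite points x' = l + u - x
    pool = np.vstack([P, P_opp])                         # P together with P'
    f = np.array([fitness(x) for x in pool])             # (b) evaluate all 2N individuals
    return pool[np.argsort(-f)[:N]]                      # (c) keep the N fittest
```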

4.2. Parameter Nonlinear Adjustment Strategy. Balancing exploration and exploitation is critically important for swarm intelligence algorithms. In the early iterations, a powerful exploration capability helps the algorithm expand the search range so that it can find the optimal solution with greater probability. In the later iterations, an effective exploitation ability speeds up the optimization process and improves the convergence accuracy of the final solution. Therefore, only when a swarm optimization algorithm properly coordinates its global exploration and local exploitation capabilities can it achieve strong robustness and fast convergence.

It can be seen from (2) and (3) that the dynamic change of the parameter $a$ plays a crucial role in the algorithm. In the basic GWO, the linear decrement of $a$ does not reflect the actual convergence process of the algorithm, so the algorithm does not balance exploration and exploitation well, especially when solving large-scale multimodal problems. In order to dynamically adjust the global exploration and local exploitation of the algorithm, this paper proposes a nonlinear adjustment strategy for the parameter $a$, which helps HGGWA search for the optimal solution. The improved nonlinear adjustment equation is as follows:

$$a = a_{max} + (a_{min} - a_{max}) \cdot \left( \frac{1}{1 + e^{t/Max_{iter}}} \right)^{k} \tag{12}$$

where $a_{max}$ and $a_{min}$ are the initial and final values of the parameter $a$, $t$ is the current iteration index, $Max_{iter}$ is the maximum number of iterations, and $k$ is the nonlinear adjustment coefficient.

Thus, the variation range of the convergence factor A is controlled by the nonlinear adjustment of the parameter $a$. When $|A| > 1$, the grey wolf population expands the search range to find better prey, which corresponds to the global exploration of the algorithm; when $|A| < 1$, the whole population shrinks the search range, and an encirclement is formed around the prey to complete the final attack, which corresponds to the local exploitation process of the algorithm. The whole process helps balance global search and local search, which improves the accuracy of the solution and further accelerates the convergence of the algorithm.
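As a quick illustration, the schedule (12) is a one-liner in code; $a_{max} = 2$ and $a_{min} = 0$ are assumed here to match the standard GWO range, and $k = 0.5$ follows the setting used later in Table 3.

```python
import math

def nonlinear_a(t, max_iter, a_max=2.0, a_min=0.0, k=0.5):
    """Nonlinear convergence-factor schedule of Eq. (12)."""
    return a_max + (a_min - a_max) * (1.0 / (1.0 + math.exp(t / max_iter))) ** k
```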

4.3. Selection Operation Based on Optimal Retention. For an intelligent optimization algorithm based on population evolution, the evolution of each generation directly affects the optimization effect of the algorithm. In order to pass the superior individuals of the parent population to the next generation without being destroyed, it is necessary to preserve good individuals directly. The optimal-retention selection strategy is an effective way to preserve good individuals in genetic algorithms, and this paper integrates it into the GWO to improve the efficiency of the algorithm. Assuming that the current population is $P(X_1, X_2, \ldots, X_D)$ and the fitness of individual $X_i$ is $F_i$, the specific operation is as follows.

(a) Calculate the fitness $F_i$ ($i = 1, 2, \ldots, D$) of each individual and arrange them in descending order.

(b) Copy the individual with the highest fitness directly into the next-generation population.

(c) Calculate the total fitness S of the remaining individuals and the probability $p_i$ that each individual is selected:

$$S = \sum_{i=1}^{D-1} F_i, \quad i = 1, 2, \ldots, D-1 \tag{13}$$

$$p_i = \frac{F_i}{\sum_{i=1}^{D-1} F_i}, \quad i = 1, 2, \ldots, D-1 \tag{14}$$

(d) Calculate the cumulative fitness value $s_i$ for each individual, and then perform the selection by roulette wheel until the number of individuals in the offspring population matches that of the parent population:

$$s_i = \frac{\sum_{j=1}^{i} F_j}{S}, \quad i = 1, 2, \ldots, D-1 \tag{15}$$
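A compact sketch of steps (b)–(d) follows, assuming strictly positive fitness values so they can serve directly as roulette weights (NumPy's `choice` absorbs the cumulative-sum bookkeeping of (15)); all names are illustrative.

```python
import numpy as np

def select(P, F):
    """Optimal-retention selection: keep the fittest, roulette-sample the rest."""
    best = np.argmax(F)                                   # step (b): elite kept as-is
    rest = np.delete(np.arange(len(P)), best)
    p = F[rest] / F[rest].sum()                           # Eq. (14); assumes F > 0
    picks = np.random.choice(rest, size=len(P) - 1, p=p)  # step (d): roulette wheel
    return np.vstack([P[best], P[picks]])
```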

4.4. Crossover Operation Based on a Population Partitioning Mechanism. Due to the loss of population diversity in late evolution, the GWO algorithm easily falls into local optima when solving large-scale, high-dimensional optimization problems. To overcome the problems caused by large scale and complexity, and to ensure that the algorithm searches the whole solution space, a population partitioning mechanism is introduced.

In the HGGWA algorithm, the whole population P is divided into $\nu \times \eta$ subpopulations $P_{ij}$ ($i = 1, \ldots, \nu$; $j = 1, \ldots, \eta$). The optimal size of each subpopulation $P_{ij}$ was experimentally determined to be $5 \times 5$. The individuals of each subpopulation are cross-operated to increase the diversity of the population. The specific division method is shown below.

The initial population P:

$$P = \begin{bmatrix} x_{11} & x_{12} & \cdots & x_{1d} \\ x_{21} & x_{22} & \cdots & x_{2d} \\ \vdots & & \ddots & \vdots \\ x_{P1} & x_{P2} & \cdots & x_{Pd} \end{bmatrix}$$

Subpopulations $P_{ij}$:

$$P_{11} = \begin{bmatrix} x_{11} & \cdots & x_{15} \\ \vdots & \ddots & \vdots \\ x_{51} & \cdots & x_{55} \end{bmatrix}, \; \cdots, \; P_{1\eta} = \begin{bmatrix} x_{1(d-4)} & \cdots & x_{1d} \\ \vdots & \ddots & \vdots \\ x_{5(d-4)} & \cdots & x_{5d} \end{bmatrix}$$

$$P_{\nu 1} = \begin{bmatrix} x_{(P-4)1} & \cdots & x_{(P-4)5} \\ \vdots & \ddots & \vdots \\ x_{P1} & \cdots & x_{P5} \end{bmatrix}, \; \cdots, \; P_{\nu\eta} = \begin{bmatrix} x_{(P-4)(d-4)} & \cdots & x_{(P-4)d} \\ \vdots & \ddots & \vdots \\ x_{P(d-4)} & \cdots & x_{Pd} \end{bmatrix} \tag{16}$$
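In code, the partition (16) is just block slicing of the population matrix; the sketch below assumes N and d are multiples of the 5 × 5 block size, with illustrative names.

```python
import numpy as np

def partition(P, rows=5, cols=5):
    """Split an (N, d) population into ν·η subpopulations of size rows × cols."""
    N, d = P.shape                        # assumes N % rows == 0 and d % cols == 0
    return [P[i:i + rows, j:j + cols]
            for i in range(0, N, rows)    # ν = N / rows row bands
            for j in range(0, d, cols)]   # η = d / cols column bands
```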

In genetic algorithms, the crossover operation plays a very important role and is the main way to generate new individuals. In the improved algorithm of this paper, each individual in a subpopulation is crossed in a linear crossover manner. A random number $r_i \in (0, 1)$ is generated for each individual $x_i$ in the subpopulation; when $r_i$ is less than the crossover probability $P_c$, the corresponding individual $x_i$ is paired for crossover.

An example is as follows: Crossover(p1, p2):

(a) Generate a random number $\lambda \in (0, 1)$.

(b) Two children $c_1(c_{11}, \ldots, c_{1D})$ and $c_2(c_{21}, \ldots, c_{2D})$ are generated from two parents $p_1(p_{11}, \ldots, p_{1D})$ and $p_2(p_{21}, \ldots, p_{2D})$:

$$c_{1i} = \lambda p_{1i} + (1 - \lambda) p_{2i}, \quad i = 1, 2, \ldots, D \tag{17}$$

$$c_{2i} = \lambda p_{2i} + (1 - \lambda) p_{1i}, \quad i = 1, 2, \ldots, D \tag{18}$$
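The linear crossover of (17)-(18) translates directly into code; a minimal sketch with illustrative names:

```python
import numpy as np

def linear_crossover(p1, p2):
    """Linear (arithmetic) crossover of Eqs. (17) and (18)."""
    lam = np.random.rand()            # λ ∈ (0, 1)
    c1 = lam * p1 + (1 - lam) * p2    # Eq. (17)
    c2 = lam * p2 + (1 - lam) * p1    # Eq. (18)
    return c1, c2
```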

4.5. Mutation Operation for Elite Individuals. Due to the selection operation based on the optimal-preservation strategy, the grey wolf individuals of the whole population concentrate in a small optimal region in the later stage of the iterative process, which easily leads to the loss of population diversity. If the current optimal individual is a locally optimal solution, the algorithm is then prone to falling into a local optimum, especially when solving high-dimensional multimodal functions. To this end, this paper introduces a mutation operator into HGGWA to perform mutation operations on the elite individuals in the population (the complete procedure of HGGWA is summarized in Algorithm 1 below).


(1) Initialize parameters a, A, C, population size N, $P_c$, and $P_m$
(2) Initialize population P using the OBL strategy
(3) t = 0
(4) while t < $Max_{iter}$
(5)   Calculate the fitness of all search agents
(6)   $X_\alpha$ = the best search agent
(7)   $X_\beta$ = the second best search agent
(8)   $X_\delta$ = the third best search agent
(9)   for i = 1 : N
(10)    Update the position of each search agent by equation (11)
(11)  end for
(12)  newP1 ← P except $X_\alpha$
(13)  for i = 1 : N−1
(14)    Generate newP2 by roulette wheel selection on newP1
(15)  end for
(16)  P ← newP2 and $X_\alpha$
(17)  Generate the subpopulations subPs by the population partitioning mechanism on P
(18)  Select individuals by crossover probability $P_c$
(19)  for i = 1 : N·$P_c$
(20)    Crossover each selected search agent by equations (17) and (18)
(21)  end for
(22)  Generate $X'_\alpha$, $X'_\beta$, and $X'_\delta$ by equation (19) with mutation probability $P_m$
(23)  t = t + 1
(24) end while
(25) Return $X_\alpha$

Algorithm 1: Pseudocode of the HGGWA.

Assume that the optimal individual is $x_i = (x_1, x_2, \ldots, x_d)$ and that the mutation operation is performed on $x_i$ with mutation probability $P_m$; that is, a gene $x_k$ is selected from the optimal individual with probability $P_m$ and replaced with a random number between the lower and upper bounds, generating a new individual $x'_i = (x'_1, x'_2, \ldots, x'_d)$:

$$x'_i = \begin{cases} l + \lambda \cdot (u - l), & i = k \\ x_i, & i \neq k \end{cases} \tag{19}$$

where $\lambda$ is a random number in [0, 1], and $l$ and $u$ are the lower and upper bounds of the individual $x_i$, respectively.
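A sketch of this elite mutation, reading the gene selection as an independent per-gene reset with probability $P_m$ (one straightforward interpretation of (19)); names are illustrative.

```python
import numpy as np

def elite_mutation(x, lower, upper, Pm=0.01):
    """Reset each gene of an elite individual with probability Pm, per Eq. (19)."""
    x_new = x.copy()
    for k in range(x.size):
        if np.random.rand() < Pm:
            lam = np.random.rand()
            x_new[k] = lower + lam * (upper - lower)   # x'_k = l + λ(u - l)
    return x_new
```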

4.6. Pseudocode of the HGGWA and Time Complexity Analysis. This section describes how to calculate an upper bound for the total number of fitness evaluations (FE) required by HGGWA. As shown in Algorithm 1, the computational complexity of HGGWA in one generation is mainly dominated by the position updating of all search agents in line (10). The position updating requires a time complexity of $O(N \cdot dim \cdot Max_{iter})$ (N is the population size, dim is the dimension of each benchmark function, and $Max_{iter}$ is the maximum number of iterations of HGGWA) to obtain the needed parameters and to finish the whole position-updating process of all search agents. In addition, the crossover operation in line (20) is another time-consuming step. It needs a time complexity of $O(2\nu\eta \cdot Max_{iter})$ ($\nu\eta$ is the total number of subpopulations, $\nu = N/5$, $\eta = dim/5$) to complete the crossing of individuals in each subpopulation according to the crossover probability $P_c$. It should be noted that this is only the worst-case computational complexity; if there are only two individuals in each subpopulation that need to be crossed, the minimum amount of computation is $O(\nu\eta \cdot Max_{iter})$.

Therefore, the maximum computational complexity caused by the two dominant processes of the algorithm is $\max\{O(N \cdot dim \cdot Max_{iter}),\, O(2\nu\eta \cdot Max_{iter})\}$ ($\nu = N/5$, $\eta = dim/5$) in one generation, i.e., $O(N \cdot dim \cdot Max_{iter})$.

5. Numerical Experiments and Analysis

In this section, the proposed HGGWA algorithm is evaluated on both classical benchmark functions [68] and the test suite provided by the CEC'2008 Special Session on large-scale global optimization. The algorithms used for comparison include not only conventional EAs but also other CC optimization algorithms. Experimental results are provided to analyze the performance of HGGWA in the context of large-scale optimization problems. In addition, the sensitivity analysis of the nonlinear adjustment coefficient and the global convergence of HGGWA are discussed in Sections 5.4 and 5.5, respectively.

5.1. Benchmark Functions and Performance Measures. In order to verify the ability of the HGGWA algorithm to solve high-dimensional complex functions, 10 general high-dimensional benchmark functions (100D, 500D, 1000D) were selected for the optimization test. Among them, $f_1 \sim f_6$ are unimodal functions, used to test the local search ability of the algorithm, and $f_7 \sim f_{10}$ are multimodal functions, used to test the global search ability of the algorithm. These test functions have multiple unevenly distributed local optima, nonconvexity, and strong oscillation, which make them very difficult to converge to the global optimal solution, especially in the high-dimensional case. The specific characteristics of the 10 benchmark functions are shown in Table 2.


Table 2: The 10 high-dimensional benchmark functions.

Sphere: $f_1(x) = \sum_{i=1}^{d} x_i^2$; range [−100, 100]; $f_{min} = 0$; accuracy $1 \times 10^{-8}$.
Schwefel's 2.22: $f_2(x) = \sum_{i=1}^{d} |x_i| + \prod_{i=1}^{d} |x_i|$; range [−10, 10]; $f_{min} = 0$; accuracy $1 \times 10^{-8}$.
Schwefel's 2.21: $f_3(x) = \max_i \{|x_i|,\; 1 \le i \le d\}$; range [−100, 100]; $f_{min} = 0$; accuracy $1 \times 10^{-2}$.
Rosenbrock: $f_4(x) = \sum_{i=1}^{d-1} [100(x_{i+1} - x_i^2)^2 + (x_i - 1)^2]$; range [−30, 30]; $f_{min} = 0$; accuracy $1 \times 10^{0}$.
Schwefel's 1.2: $f_5(x) = \sum_{i=1}^{d} (\sum_{j=1}^{i} x_j)^2$; range [−100, 100]; $f_{min} = 0$; accuracy $1 \times 10^{-3}$.
Quartic: $f_6(x) = \sum_{i=1}^{d} i x_i^4 + random[0, 1)$; range [−1.28, 1.28]; $f_{min} = 0$; accuracy $1 \times 10^{-4}$.
Rastrigin: $f_7(x) = \sum_{i=1}^{d} [x_i^2 - 10\cos(2\pi x_i) + 10]$; range [−5.12, 5.12]; $f_{min} = 0$; accuracy $1 \times 10^{-8}$.
Ackley: $f_8(x) = -20\exp(-0.2\sqrt{\frac{1}{d}\sum_{i=1}^{d} x_i^2}) - \exp(\frac{1}{d}\sum_{i=1}^{d} \cos(2\pi x_i)) + 20 + e$; range [−32, 32]; $f_{min} = 0$; accuracy $1 \times 10^{-5}$.
Griewank: $f_9(x) = \frac{1}{4000}\sum_{i=1}^{d} x_i^2 - \prod_{i=1}^{d} \cos(\frac{x_i}{\sqrt{i}}) + 1$; range [−600, 600]; $f_{min} = 0$; accuracy $1 \times 10^{-5}$.
Penalized 1: $f_{10}(x) = \frac{\pi}{d}\{10\sin^2(\pi y_1) + \sum_{i=1}^{d-1}(y_i - 1)^2[1 + \sin^2(\pi y_{i+1})] + (y_d - 1)^2\} + \sum_{i=1}^{d} u(x_i, 10, 100, 4)$, where $y_i = 1 + \frac{x_i + 1}{4}$ and $u(x_i, a, k, m) = k(x_i - a)^m$ if $x_i > a$; $0$ if $-a \le x_i \le a$; $k(-x_i - a)^m$ if $x_i < -a$; range [−50, 50]; $f_{min} = 0$; accuracy $1 \times 10^{-2}$.

In order to show the optimization effect of the algorithms more clearly, this paper selects two indicators to evaluate their performance [69]. The first is the solution accuracy (AC), which reflects the difference between the optimal result found by the algorithm and the theoretical optimal value. In an experiment, if the final convergence result of the algorithm is less than the AC threshold, the optimization is considered successful. Assuming that the optimal value of a certain run is $X_{opt}$ and the theoretical optimal value is $X_{min}$, the AC is calculated as follows:

$$AC = \left| X_{opt} - X_{min} \right| \tag{20}$$

The second is the successful ratio (SR), which reflects the proportion of successful runs among all runs, given that the algorithm must reach a certain accuracy. Assuming that, out of $T$ runs of the algorithm, the number of successful runs is $t$, the SR is calculated as follows:

$$SR = \frac{t}{T} \times 100\% \tag{21}$$
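Both measures are trivial to compute once the runs are done; a minimal sketch (names illustrative):

```python
def accuracy(x_opt, x_min):
    """AC of Eq. (20): gap between the found and the theoretical optimum."""
    return abs(x_opt - x_min)

def success_ratio(t, T):
    """SR of Eq. (21): percentage of runs reaching the target accuracy."""
    return 100.0 * t / T
```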

5.2. Results on Classical Benchmark Functions. In this section, a set of simulation tests was performed in order to verify the effectiveness of HGGWA. Firstly, the 10 high-dimensional benchmark functions in Table 2 are optimized by the HGGWA algorithm proposed in this paper, and the results are compared with WOA, SSA, and ALO. In addition, the running times of the algorithms are compared under a given convergence accuracy.

5.2.1. Parameter Settings for the Compared Algorithms. In all experiments, the values of the common parameters, such as the maximum number of iterations ($Max_{iter}$), the dimension of the functions (D), and the population size (N), were chosen the same. For all test problems, we focus on investigating the optimization performance of the proposed method on problems with D = 100, 500, and 1000. The maximum number of iterations is set to 1000, and the population size is set to 50. The parameter settings of WOA, SSA, and ALO are taken from the original literature [70–72]. The optimal parameter settings of the HGGWA algorithm proposed in this paper are shown in Table 3.

For each experiment of an algorithm on a benchmark function, 30 independent runs are performed to obtain a fair comparison among the different algorithms.


Table 3: Parameter settings of the HGGWA.

Parameter | Definition | Value
N | Population size | 50
$P_c$ | Crossover probability | 0.8
$P_m$ | Mutation probability | 0.01
$Max_{iter}$ | Maximum number of iterations | 1000
$k$ | Nonlinear adjustment coefficient | 0.5

The optimal value, the worst value, the average value, and the standard deviation of each algorithm's results are recorded, and the optimization successful ratio (SR) is then calculated. All programs were coded in Matlab 2017b (Win64) and executed on a Lenovo computer with an Intel(R) Core i5-6300HQ CPU at 2.30 GHz and 8 GB of RAM under the Windows 10 operating system.

5.2.2. Comparison with State-of-the-Art Metaheuristic Algorithms. In order to verify the efficiency of the HGGWA algorithm, several recently proposed state-of-the-art metaheuristic algorithms are compared in terms of their optimization results: the Whale Optimization Algorithm (WOA), the Salp Swarm Algorithm (SSA), and the Ant Lion Optimization Algorithm (ALO). The test was performed using the same 10 high-dimensional functions in Table 2, and the results of the comparison are shown in Table 4.

From Table 4 it can be seen that the convergence accuracy and optimization successful ratio of the HGGWA algorithm are significantly higher than those of the other three algorithms for most of the test functions. In the 100-dimensional case, the HGGWA algorithm converges to the global optimal solution for nine of the test functions, the exception being $f_3$, and it achieves a successful ratio of 100% for those nine functions under the given convergence accuracy. The convergence results of the WOA algorithm are better than those of the other three algorithms for functions $f_1$ and $f_2$, and its robustness is excellent. As the function dimension increases, the convergence precision of all the algorithms decreases slightly, but the optimization results of the HGGWA algorithm remain better than those of WOA, SSA, and ALO. In general, the HGGWA algorithm exhibits better optimization performance than the WOA, SSA, and ALO algorithms on the 100-, 500-, and 1000-dimensional test functions, which proves its effectiveness for solving high-dimensional complex functions.

Generally speaking, the HGGWA algorithm performs better than the other algorithms on most test functions. In order to compare the convergence performance of the four algorithms more clearly, their convergence curves on the Sphere, Schwefel 2.22, Rastrigin, Griewank, Quartic, and Ackley functions with D = 100 are shown in Figure 2. Sphere and Schwefel 2.22 represent the two most basic unimodal functions; Rastrigin and Griewank represent the two most basic multimodal functions. It can be seen that HGGWA has higher precision and a faster convergence speed than the other algorithms.

5.2.3. Comparative Analysis of Running Time. To evaluate the actual runtime of the compared algorithms, including SSA, WOA, GWO, and HGGWA, their average running times (in seconds) over 30 runs on functions $f_1$, $f_2$, $f_6$, $f_7$, $f_8$, $f_9$, and $f_{10}$ are plotted in Figure 3. The convergence accuracy of each test function is given in Table 2.

Figure 3 shows the runtime behavior of the different algorithms; each point on the plot is the average of 30 independent runs. As can be seen from Figure 3, the algorithms behave differently under the same accuracy requirement. For functions $f_1$, $f_2$, $f_7$, $f_8$, and $f_9$, the GWO algorithm has the fastest convergence rate, owing to the advantages of the GWO algorithm itself. However, for the test functions $f_6$ and $f_{10}$, the running time of the HGGWA algorithm is the shortest. Further analysis shows that HGGWA adds a little computing time compared to GWO because of the improved strategies of selection, crossover, and mutation; nevertheless, the convergence speed of HGGWA is still better than that of the SSA and WOA algorithms. Overall, the convergence speed of the HGGWA algorithm compares favorably with the other algorithms, and its running time across the different test functions stays between 1 s and 3 s, which shows that the HGGWA algorithm has excellent stability.

5.3. Results on CEC'2008 Benchmark Functions. This section analyzes the effectiveness of HGGWA on the CEC'2008 large-scale benchmark functions and compares it with four powerful algorithms that have been proven effective in solving large-scale optimization problems. Experimental results are provided to analyze the performance of HGGWA for large-scale optimization problems.

We tested our proposed algorithm on the CEC'2008 functions for 100, 500, and 1000 dimensions, and the means of the best fitness values over 50 runs were recorded. While F1 (Shifted Sphere), F4 (Shifted Rastrigin), and F6 (Shifted Ackley) are separable functions, F2 (Schwefel Problem), F3 (Shifted Rosenbrock), F5 (Shifted Griewank), and F7 (Fast Fractal) are nonseparable, presenting a greater challenge to any algorithm that is prone to variable interactions. The performance of HGGWA is compared with that of CCPSO2 [31], DEwSAcc [5], MLCC [38], and EPUS-PSO [49] for 100, 500, and 1000 dimensions, respectively. The maximum number of fitness evaluations (FEs) was calculated as FEs = 5000 × D, where D is the number of dimensions. To ensure a fair comparison, the optimization results of the other four algorithms are taken directly from the original literature. The specific comparison results are shown in Table 5.

Table 5 shows the experimental results; the entries shown in bold are significantly better than those of the other algorithms. A general trend is that the HGGWA algorithm has a promising optimization effect on most of the 7 functions.

Generally speaking, HGGWA outperforms CCPSO2 on 6 out of 7 functions with 100 and 500 dimensions, respectively.

Table 4: Comparison results (Mean, Std, and SR) of the 10 high-dimensional benchmark functions (D = 100, 500, 1000) obtained by WOA, SSA, ALO, and HGGWA.

Figure 2: Convergence curves (objective function value versus number of iterations) of the WOA, SSA, ALO, and HGGWA algorithms on the (a) Sphere, (b) Schwefel 2.22, (c) Rastrigin, (d) Griewank, (e) Quartic, and (f) Ackley functions.

Figure 3: (a) The average running times and (b) the average numbers of iterations of SSA, WOA, GWO, and HGGWA on the benchmark functions (D = 100).

Table 5: Comparison results (Mean and Std) of the 7 CEC'2008 benchmark functions (D = 100, 500, 1000) obtained by CCPSO2, DEwSAcc, MLCC, EPUS-PSO, and HGGWA.


Among them, the result of HGGWA is slightly worse than that of CCPSO2 for function F7 in the 100-dimensional case, and the two algorithms obtain similar optimization results for function F6 in the 500-dimensional case. In addition, at first glance it might seem that HGGWA and MLCC have similar performance; however, HGGWA achieves better convergence accuracy than MLCC under the same number of iterations. In particular, HGGWA's optimization results are superior to those of DEwSAcc and EPUS-PSO in all dimensions. The results of HGGWA scaled very well from 100-D to 1000-D on F1, F4, F5, and F6, outperforming DEwSAcc and EPUS-PSO in all the cases.

5.4. Sensitivity Analysis of the Nonlinear Adjustment Coefficient k. This section discusses the effect of the nonlinear adjustment coefficient on the convergence performance of the algorithm. In the HGGWA algorithm, the role of the parameter $a$ is to balance the global exploration and local exploitation capabilities. In (12), the nonlinear adjustment coefficient $k$ is a key parameter, mainly used to control the range of variation of the convergence factor. Therefore, we selected four different values, 0.5, 1, 1.5, and 2, for numerical experiments to analyze the effect of the nonlinear adjustment coefficient on the performance of the algorithm; the results of the comparison are shown in Table 6, where black bold represents the best result among the compared settings.

In general, the value of the nonlinear adjustment coefficient k has no significant effect on the performance of the HGGWA algorithm. On closer inspection, one can see that, except for function f3, the optimization performance of the algorithm is best when the nonlinear adjustment coefficient k is 0.5, whereas for function f3, k = 1.5 is the best choice. For functions f7 and f9, the HGGWA algorithm obtains the optimal solution 0 for all tested values of the coefficient k. Therefore, for most test functions, the optimal value of the nonlinear adjustment coefficient k is 0.5.

5.5. Global Convergence Analysis of HGGWA. In the HGGWA algorithm, the following four improved strategies are used to ensure the global convergence of the algorithm: (1) an initial population strategy based on Opposition-Based Learning; (2) a selection operation based on optimal retention; (3) a crossover operation based on the population partitioning mechanism; and (4) a mutation operation for elite individuals.

The idea of solving high-dimensional function optimization problems with the HGGWA algorithm is as follows. Firstly, consider the initial population strategy: the basic GWO generates the initial population in a random manner, which can greatly affect the search efficiency of the algorithm on high-dimensional function optimization problems. The algorithm cannot effectively search the entire solution space if the initial grey wolf population is clustered in a small range. Therefore, the Opposition-Based Learning strategy is used to initialize the population, which makes the initial population evenly distributed in the solution space, thus laying a good foundation for the global search of the algorithm. Secondly, the optimal retention strategy is used to preserve the optimal individual of the parent population

in the process of population evolution. As a result, the next generation of the population can evolve in the optimal direction. In addition, the individuals with higher fitness are selected by the roulette-wheel operation, which maintains the dominance relationship between individuals in the population; this dominance relationship gives the algorithm good global convergence. Thirdly, the crossover operation is completed under the condition of population partitioning, in order to achieve dimension reduction and maintain the diversity of the population. Finally, the search direction of the grey wolf population is guided by the mutation operation on the elite individuals, which effectively prevents the algorithm from falling into a local optimal solution. All in all, through these four improved strategies, the HGGWA algorithm shows good global convergence in solving high-dimensional function optimization problems.

6. Conclusion

In order to overcome the shortcoming of the GWO algorithm in solving large-scale global optimization problems, namely that it easily falls into local optima, this paper proposes a Hybrid Genetic Grey Wolf Algorithm (HGGWA) by integrating three genetic operators into the algorithm. The improved algorithm initializes the population based on the Opposition-Based Learning strategy to improve the search efficiency of the algorithm. The strategy of population division based on cooperative coevolution reduces the scale of the problem, and the mutation operation on the elite individuals effectively prevents the algorithm from falling into local optima. The performance of HGGWA has been evaluated using 10 classical benchmark functions and 7 CEC'2008 high-dimensional functions. From our experimental results, several conclusions can be drawn.

The results have shown that it is capable of grouping interacting variables with great accuracy for the majority of the benchmark functions. A comparative study among HGGWA and other state-of-the-art algorithms was conducted, and the experimental results showed that the HGGWA algorithm exhibits better global convergence whether it is solving separable functions or nonseparable functions.

(1) In tests on 10 classical benchmark functions, compared with the results of WOA, SSA, and ALO, the HGGWA algorithm greatly improves convergence accuracy. Except for the Schwefel problem and the Rosenbrock function, the HGGWA algorithm achieves an 80%–100% successful ratio for all benchmark functions with 100 dimensions, which proves the effectiveness of HGGWA for solving high-dimensional functions.

(2) Comparing the optimization results on the seven CEC'2008 large-scale global optimization problems shows that the HGGWA algorithm achieves the global optimal value for the separable functions F1, F4, and F6, but its effect on the other, nonseparable problems is not fully satisfactory. However, the HGGWA algorithm still exhibits better global convergence than the other algorithms, CCPSO2, DEwSAcc, MLCC, and EPUS-PSO, on most of the functions.


Table 6: Comparison of the optimization performance of different k values for HGGWA (Mean / Std).

Function | k = 0.5 | k = 1 | k = 1.5 | k = 2
f1  | 8.43E-60 / 5.36E-60 | 2.25E-54 / 2.03E-54 | 5.58E-53 / 4.31E-53 | 2.31E-51 / 3.16E-51
f2  | 2.89E-34 / 2.31E-34 | 2.06E-33 / 1.89E-33 | 4.70E-32 / 3.65E-32 | 5.76E-30 / 4.22E-30
f3  | 1.11E+00 / 1.02E+00 | 3.14E+01 / 2.19E+01 | 4.03E-01 / 2.38E-01 | 3.60E+01 / 2.16E+01
f4  | 9.78E+01 / 8.36E+01 | 9.82E+01 / 7.43E+01 | 9.81E+01 / 7.34E+01 | 9.83E+01 / 4.61E+01
f5  | 2.18E-03 / 1.34E-03 | 8.87E-02 / 6.67E-02 | 7.81E+00 / 6.13E+00 | 1.74E-02 / 2.34E-02
f6  | 1.31E-03 / 1.13E-03 | 1.62E-03 / 2.43E-04 | 2.99E-03 / 1.04E-03 | 1.84E-03 / 1.34E-03
f7  | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00
f8  | 1.58E-14 / 1.23E-15 | 2.93E-14 / 1.37E-14 | 2.22E-14 / 1.49E-14 | 2.51E-14 / 1.08E-14
f9  | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00
f10 | 2.99E-02 / 2.49E-02 | 3.38E-02 / 1.24E-03 | 8.42E-02 / 6.14E-02 | 7.29E-02 / 8.13E-03


(3) An analysis of the running time of several algorithms shows that the convergence time of the HGGWA algorithm has obvious advantages compared with recently proposed algorithms such as SSA and WOA. For the functions f1, f2, f6, f7, f8, f9, and f10, the running time of the HGGWA algorithm stays between 1 s and 3 s under the specified convergence accuracy, which shows the excellent stability of HGGWA.

(4) In the future, we are planning to investigate more efficient population partition mechanisms to adapt to different nonseparable problems in the process of the crossover operation. We are also interested in applying HGGWA to real-world problems to ascertain its true potential as a valuable optimization technique for large-scale optimization, such as the setup of large-scale multilayer sensor networks [73] in the Internet of Things.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Authors' Contributions

Qinghua Gu and Xuexian Li contributed equally to this work.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Grant Nos. 51774228 and 51404182), the Natural Science Foundation of Shaanxi Province (2017JM5043), and the Foundation of Shaanxi Educational Committee (17JK0425).

References

[1] R. Zhang and C. Wu, "A hybrid approach to large-scale job shop scheduling," Applied Intelligence, vol. 32, no. 1, pp. 47–59, 2010.

[2] M. Gendreau and C. D. Tarantilis, Solving Large-Scale Vehicle Routing Problems with Time Windows: The State-of-the-Art, Cirrelt, Montreal, 2010.

[3] M. B. Liu, S. K. Tso, and Y. Cheng, "An extended nonlinear primal-dual interior-point algorithm for reactive-power optimization of large-scale power systems with discrete control variables," IEEE Power Engineering Review, vol. 22, no. 9, p. 56, 2002.

[4] I. Fister, I. Fister Jr., J. Brest, and V. Zumer, "Memetic artificial bee colony algorithm for large-scale global optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '12), pp. 1–8, IEEE, Brisbane, Australia, June 2012.

[5] A. Zamuda, J. Brest, B. Bosovic, and V. Zumer, "Large scale global optimization using differential evolution with self-adaptation and cooperative co-evolution," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), IEEE, June 2008.

[6] T. Weise, R. Chiong, and K. Tang, "Evolutionary optimization: pitfalls and booby traps," Journal of Computer Science and Technology, vol. 27, no. 5, pp. 907–936, 2012.

[7] L. Lafleur, Descartes: Discourse on Method, Pearson Schweiz Ag, Zug, Switzerland, 1956.

[8] Y.-S. Ong, M. H. Lim, and X. Chen, "Memetic computation – past, present & future [Research Frontier]," IEEE Computational Intelligence Magazine, vol. 5, no. 2, pp. 24–31, 2010.

[9] N. Krasnogor and J. Smith, "A tutorial for competent memetic algorithms: model, taxonomy and design issues," IEEE Transactions on Evolutionary Computation, vol. 9, no. 5, pp. 474–488, 2005.

[10] S. Jiang, M. Lian, C. Lu, Q. Gu, S. Ruan, and X. Xie, "Ensemble prediction algorithm of anomaly monitoring based on big data analysis platform of open-pit mine slope," Complexity, vol. 2018, Article ID 1048756, 13 pages, 2018.

[11] R. S. Rahnamayan, H. R. Tizhoosh, and M. M. A. Salama, "Opposition-based differential evolution," IEEE Transactions on Evolutionary Computation, vol. 12, no. 1, pp. 64–79, 2008.

[12] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.

[13] E. Daniel, J. Anitha, K. K. Kamaleshwaran, and I. Rani, "Optimum spectrum mask based medical image fusion using gray wolf optimization," Biomedical Signal Processing and Control, vol. 34, pp. 36–43, 2017.

[14] A. A. M. El-Gaafary, Y. S. Mohamed, A. M. Hemeida, and A.-A. A. Mohamed, "Grey wolf optimization for multi input multi output system," Universal Journal of Communications and Network, vol. 3, no. 1, pp. 1–6, 2015.

[15] G. M. Komaki and V. Kayvanfar, "Grey wolf optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109–120, 2015.

[16] S. Jiang, M. Lian, C. Lu et al., "SVM-DS fusion based soft fault detection and diagnosis in solar water heaters," Energy Exploration & Exploitation, 2018.

[17] Q. Gu, X. Li, C. Lu et al., "Hybrid genetic grey wolf algorithm for high dimensional complex function optimization," Control and Decision, pp. 1–8, 2019.

[18] M. A. Potter and K. A. D. Jong, "A cooperative coevolutionary approach to function optimization," in Parallel Problem Solving from Nature – PPSN III, vol. 866 of Lecture Notes in Computer Science, pp. 249–257, Springer, Berlin, Germany, 1994.

[19] M. A. Potter, The Design and Analysis of a Computational Model of Cooperative Coevolution, George Mason University, Fairfax, Va, USA, 1997.

[20] F. van den Bergh and A. P. Engelbrecht, "A cooperative approach to particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225–239, 2004.

[21] M. El-Abd, "A cooperative approach to the artificial bee colony algorithm," in Proceedings of the 2010 IEEE Congress on Evolutionary Computation (CEC 2010), IEEE, Spain, July 2010.

[22] Y. Shi, H. Teng, and Z. Li, "Cooperative co-evolutionary differential evolution for function optimization," in Proceedings of the 1st International Conference on Natural Computation (ICNC '05), Lecture Notes in Computer Science, pp. 1080–1088, August 2005.

[23] Z. Yang, K. Tang, and X. Yao, "Large scale evolutionary optimization using cooperative coevolution," Information Sciences, vol. 178, no. 15, pp. 2985–2999, 2008.


[24] X. Li and X. Yao, "Tackling high dimensional nonseparable optimization problems by cooperatively coevolving particle swarms," in Proceedings of the Congress on Evolutionary Computation (CEC '09), pp. 1546–1553, IEEE, Trondheim, Norway, May 2009.

[25] M. N. Omidvar, X. Li, Z. Yang, and X. Yao, "Cooperative co-evolution for large scale optimization through more frequent random grouping," in Proceedings of the 2010 IEEE Congress on Evolutionary Computation (CEC 2010), Spain, July 2010.

[26] T. Ray and X. Yao, "A cooperative coevolutionary algorithm with correlation based adaptive variable partitioning," in Proceedings of the 2009 IEEE Congress on Evolutionary Computation (CEC), pp. 983–989, IEEE, Trondheim, Norway, May 2009.

[27] W. Chen, T. Weise, Z. Yang, and K. Tang, "Large-scale global optimization using cooperative coevolution with variable interaction learning," in Lecture Notes in Computer Science, vol. 6239, pp. 300–309, 2010.

[28] M. N. Omidvar, X. Li, Y. Mei, and X. Yao, "Cooperative co-evolution with differential grouping for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 18, no. 3, pp. 378–393, 2014.

[29] M. N. Omidvar, X. Li, and X. Yao, "Smart use of computational resources based on contribution for cooperative co-evolutionary algorithms," in Proceedings of the 13th Annual Genetic and Evolutionary Computation Conference (GECCO '11), pp. 1115–1122, IEEE, Ireland, July 2011.

[30] R. Storn, "On the usage of differential evolution for function optimization," in Proceedings of the Biennial Conference of the North American Fuzzy Information Processing Society (NAFIPS '96), pp. 519–523, June 1996.

[31] X. Li and X. Yao, "Cooperatively coevolving particle swarms for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 16, no. 2, pp. 210–224, 2012.

[32] K. Weicker and N. Weicker, "On the improvement of coevolutionary optimizers by learning variable interdependencies," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC 1999), pp. 1627–1632, IEEE, July 1999.

[33] E. Sayed, D. Essam, and R. Sarker, "Using hybrid dependency identification with a memetic algorithm for large scale optimization problems," in Lecture Notes in Computer Science, vol. 7673, pp. 168–177, 2012.

[34] Y. Liu, X. Yao, Q. Zhao, and T. Higuchi, "Scaling up fast evolutionary programming with cooperative coevolution," in Proceedings of the 2001 Congress on Evolutionary Computation, pp. 1101–1108, IEEE, Seoul, South Korea, 2001.

[35] E. Sayed, D. Essam, and R. Sarker, "Dependency identification technique for large scale optimization problems," in Proceedings of the 2012 IEEE Congress on Evolutionary Computation (CEC 2012), Australia, June 2012.

[36] Y. Ren and Y. Wu, "An efficient algorithm for high-dimensional function optimization," Soft Computing, vol. 17, no. 6, pp. 995–1004, 2013.

[37] J. Liu and K. Tang, "Scaling up covariance matrix adaptation evolution strategy using cooperative coevolution," in Proceedings of the International Conference on Intelligent Data Engineering and Automated Learning (IDEAL 2013), vol. 8206 of Lecture Notes in Computer Science, pp. 350–357, Springer, Berlin, Germany, 2013.

[38] Z. Yang, K. Tang, and X. Yao, "Multilevel cooperative coevolution for large scale optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), pp. 1663–1670, June 2008.

[39] M. N. Omidvar, Y. Mei, and X. Li, "Effective decomposition of large-scale separable continuous functions for cooperative co-evolutionary algorithms," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC 2014), pp. 1305–1312, China, July 2014.

[40] S. Mahdavi, E. M. Shiri, and S. Rahnamayan, "Cooperative co-evolution with a new decomposition method for large-scale optimization," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC), 2014.

[41] B. Kazimipour, M. N. Omidvar, X. Li, and A. K. Qin, "A sensitivity analysis of contribution-based cooperative co-evolutionary algorithms," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2015), pp. 417–424, Japan, May 2015.

[42] Z. Cao, L. Wang, Y. Shi et al., "An effective cooperative coevolution framework integrating global and local search for large scale optimization problems," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2015), pp. 1986–1993, Japan, May 2015.

[43] X. Peng, K. Liu, and Y. Jin, "A dynamic optimization approach to the design of cooperative co-evolutionary algorithms," Knowledge-Based Systems, vol. 109, pp. 174–186, 2016.

[44] H. K. Singh and T. Ray, "Divide and conquer in coevolution: a difficult balancing act," in Agent-Based Evolutionary Search, vol. 5 of Adaptation, Learning, and Optimization, pp. 117–138, Springer, Berlin, Germany, 2010.

[45] X. Peng and Y. Wu, "Large-scale cooperative co-evolution using niching-based multi-modal optimization and adaptive fast clustering," Swarm & Evolutionary Computation, vol. 35, 2017.

[46] M. Yang, M. N. Omidvar, C. Li et al., "Efficient resource allocation in cooperative co-evolution for large-scale global optimization," IEEE Transactions on Evolutionary Computation, vol. 21, no. 4, pp. 493–505, 2017.

[47] M. N. Omidvar, X. Li, and X. Yao, "Cooperative co-evolution with delta grouping for large scale non-separable function optimization," in Proceedings of the 2010 IEEE Congress on Evolutionary Computation (CEC 2010), pp. 1–8, Barcelona, Spain, July 2010.

[48] X. Peng and Y. Wu, "Enhancing cooperative coevolution with selective multiple populations for large-scale global optimization," Complexity, vol. 2018, Article ID 9267054, 15 pages, 2018.

[49] S.-T. Hsieh, T.-Y. Sun, C.-C. Liu, and S.-J. Tsai, "Solving large scale global optimization using improved particle swarm optimizer," in Proceedings of the 2008 IEEE Congress on Evolutionary Computation (CEC 2008), pp. 1777–1784, China, June 2008.

[50] T. Hatanaka, T. Korenaga, N. Kondo et al., Search Performance Improvement for PSO in High Dimensional Space, InTech, 2009.

[51] J. Fan, J. Wang, and M. Han, "Cooperative coevolution for large-scale optimization based on kernel fuzzy clustering and variable trust region methods," IEEE Transactions on Fuzzy Systems, vol. 22, no. 4, pp. 829–839, 2014.

[52] A.-R. Hedar and A. F. Ali, "Genetic algorithm with population partitioning and space reduction for high dimensional problems," in Proceedings of the 2009 International Conference on Computer Engineering and Systems (ICCES '09), pp. 151–156, Egypt, December 2009.


[53] J. Q. Zhang and A. C. Sanderson, "JADE: adaptive differential evolution with optional external archive," IEEE Transactions on Evolutionary Computation, vol. 13, no. 5, pp. 945–958, 2009.

[54] L. M. Hvattum and F. Glover, "Finding local optima of high-dimensional functions using direct search methods," European Journal of Operational Research, vol. 195, no. 1, pp. 31–45, 2009.

[55] C. Liu and B. Li, "Memetic algorithm with adaptive local search depth for large scale global optimization," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC 2014), pp. 82–88, China, July 2014.

[56] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257–1263, 2015.

[57] A. Zhu, C. Xu, Z. Li, J. Wu, and Z. Liu, "Hybridizing grey wolf optimization with differential evolution for global optimization and test scheduling for 3D stacked SoC," Journal of Systems Engineering and Electronics, vol. 26, no. 2, pp. 317–328, 2015.

[58] V. Chahar and D. Kumar, "An astrophysics-inspired grey wolf algorithm for numerical optimization and its application to engineering design problems," Advances in Engineering Software, vol. 112, pp. 231–254, 2017.

[59] D. Guha, P. K. Roy, and S. Banerjee, "Load frequency control of large scale power system using quasi-oppositional grey wolf optimization algorithm," Engineering Science and Technology, an International Journal, vol. 19, no. 4, pp. 1693–1713, 2016.

[60] C. Lu, S. Xiao, X. Li, and L. Gao, "An effective multi-objective discrete grey wolf optimizer for a real-world scheduling problem in welding production," Advances in Engineering Software, vol. 99, pp. 161–176, 2016.

[61] E. Emary, H. M. Zawbaa, and A. E. Hassanien, "Binary grey wolf optimization approaches for feature selection," Neurocomputing, vol. 172, pp. 371–381, 2016.

[62] W. Long, J. Jiao, X. Liang, and M. Tang, "Inspired grey wolf optimizer for solving large-scale function optimization problems," Applied Mathematical Modelling, vol. 60, pp. 112–126, 2018.

[63] S. Gupta and K. Deep, "Performance of grey wolf optimizer on large scale problems," in Proceedings of the Mathematical Sciences and Its Applications, American Institute of Physics Conference Series, Uttar Pradesh, India, 2017.

[64] R. L. Haupt and S. E. Haupt, "Practical genetic algorithms," Journal of the American Statistical Association, vol. 100, no. 100, p. 253, 2005.

[65] H. R. Tizhoosh, "Opposition-based learning: a new scheme for machine intelligence," in Proceedings of the International Conference on Computational Intelligence for Modelling, Control and Automation (CIMCA) and International Conference on Intelligent Agents, Web Technologies and Internet Commerce (IAWTIC '05), pp. 695–701, Vienna, Austria, November 2005.

[66] H. Wang, Z. Wu, S. Rahnamayan, Y. Liu, and M. Ventresca, "Enhancing particle swarm optimization using generalized opposition-based learning," Information Sciences, vol. 181, no. 20, pp. 4699–4714, 2011.

[67] H. Wang, S. Rahnamayan, and Z. Wu, "Parallel differential evolution with self-adapting control parameters and generalized opposition-based learning for solving high-dimensional optimization problems," Journal of Parallel and Distributed Computing, vol. 73, no. 1, pp. 62–73, 2013.

[68] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.

[69] A. P. Engelbrecht, Fundamentals of Computational Swarm Intelligence, John Wiley & Sons, 2006.

[70] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.

[71] S. Mirjalili, A. H. Gandomi, S. Z. Mirjalili, S. Saremi, H. Faris, and S. M. Mirjalili, "Salp Swarm Algorithm: a bio-inspired optimizer for engineering design problems," Advances in Engineering Software, vol. 114, pp. 163–191, 2017.

[72] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80–98, 2015.

[73] Q. Gu, S. Jiang, M. Lian, and C. Lu, "Health and safety situation awareness model and emergency management based on multi-sensor signal fusion," IEEE Access, vol. 7, pp. 958–968, 2019.




GWO is a powerful nature-inspired optimization algorithm for large-scale problems, except on the Rosenbrock function.

In summary, the decomposition-based CC strategy has a good effect in solving nonseparable problems, but it cannot handle the imbalance problem well, and it still needs improvement in solution accuracy. In addition, among the methods based on nondecomposition strategies, although the performance of many algorithms has been greatly improved, research on hybrid algorithms is still rare. Therefore, this paper combines the three genetic operators of selection, crossover, and mutation with the GWO algorithm and proposes a Hybrid Genetic Grey Wolf Algorithm (HGGWA). The main improvement strategies of the algorithm are as follows: (1) initialize the population based on the Opposition-Based Learning strategy; (2) adjust the parameters nonlinearly to effectively balance exploration and exploitation capabilities while accelerating the convergence speed of the algorithm; (3) select the population through a combination of the elite retention strategy and the roulette wheel; (4) based on the method of dimensionality reduction and population division, divide the whole population into multiple subpopulations for the crossover operation to increase the diversity of the population; (5) perform mutation operations on elite individuals in the population to prevent the algorithm from falling into local optima.

3. Grey Wolf Optimization Algorithm

3.1. Mathematical Models. In the GWO algorithm [12], the positions of the α wolf, β wolf, and δ wolf are the fittest solution, the second-best solution, and the third-best solution, respectively, and the positions of the ω wolves are the remaining candidate solutions. In the whole process of searching for prey in the space, the ω wolves gradually update their positions according to the optimal positions of the α, β, and δ wolves, so they can gradually approach and capture the prey, that is, complete the process of searching for the optimal solution. A schematic diagram of position updating in GWO is shown in Figure 1.

The grey wolf position update formula in this process is as follows:

$\vec{D} = |\vec{C} \cdot \vec{X}_p(t) - \vec{X}(t)|$  (1)

$\vec{X}(t+1) = \vec{X}_p(t) - \vec{A} \cdot \vec{D}$  (2)

where $\vec{D}$ indicates the distance between the grey wolf individual and the prey, $t$ indicates the current iteration, $\vec{A}$ and $\vec{C}$ are coefficient vectors, $\vec{A}$ is a convergence coefficient used to balance exploration and exploitation, $\vec{C}$ is used to simulate the effects of nature, $\vec{X}_p$ is the position vector of the prey, and $\vec{X}$ indicates the position vector of a grey wolf.

The coefficient vectors $\vec{A}$ and $\vec{C}$ are calculated as follows:

$\vec{A} = 2\vec{a} \cdot \vec{r}_1 - \vec{a}$  (3)

$\vec{C} = 2\vec{r}_2$  (4)

where the components of $\vec{a}$ are linearly decreased from 2 to 0 over the course of iterations and $\vec{r}_1$ and $\vec{r}_2$ are random vectors in $[0, 1]$. At the same time, the fluctuation range of the coefficient vector $\vec{A}$ gradually decreases as $\vec{a}$ decreases.

Thus, according to (1) and (2), the grey wolf individual randomly moves within the search space to gradually approach the optimal solution. The specific location update formulas are as follows:

$\vec{D}_\alpha = |\vec{C}_1 \cdot \vec{X}_\alpha - \vec{X}|$  (5)

$\vec{D}_\beta = |\vec{C}_2 \cdot \vec{X}_\beta - \vec{X}|$  (6)

$\vec{D}_\delta = |\vec{C}_3 \cdot \vec{X}_\delta - \vec{X}|$  (7)

$\vec{X}_1 = \vec{X}_\alpha - \vec{A}_1 \cdot \vec{D}_\alpha$  (8)

$\vec{X}_2 = \vec{X}_\beta - \vec{A}_2 \cdot \vec{D}_\beta$  (9)

$\vec{X}_3 = \vec{X}_\delta - \vec{A}_3 \cdot \vec{D}_\delta$  (10)

$\vec{X}(t+1) = \dfrac{\vec{X}_1 + \vec{X}_2 + \vec{X}_3}{3}$  (11)

where $\vec{D}_\alpha$, $\vec{D}_\beta$, and $\vec{D}_\delta$ indicate the distances between the α, β, and δ wolves and the prey, respectively; $\vec{X}_1$, $\vec{X}_2$, and $\vec{X}_3$ represent the positional components of the remaining grey wolves based on the positions of the α, β, and δ wolves, respectively; and $\vec{X}(t+1)$ represents the position vector after the grey wolf is updated.

3.2. Basic Process of GWO

Step 1. Randomly generate an initial population and initialize the parameters $\vec{a}$, $\vec{A}$, and $\vec{C}$.

Step 2. Calculate the fitness of each grey wolf individual and save the positions of the three individuals with the highest fitness as $\vec{X}_\alpha$, $\vec{X}_\beta$, and $\vec{X}_\delta$.

Step 3. Update the position of each grey wolf according to (5)–(11), thereby obtaining the next generation of the population, and then update the values of the parameters $\vec{a}$, $\vec{A}$, and $\vec{C}$.

Step 4. Calculate the fitness of the individuals in the new population and update $\vec{X}_\alpha$, $\vec{X}_\beta$, and $\vec{X}_\delta$.

Step 5. Repeat Steps 2–4 until the optimal solution is obtained or the maximum number of iterations is reached.
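To make these steps concrete, the following is a minimal Python sketch of one GWO iteration, transcribing equations (3)–(11) directly; the population array X, the fitness callable, and the current value of a are assumed inputs (the function name gwo_step is illustrative, not from the paper), and minimization is assumed.

import numpy as np

def gwo_step(X, fitness, a):
    """One GWO iteration: move every agent toward the alpha, beta,
    and delta wolves according to equations (5)-(11)."""
    order = np.argsort([fitness(x) for x in X])       # best first (minimization)
    x_alpha, x_beta, x_delta = X[order[0]], X[order[1]], X[order[2]]
    n, dim = X.shape
    X_new = np.empty_like(X)
    for i in range(n):
        components = []
        for leader in (x_alpha, x_beta, x_delta):
            r1, r2 = np.random.rand(dim), np.random.rand(dim)
            A = 2 * a * r1 - a                        # equation (3)
            C = 2 * r2                                # equation (4)
            D = np.abs(C * leader - X[i])             # equations (5)-(7)
            components.append(leader - A * D)         # equations (8)-(10)
        X_new[i] = sum(components) / 3.0              # equation (11)
    return X_new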

4. Hybrid Genetic Grey Wolf Algorithm

This section describes the details of HGGWA, the hybrid algorithm proposed in this paper. Firstly, the initial population strategy based on Opposition-Based Learning is


Figure 1: Position updating in GWO (the estimated position of the prey is determined by the α, β, and δ wolves).

described in detail in Section 4.1. Secondly, Section 4.2 introduces the nonlinear adjustment strategy for parameter a. Then, the three operations of selection, crossover, and mutation are described in Sections 4.3–4.5, respectively. Finally, the time complexity of HGGWA is analyzed in Section 4.6.

4.1. Initial Population Strategy Based on Opposition-Based Learning. For swarm intelligence optimization algorithms based on population iteration, the diversity of the initial population lays the foundation for the efficiency of the algorithm: better population diversity reduces the computation time and improves the global convergence of the algorithm [64]. However, like other algorithms, the GWO algorithm uses random initialization when generating populations, which can affect the search efficiency of the algorithm. Opposition-Based Learning is a strategy proposed by Tizhoosh [65] to improve the efficiency of the search. It uses the opposite points of known individual positions to generate new individual positions, thereby increasing the diversity of the search population. Currently, the Opposition-Based Learning strategy has been successfully applied to multiple swarm optimization algorithms [66, 67]. The opposite point is defined as follows.

Assume that there is an individual $X = (x_1, x_2, \ldots, x_D)$ in the population P. Then the opposite point of the element $x_i$ ($i = 1, 2, \ldots, D$) on each dimension is $x_i' = l_i + u_i - x_i$, where $l_i$ and $u_i$ are the lower and upper bounds of the $i$th dimension, respectively. According to the above definition, the Opposition-Based Learning strategy initializes the population as follows.

(a) Randomly initialize the population P, calculate the opposite point $x_i'$ of each individual position $x_i$, and let all the opposite points constitute the opposition population P'.

(b) Calculate the fitness of each individual in the initial population P and the opposition population P' and arrange them in descending order.

(c) Select the top N grey wolf individuals with the highest fitness as the final initial population.
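A compact sketch of this initialization follows, assuming a minimization problem so that "highest fitness" corresponds to the smallest objective value; the function name obl_initialize and its arguments are illustrative, not from the paper.

import numpy as np

def obl_initialize(objective, n, dim, lower, upper):
    """Opposition-Based Learning initialization: build a random
    population P and its opposition population P', then keep the
    n fittest individuals of their union."""
    P = lower + np.random.rand(n, dim) * (upper - lower)
    P_opp = lower + upper - P                 # opposite point: x' = l + u - x
    union = np.vstack([P, P_opp])
    values = np.apply_along_axis(objective, 1, union)
    return union[np.argsort(values)[:n]]      # n best individuals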

4.2. Parameter Nonlinear Adjustment Strategy. How to balance exploration and exploitation is significantly important for swarm intelligence algorithms. In the early iterations of the algorithm, a powerful exploration capability helps the algorithm expand the search range so that it can find the optimal solution with greater probability. In the later stage of the algorithm's iteration, an effective exploitation ability can speed up the optimization process and improve the convergence accuracy of the final solution. Therefore, only when a swarm optimization algorithm properly coordinates its global exploration and local exploitation capabilities can it have strong robustness and fast convergence speed.

It can be seen from (2) and (3) that the dynamic change of the parameter a plays a crucial role in the algorithm. In the basic GWO, the linear decrement strategy of parameter a does not reflect the actual convergence process of the algorithm; therefore, the algorithm does not balance the exploration and exploitation capabilities well, especially when solving large-scale multimodal function problems. In order to dynamically adjust the global exploration and local exploitation processes of the algorithm, this paper proposes a nonlinear adjustment strategy for the parameter a, which helps HGGWA search for the optimal solution. The improved nonlinear parameter adjustment equation is as follows:

$a = a_{\max} + (a_{\min} - a_{\max}) \cdot \left( \dfrac{1}{1 + e^{t/Max_{iter}}} \right)^k$  (12)

where $a_{\max}$ and $a_{\min}$ are the initial value and the final value of the parameter $a$, $t$ is the current iteration index, $Max_{iter}$ is the maximum number of iterations, and $k$ is the nonlinear adjustment coefficient.

Thus, the variation range of the convergence factor A is controlled by the nonlinear adjustment of the parameter a. When |A| > 1, the grey wolf population expands the search range to find better prey, which corresponds to the global exploration of the algorithm; when |A| < 1,


the whole population shrinks the search range, and an encirclement is formed around the prey to complete the final attack on the prey, which corresponds to the local exploitation process of the algorithm. The whole process has a positive effect on balancing global search and local search, which helps improve the accuracy of the solution and further accelerates the convergence speed of the algorithm.

4.3. Selection Operation Based on Optimal Retention. For intelligent optimization algorithms based on population evolution, the evolution of each generation directly affects the optimization effect of the algorithm. In order to pass the superior individuals of the parent population to the next generation without being destroyed, it is necessary to preserve the good individuals directly. The optimal retention selection strategy is an effective way to preserve good individuals in genetic algorithms; this paper integrates it into the GWO to improve the efficiency of the algorithm. Assuming that the current population is $P(X_1, X_2, \ldots, X_D)$ and the fitness of the individual $X_i$ is $F_i$, the specific operation is as follows.

(a) Calculate the fitness $F_i$ ($i = 1, 2, \ldots, D$) of each individual and arrange them in descending order.

(b) Select the individual with the highest fitness and copy it directly into the next generation of the population.

(c) Calculate the total fitness S of the remaining individuals and the probability $p_i$ that each individual is selected:

$S = \sum_{i=1}^{D-1} F_i \quad (i = 1, 2, \ldots, D-1)$  (13)

$p_i = \dfrac{F_i}{\sum_{i=1}^{D-1} F_i} \quad (i = 1, 2, \ldots, D-1)$  (14)

(d) Calculate the cumulative fitness value $s_i$ for each individual, and then perform the selection operation by roulette wheel until the number of individuals in the child population matches that of the parent population:

$s_i = \dfrac{\sum_{j=1}^{i} F_j}{S} \quad (i = 1, 2, \ldots, D-1)$  (15)
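The two-part selection can be sketched as follows, assuming fitness values are non-negative and larger-is-better, as the roulette-wheel formulas (13)–(15) require; the helper name select_with_elitism is illustrative.

import numpy as np

def select_with_elitism(P, fit):
    """Optimal-retention selection: copy the fittest individual
    directly, then fill the rest of the next generation by
    roulette-wheel selection (equations (13)-(15))."""
    order = np.argsort(fit)[::-1]                   # descending fitness
    elite = P[order[0]]
    rest, rest_fit = P[order[1:]], fit[order[1:]]
    s = np.cumsum(rest_fit / rest_fit.sum())        # cumulative s_i, eq. (15)
    children = [rest[np.searchsorted(s, np.random.rand())]
                for _ in range(len(P) - 1)]
    return np.vstack([elite] + children)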

4.4. Crossover Operation Based on Population Partitioning Mechanism. Due to the decrease of population diversity in late evolution, the GWO algorithm easily falls into local optima when solving large-scale, high-dimensional optimization problems. In order to overcome the problems caused by large scale and complexity, and to ensure that the algorithm searches the entire solution space,

in the HGGWA algorithm the whole population P is divided into $\nu \times \eta$ subpopulations $P_{ij}$ ($i = 1, \ldots, \nu$; $j = 1, \ldots, \eta$). The optimal size of each subpopulation $P_{ij}$ was experimentally determined to be 5 × 5. The individuals of each subpopulation are

cross-operated to increase the diversity of the populationThespecific division method is shown below

The initial population 119875[[[[[[[

11990911 11990912 sdot sdot sdot 119909111988911990921 11990922 sdot sdot sdot 1199092119889 d1199091198751 1199091198752 sdot sdot sdot 119909119875119889

]]]]]]]Subpopulations 119875119894119895

[[[[[

11990911 sdot sdot sdot 11990915 d11990951 sdot sdot sdot 11990955]]]]]sdot sdot sdot [[[[[

1199091(119889minus4) sdot sdot sdot 1199091119889 d1199095(119889minus4) sdot sdot sdot 1199095119889]]]]]11987511 sdot sdot sdot 1198751120578

[[[[[

119909(119875minus4)1 sdot sdot sdot 119909(119875minus4)5 d1199091198751 sdot sdot sdot 1199091198755

]]]]]sdot sdot sdot [[[[[

119909(119875minus4)(119889minus4) sdot sdot sdot 119909(119875minus4)119889 d119909119875(119889minus4) sdot sdot sdot 119909119875119889

]]]]]119875]1 sdot sdot sdot 119875]120578

(16)

In genetic algorithms, the crossover operation plays a very important role and is the main way to generate new individuals. In the improved algorithm of this paper, each individual in a subpopulation is cross-operated in a linear crossover manner: a random number $r_i \in (0, 1)$ is generated for each individual $x_i$ in the subpopulation, and when $r_i$ is less than the crossover probability $P_c$, the corresponding individual $x_i$ is paired for the crossover operation.

An example is as follows: Crossover($p_1$, $p_2$).

(a) Generate a random number $\lambda \in (0, 1)$.

(b) Two children $c_1(c_{11}, \ldots, c_{1D})$ and $c_2(c_{21}, \ldots, c_{2D})$ are generated from the two parents $p_1(p_{11}, \ldots, p_{1D})$ and $p_2(p_{21}, \ldots, p_{2D})$:

$c_{1i} = \lambda p_{1i} + (1 - \lambda) p_{2i}, \quad i = 1, 2, \ldots, D$  (17)

$c_{2i} = \lambda p_{2i} + (1 - \lambda) p_{1i}, \quad i = 1, 2, \ldots, D$  (18)
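Equations (17) and (18) are an arithmetic (linear) crossover; a direct transcription:

import numpy as np

def linear_crossover(p1, p2):
    """Generate two children from two parent vectors, equations (17)-(18)."""
    lam = np.random.rand()
    c1 = lam * p1 + (1 - lam) * p2   # equation (17)
    c2 = lam * p2 + (1 - lam) * p1   # equation (18)
    return c1, c2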

4.5. Mutation Operation for Elite Individuals. Due to the selection operation based on the optimal preservation strategy, the grey wolf individuals in the whole population concentrate in a small optimal region in the later stage of the iterative process, which easily leads to the loss of population diversity. If the current optimal individual is a locally optimal solution, then the algorithm easily falls into a local optimum, especially when solving high-dimensional multimodal functions. To this end, this paper introduces a mutation operator into the HGGWA algorithm to perform mutation operations on the elite individuals of the population. The specific operations are as follows.

Assume that the optimal individual is $x_i = (x_1, x_2, \ldots, x_d)$ and the mutation operation is performed on $x_i$ with mutation probability $P_m$. That is, select a gene $x_k$ from the optimal individual with probability $P_m$ and replace the gene $x_k$ with a random number between the upper and lower bounds to


(1) Initialize parameters a, A, C, population size N, P_c, and P_m
(2) Initialize population P using the OBL strategy
(3) t = 0
(4) while t < Max_iter
(5)   Calculate the fitness of all search agents
(6)   X_α = the best search agent
(7)   X_β = the second best search agent
(8)   X_δ = the third best search agent
(9)   for i = 1 : N
(10)    Update the position of all search agents by equation (11)
(11)  end for
(12)  newP1 ← P except the X_α
(13)  for i = 1 : N-1
(14)    Generate newP2 by the Roulette Wheel Selection on newP1
(15)  end for
(16)  P ← newP2 and X_α
(17)  Generate subPs by the population partitioning mechanism on P
(18)  Select the individuals by crossover probability P_c
(19)  for i = 1 : N*P_c
(20)    Crossover each search agent by equations (17) and (18)
(21)  end for
(22)  Generate X'_α, X'_β, and X'_δ by equation (19) with mutation probability P_m
(23)  t = t + 1
(24) end while
(25) return X_α

Algorithm 1: Pseudocode of the HGGWA.

generate a new individual $x_i' = (x_1', x_2', \ldots, x_d')$. The specific operation is as follows:

$x_i' = \begin{cases} l + \lambda \cdot (u - l), & i = k \\ x_i, & i \neq k \end{cases}$  (19)

where $\lambda$ is a random number in $[0, 1]$ and $l$ and $u$ are the lower and upper bounds of the individual $x_i$, respectively.
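A sketch of the elite mutation of equation (19), applying the gene-wise replacement with probability P_m; the function name mutate_elite is illustrative, not from the paper.

import numpy as np

def mutate_elite(x, lower, upper, p_m=0.01):
    """Replace each gene of an elite individual, with probability
    P_m, by a uniform random value in [lower, upper] (equation (19))."""
    x_new = x.copy()
    for k in range(len(x)):
        if np.random.rand() < p_m:
            x_new[k] = lower + np.random.rand() * (upper - lower)
    return x_new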

4.6. Pseudocode of the HGGWA and Time Complexity Analysis. This section describes how to calculate an upper bound on the total number of fitness evaluations (FE) required by HGGWA. As shown in Algorithm 1, the computational complexity of HGGWA in one generation is mainly dominated by the position updating of all search agents in line (10). The position updating requires a time complexity of $O(N \cdot dim \cdot Max_{iter})$ (N is the population size, dim is the dimension of each benchmark function, and $Max_{iter}$ is the maximum number of iterations of HGGWA) to obtain the needed parameters and to finish the whole position updating process of all search agents. In addition, the crossover operation, in line (20), is another time-consuming step. It needs a time complexity of $O(2 \cdot \nu \cdot \eta \cdot Max_{iter})$ ($\nu \cdot \eta$ is the total number of subpopulations, $\nu = N/5$, $\eta = dim/5$) to complete the crossing of the individuals in each subpopulation according to the crossover probability $P_c$. It should be noted that this is only the worst-case computational complexity; if there are only two individuals in each subpopulation that need to be crossed, then the minimum amount of computation is $O(\nu \cdot \eta \cdot Max_{iter})$. Therefore, the maximum computational complexity caused by the two dominant processes of the algorithm in one generation is $\max\{O(N \cdot dim \cdot Max_{iter}), O(2 \cdot \nu \cdot \eta \cdot Max_{iter})\}$ ($\nu = N/5$, $\eta = dim/5$), i.e., $O(N \cdot dim \cdot Max_{iter})$.

5. Numerical Experiments and Analysis

In this section, the proposed HGGWA algorithm is evaluated on both classical benchmark functions [68] and the suite of test functions provided by the CEC'2008 Special Session on large-scale global optimization. The algorithms used for comparison include not only conventional EAs but also other CC optimization algorithms. Experimental results are provided to analyze the performance of HGGWA in the context of large-scale optimization problems. In addition, the sensitivity analysis of the nonlinear adjustment coefficient and the global convergence of HGGWA are discussed in Sections 5.4 and 5.5, respectively.

5.1. Benchmark Functions and Performance Measures. In order to verify the ability of the HGGWA algorithm to solve high-dimensional complex functions, 10 general high-dimensional benchmark functions (100-D, 500-D, 1000-D) were selected for the optimization test. Among them, $f_1 \sim f_6$ are unimodal functions, which are used to test the local search ability of the algorithm; $f_7 \sim f_{10}$ are multimodal functions, which are used to test the global search ability of the algorithm. These test functions have multiple unevenly distributed local optima, nonconvexity, and strong oscillation, which make it very difficult to converge to the global optimal solution, especially in the high-dimensional case. The specific


Table 2: 10 high-dimensional benchmark functions.

Sphere: $f_1(x) = \sum_{i=1}^{d} x_i^2$; range $[-100, 100]$; $f_{\min} = 0$; accuracy $1 \times 10^{-8}$.

Schwefel's 2.22: $f_2(x) = \sum_{i=1}^{d} |x_i| + \prod_{i=1}^{d} |x_i|$; range $[-10, 10]$; $f_{\min} = 0$; accuracy $1 \times 10^{-8}$.

Schwefel's 2.21: $f_3(x) = \max_i \{|x_i|, 1 \le i \le d\}$; range $[-100, 100]$; $f_{\min} = 0$; accuracy $1 \times 10^{-2}$.

Rosenbrock: $f_4(x) = \sum_{i=1}^{d-1} [100(x_{i+1} - x_i^2)^2 + (x_i - 1)^2]$; range $[-30, 30]$; $f_{\min} = 0$; accuracy $1 \times 10^{0}$.

Schwefel's 1.2: $f_5(x) = \sum_{i=1}^{d} (\sum_{j=1}^{i} x_j)^2$; range $[-100, 100]$; $f_{\min} = 0$; accuracy $1 \times 10^{-3}$.

Quartic: $f_6(x) = \sum_{i=1}^{d} i x_i^4 + random[0, 1)$; range $[-1.28, 1.28]$; $f_{\min} = 0$; accuracy $1 \times 10^{-4}$.

Rastrigin: $f_7(x) = \sum_{i=1}^{d} [x_i^2 - 10\cos(2\pi x_i) + 10]$; range $[-5.12, 5.12]$; $f_{\min} = 0$; accuracy $1 \times 10^{-8}$.

Ackley: $f_8 = -20\exp(-0.2\sqrt{\frac{1}{d}\sum_{i=1}^{d} x_i^2}) - \exp(\frac{1}{d}\sum_{i=1}^{d} \cos(2\pi x_i)) + 20 + e$; range $[-32, 32]$; $f_{\min} = 0$; accuracy $1 \times 10^{-5}$.

Griewank: $f_9 = \frac{1}{4000}\sum_{i=1}^{d} x_i^2 - \prod_{i=1}^{d} \cos(\frac{x_i}{\sqrt{i}}) + 1$; range $[-600, 600]$; $f_{\min} = 0$; accuracy $1 \times 10^{-5}$.

Penalized 1: $f_{10} = \frac{\pi}{d}\{10\sin^2(\pi y_1) + \sum_{i=1}^{d-1}(y_i - 1)^2[1 + 10\sin^2(\pi y_{i+1})] + (y_d - 1)^2\} + \sum_{i=1}^{d} u(x_i, 10, 100, 4)$, where $y_i = 1 + \frac{x_i + 1}{4}$ and $u(x_i, a, k, m) = k(x_i - a)^m$ if $x_i > a$; $0$ if $-a \le x_i \le a$; $k(-x_i - a)^m$ if $x_i < -a$; range $[-50, 50]$; $f_{\min} = 0$; accuracy $1 \times 10^{-2}$.

characteristics of the 10 benchmark functions are shown in Table 2.

In order to show the optimization effect of the algorithm more clearly, this paper selects two indicators to evaluate the performance of the algorithm [69]. The first is the solution accuracy (AC), which reflects the difference between the optimal result of the algorithm and the theoretical optimal value. In one experiment, if the final convergence result of the algorithm is less than the AC threshold, the optimization is considered successful. Assuming that the optimal value of a certain run is $X_{opt}$ and the theoretical optimal value is $X_{\min}$, the AC is calculated as follows:

$AC = |X_{opt} - X_{\min}|$  (20)

The second is the successful ratio (SR), which reflects the proportion of successful optimizations among the total number of runs, under the condition that the algorithm reaches a certain accuracy. Assume that, in $T$ runs of the algorithm, the number of successful optimizations is $t$; then the SR is calculated as follows:

$SR = \dfrac{t}{T} \times 100\%$  (21)
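Both measures reduce to a few lines of code; a sketch follows, where results is assumed to hold the best values found in the T independent runs (the function names are illustrative).

def solution_accuracy(x_opt, x_min):
    """Solution accuracy AC, equation (20)."""
    return abs(x_opt - x_min)

def successful_ratio(results, x_min, threshold):
    """Successful ratio SR in percent, equation (21): the share of
    runs whose accuracy falls below the required threshold."""
    t = sum(1 for r in results if solution_accuracy(r, x_min) < threshold)
    return 100.0 * t / len(results)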

5.2. Results on Classical Benchmark Functions. In this section, a set of simulation tests was performed in order to verify the effectiveness of HGGWA. The 10 high-dimensional benchmark functions in Table 2 are optimized by the HGGWA algorithm proposed in this paper, and the results are compared with WOA, SSA, and ALO. In addition, the running times of several algorithms are compared under a given convergence accuracy.

5.2.1. Parameter Settings for the Compared Algorithms. In all experiments, the values of the common parameters, such as the maximum number of iterations ($Max_{iter}$), the dimension of the functions (D), and the population size (N), were chosen the same. For all test problems, we focus on investigating the optimization performance of the proposed method on problems with D = 100, 500, and 1000. The maximum number of iterations is set to 1000, and the population size is set to 50. The parameter settings of WOA, SSA, and ALO are derived from the original literature [70–72]. The optimal parameter settings of the HGGWA algorithm proposed in this paper are shown in Table 3.

For each experiment of an algorithm on a benchmark function, 30 independent runs are performed to obtain a


Table 3: Parameter settings of the HGGWA.

Parameter | Definition | Value
N | Population size | 50
P_c | Crossover probability | 0.8
P_m | Mutation probability | 0.01
Max_iter | Maximum number of iterations | 1000
k | Nonlinear adjustment coefficient | 0.5

fair comparison among the different algorithms. The optimal value, the worst value, the average value, and the standard deviation of each optimization are recorded, and then the optimization successful ratio (SR) is calculated. All programs were coded in Matlab 2017b (Win64) and executed on a Lenovo computer with an Intel(R) Core i5-6300HQ CPU at 2.30 GHz and 8 GB of memory under the Windows 10 operating system.

5.2.2. Comparison with State-of-the-Art Metaheuristic Algorithms. In order to verify the efficiency of the HGGWA algorithm, several recently proposed state-of-the-art metaheuristic algorithms are compared with it: the Whale Optimization Algorithm (WOA), the Salp Swarm Algorithm (SSA), and the Ant Lion Optimization Algorithm (ALO). The test was performed using the same 10 high-dimensional functions in Table 2, and the results of the comparison are shown in Table 4.

From Table 4, it can be seen that the convergence accuracy and optimization successful ratio of the HGGWA algorithm are significantly higher than those of the other three algorithms for most of the test functions. In the 100-dimensional case, the HGGWA algorithm converges to the global optimal solution for all test functions other than f3, and, under the given convergence accuracy, it achieves a successful ratio of 100% for those nine functions. The convergence results of the WOA algorithm are better than those of the other three algorithms for functions f1 and f2, and its robustness is excellent. As the function dimension increases, the convergence precision of the several algorithms decreases slightly, but the optimization results of the HGGWA algorithm remain better than those of WOA, SSA, and ALO. In general, the HGGWA algorithm exhibits better optimization effects than the WOA, SSA, and ALO algorithms on the 100-, 500-, and 1000-dimensional test functions, which proves the effectiveness of the HGGWA algorithm for solving high-dimensional complex functions.

Generally speaking, the HGGWA algorithm performs better than the other algorithms on most test functions. In order to compare the convergence performance of the four algorithms more clearly, the convergence curves of the four algorithms on the Sphere, Schwefel's 2.22, Rastrigin, Griewank, Quartic, and Ackley functions with D = 100 are shown in Figure 2. Sphere and Schwefel's 2.22 represent the two most basic unimodal functions; Rastrigin and Griewank represent the two most basic multimodal functions. It is seen that HGGWA has higher precision and faster convergence speed than the other algorithms.

5.2.3. Comparative Analysis of Running Time. To evaluate the actual runtime of the several compared algorithms, including SSA, WOA, GWO, and HGGWA, their average running times (in seconds, s) from 30 runs on functions f1, f2, f6, f7, f8, f9, and f10 are plotted in Figure 3. The convergence accuracy of each test function is described in Table 2.

Figure 3 shows the convergence behavior of the different algorithms; each point on the plot was calculated by taking the average of 30 independent runs. As can be seen from Figure 3, the several algorithms behave differently under the same calculation accuracy. For the functions f1, f2, f7, f8, and f9, the GWO algorithm has the fastest convergence rate, a result of the advantages of the GWO algorithm itself. However, for the test functions f6 and f10, the running time of the HGGWA algorithm is the shortest. Further analysis shows that the HGGWA algorithm adds a little computing time compared to the GWO algorithm because of the several improved strategies, such as selection, crossover, and mutation. However, the convergence speed of the HGGWA algorithm is still better than that of the SSA and WOA algorithms. Overall, the convergence speed of the HGGWA algorithm performs well among the compared algorithms. Moreover, its running time varies little across the different test functions and stays between 1 s and 3 s, which shows that the HGGWA algorithm has excellent stability.

5.3. Results on CEC'2008 Benchmark Functions. This section provides an analysis of the effectiveness of HGGWA in terms of the CEC'2008 large-scale benchmark functions and a comparison with four powerful algorithms which have been proven effective in solving large-scale optimization problems. Experimental results are provided to analyze the performance of HGGWA on large-scale optimization problems.

We tested our proposed algorithm on the CEC'2008 functions for 100, 500, and 1000 dimensions, and the means of the best fitness values over 50 runs were recorded. While F1 (Shifted Sphere), F4 (Shifted Rastrigin), and F6 (Shifted Ackley) are separable functions, F2 (Schwefel Problem), F3 (Shifted Rosenbrock), F5 (Shifted Griewank), and F7 (Fast Fractal) are nonseparable, presenting a greater challenge to any algorithm that is prone to variable interactions. The performance of HGGWA is compared with the other algorithms, CCPSO2 [31], DEwSAcc [5], MLCC [38], and EPUS-PSO [49], for 100, 500, and 1000 dimensions, respectively. The maximum number of fitness evaluations (FEs) was calculated by the formula FEs = 5000 × D, where D is the number of dimensions. To ensure the fairness of the comparison, the optimization results of the other four algorithms are taken directly from the original literature. The specific comparison results are shown in Table 5.

Table 5 shows the experimental results, and the entries shown in bold are significantly better than those of the other algorithms. A general trend that can be seen is that the HGGWA algorithm has a promising optimization effect for most of the 7 functions.

Generally speaking, HGGWA outperforms CCPSO2 on 6 out of 7 functions with 100 and 500 dimensions, respectively. Among them, the result of HGGWA is slightly worse than that of CCPSO2 for function F7 in the 100-dimensional case.


Table 4: Comparison results of 10 high-dimensional benchmark functions by WOA, SSA, ALO, and HGGWA. Each cell gives Mean / Std / SR (%).

f1, D=100:  WOA 5.77E-96 / 4.69E-96 / 100; SSA 1.01E-02 / 1.24E-02 / 0; ALO 3.04E-01 / 2.31E-01 / 0; HGGWA 4.85E-53 / 2.41E-52 / 100
f1, D=500:  WOA 7.24E-93 / 2.98E-94 / 100; SSA 3.24E+04 / 2.18E+04 / 0; ALO 7.34E+04 / 1.32E+04 / 0; HGGWA 8.35E-25 / 8.86E-25 / 100
f1, D=1000: WOA 2.05E-89 / 3.16E-90 / 100; SSA 1.19E+05 / 1.04E+05 / 0; ALO 2.48E+05 / 2.12E+05 / 0; HGGWA 1.75E-17 / 2.73E-17 / 100

f2, D=100:  WOA 1.89E-99 / 2.33E-99 / 100; SSA 8.85E+00 / 2.37E+00 / 0; ALO 4.29E+02 / 3.25E+02 / 0; HGGWA 4.01E-33 / 4.34E-33 / 100
f2, D=500:  WOA 1.38E-99 / 2.16E-99 / 100; SSA 3.37E+02 / 2.17E+02 / 0; ALO 2.31E+03 / 2.03E+03 / 0; HGGWA 9.58E-18 / 2.36E-18 / 100
f2, D=1000: WOA 1.80E-99 / 1.37E-98 / 100; SSA 8.45E+02 / 3.43E+02 / 0; ALO 3.25E+04 / 4.13E+04 / 0; HGGWA 2.50E-11 / 1.66E-11 / 100

f3, D=100:  WOA 3.84E+01 / 2.17E+00 / 0; SSA 2.22E+01 / 1.05E+01 / 0; ALO 2.47E+01 / 1.34E+01 / 0; HGGWA 4.55E+01 / 3.69E+01 / 0
f3, D=500:  WOA 9.32E+01 / 5.32E+01 / 0; SSA 3.27E+01 / 2.33E+01 / 0; ALO 3.92E+01 / 2.34E+01 / 0; HGGWA 7.22E+01 / 5.01E+00 / 0
f3, D=1000: WOA 9.86E+01 / 3.24E+01 / 0; SSA 3.31E+01 / 1.28E+01 / 0; ALO 4.93E+01 / 3.99E+01 / 0; HGGWA 8.31E+01 / 4.36E+00 / 0

f4, D=100:  WOA 9.67E+01 / 3.98E+01 / 0; SSA 9.00E+02 / 5.16E+02 / 0; ALO 1.40E+03 / 1.03E+03 / 0; HGGWA 6.82E+01 / 1.25E-01 / 100
f4, D=500:  WOA 4.95E+02 / 3.14E+01 / 0; SSA 7.18E+06 / 1.46E+06 / 0; ALO 2.09E+07 / 1.37E+07 / 0; HGGWA 3.67E+02 / 8.23E-01 / 0
f4, D=1000: WOA 9.91E+02 / 2.48E+02 / 0; SSA 2.98E+07 / 1.35E+07 / 0; ALO 1.52E+08 / 1.00E+07 / 0; HGGWA 9.64E+02 / 3.74E+01 / 0

f5, D=100:  WOA 7.83E+05 / 2.36E+05 / 0; SSA 2.17E+04 / 1.03E+04 / 0; ALO 3.60E+04 / 2.16E+04 / 0; HGGWA 3.45E-05 / 2.77E-05 / 100
f5, D=500:  WOA 1.98E+07 / 1.32E+07 / 0; SSA 4.29E+05 / 3.16E+05 / 0; ALO 1.18E+06 / 1.03E+06 / 0; HGGWA 2.67E-03 / 4.38E-03 / 80
f5, D=1000: WOA 6.68E+07 / 3.44E+07 / 0; SSA 2.48E+06 / 1.23E+06 / 0; ALO 3.72E+06 / 3.49E+06 / 0; HGGWA 8.33E+00 / 5.12E+00 / 0

f6, D=100:  WOA 2.65E-04 / 1.43E-04 / 100; SSA 1.12E+00 / 1.03E+00 / 0; ALO 8.28E-01 / 2.68E-01 / 0; HGGWA 2.34E-09 / 3.17E-10 / 100
f6, D=500:  WOA 3.94E-04 / 3.22E-04 / 60; SSA 4.97E+01 / 3.66E+01 / 0; ALO 1.13E+02 / 1.01E+02 / 0; HGGWA 5.73E-05 / 4.33E-06 / 100
f6, D=1000: WOA 2.16E-03 / 1.36E-03 / 0; SSA 3.93E+02 / 2.78E+02 / 0; ALO 1.77E+03 / 2.33E+03 / 0; HGGWA 1.08E-04 / 1.62E-03 / 80

f7, D=100:  WOA 0.00E+00 / 0.00E+00 / 100; SSA 6.27E+01 / 4.36E+01 / 0; ALO 3.54E+02 / 3.32E+02 / 0; HGGWA 0.00E+00 / 0.00E+00 / 100
f7, D=500:  WOA 0.00E+00 / 0.00E+00 / 100; SSA 2.04E+03 / 1.97E+03 / 0; ALO 3.34E+03 / 4.56E+03 / 0; HGGWA 0.00E+00 / 0.00E+00 / 100
f7, D=1000: WOA 0.00E+00 / 0.00E+00 / 100; SSA 5.69E+03 / 1.66E+03 / 0; ALO 7.19E+03 / 2.14E+03 / 0; HGGWA 5.57E-11 / 3.15E-12 / 100

f8, D=100:  WOA 4.44E-15 / 3.12E-15 / 100; SSA 5.15E+00 / 4.13E+00 / 0; ALO 4.93E+00 / 3.49E+00 / 0; HGGWA 2.15E-15 / 3.48E-15 / 100
f8, D=500:  WOA 4.44E-15 / 3.05E-15 / 100; SSA 1.26E+01 / 1.00E+01 / 0; ALO 1.28E+01 / 2.28E+01 / 0; HGGWA 8.74E-12 / 5.39E-12 / 100
f8, D=1000: WOA 8.88E-16 / 2.88E-15 / 100; SSA 1.28E+01 / 1.44E+01 / 0; ALO 1.46E+01 / 2.03E+01 / 0; HGGWA 1.59E-07 / 1.63E-07 / 100

f9, D=100:  WOA 0.00E+00 / 0.00E+00 / 100; SSA 1.52E-01 / 2.44E-01 / 0; ALO 4.54E-01 / 3.16E-01 / 0; HGGWA 0.00E+00 / 0.00E+00 / 100
f9, D=500:  WOA 0.00E+00 / 0.00E+00 / 100; SSA 2.82E+02 / 2.13E+02 / 0; ALO 3.92E+02 / 1.08E+02 / 0; HGGWA 1.84E-16 / 8.44E-17 / 100
f9, D=1000: WOA 0.00E+00 / 0.00E+00 / 100; SSA 1.10E+03 / 1.34E+03 / 0; ALO 3.61E+03 / 1.17E+03 / 0; HGGWA 2.31E-13 / 3.29E-14 / 100

f10, D=100:  WOA 8.16E-03 / 4.32E-03 / 100; SSA 1.10E+01 / 2.19E+01 / 0; ALO 1.74E+01 / 1.33E+01 / 0; HGGWA 2.73E-07 / 3.11E-07 / 100
f10, D=500:  WOA 2.52E-02 / 1.99E-02 / 80; SSA 5.16E+01 / 3.33E+01 / 0; ALO 3.97E+05 / 2.96E+05 / 0; HGGWA 8.12E-05 / 4.98E-06 / 100
f10, D=1000: WOA 1.43E-02 / 1.03E-02 / 80; SSA 1.00E+05 / 2.13E+04 / 0; ALO 6.96E+06 / 1.34E+06 / 0; HGGWA 6.33E-04 / 3.77E-03 / 100


[Figure 2 appears here in the original: six semilogarithmic convergence plots (objective function value versus number of iterations, 0-1000) comparing WOA, SSA, ALO, and HGGWA on (a) the Sphere function, (b) the Schwefel 2.22 function, (c) the Rastrigin function, (d) the Griewank function, (e) the Quartic function, and (f) the Ackley function.]

Figure 2: Convergence curves of WOA, SSA, ALO, and HGGWA algorithms.

[Figure 3 appears here in the original: two grouped bar charts over the benchmark functions f1, f2, f6, f7, f8, f9, and f10 (D = 100). The values recoverable from the chart labels are:]

(a) Comparative analysis of running time (average running times, in seconds):

| Function | SSA | WOA | GWO | HGGWA |
|---|---|---|---|---|
| f1 | 4.59 | 3.44 | 0.53 | 1.30 |
| f2 | 3.88 | 2.39 | 0.72 | 1.86 |
| f6 | 5.37 | 3.66 | 3.17 | 2.85 |
| f7 | 4.23 | 2.33 | 1.22 | 2.16 |
| f8 | 3.99 | 2.17 | 0.57 | 1.39 |
| f9 | 2.59 | 2.18 | 0.69 | 1.21 |
| f10 | 4.63 | 3.33 | 3.52 | 1.95 |

(b) Comparative analysis of iterations (average numbers of iterations):

| Function | SSA | WOA | GWO | HGGWA |
|---|---|---|---|---|
| f1 | 302 | 233 | 198 | 221 |
| f2 | 334 | 290 | 263 | 237 |
| f6 | 723 | 614 | 491 | 407 |
| f7 | 398 | 258 | 365 | 312 |
| f8 | 263 | 234 | 201 | 175 |
| f9 | 214 | 185 | 166 | 149 |
| f10 | 186 | 85 | 298 | 132 |

Figure 3: The average running times and iterations of SSA, WOA, GWO, and HGGWA on different test functions.


Table 5: Comparison results of 7 CEC'2008 benchmark functions by five algorithms (each cell gives Mean / Std; the best results were highlighted in bold in the original).

| Function | Dim | CCPSO2 | DEwSAcc | MLCC | EPUS-PSO | HGGWA |
|---|---|---|---|---|---|---|
| F1 | 100 | 7.73E-14 / 3.23E-14 | 5.68E-14 / 0.00E+00 | 6.82E-14 / 2.32E-14 | 7.47E-01 / 1.70E-01 | 3.06E-17 / 2.93E-17 |
| F1 | 500 | 3.00E-13 / 7.96E-14 | 2.09E-09 / 4.61E-09 | 4.29E-13 / 3.31E-14 | 8.45E+01 / 6.40E+00 | 2.75E-15 / 2.13E-15 |
| F1 | 1000 | 5.18E-13 / 9.61E-14 | 8.78E-03 / 5.27E-03 | 8.45E-13 / 5.00E-14 | 5.53E+02 / 2.86E+01 | 1.06E-10 / 1.83E-11 |
| F2 | 100 | 6.08E+00 / 7.83E+00 | 8.25E+00 / 5.32E+00 | 2.52E+01 / 8.72E+00 | 1.86E+01 / 2.26E+00 | 2.71E+00 / 2.39E+00 |
| F2 | 500 | 5.79E+01 / 4.21E+01 | 7.57E+01 / 3.04E+00 | 6.66E+01 / 5.69E+00 | 4.35E+01 / 5.51E-01 | 3.12E+01 / 4.91E+00 |
| F2 | 1000 | 7.82E+01 / 4.25E+01 | 9.60E+01 / 1.81E+00 | 1.08E+02 / 4.75E+00 | 4.66E+01 / 4.00E-01 | 2.71E+01 / 5.66E+00 |
| F3 | 100 | 4.23E+02 / 8.65E+02 | 1.45E+02 / 5.84E+01 | 1.49E+02 / 5.72E+01 | 4.99E+03 / 5.35E+03 | 9.81E+01 / 2.25E-01 |
| F3 | 500 | 7.24E+02 / 1.54E+02 | 1.81E+03 / 2.74E+02 | 9.24E+02 / 1.72E+02 | 5.77E+04 / 8.04E+03 | 4.96E+02 / 6.23E-01 |
| F3 | 1000 | 1.33E+03 / 2.63E+02 | 9.14E+03 / 1.26E+03 | 1.79E+03 / 1.58E+02 | 8.37E+05 / 1.52E+05 | 9.97E+02 / 3.58E+01 |
| F4 | 100 | 3.98E-02 / 1.99E-01 | 4.37E+00 / 7.65E+00 | 4.38E-13 / 9.21E-14 | 4.71E+02 / 5.94E+01 | 0.00E+00 / 0.00E+00 |
| F4 | 500 | 3.98E-02 / 1.99E-01 | 3.64E+02 / 5.23E+01 | 1.79E-11 / 6.31E-11 | 3.49E+03 / 1.12E+02 | 0.00E+00 / 0.00E+00 |
| F4 | 1000 | 1.99E-01 / 4.06E-01 | 1.82E+03 / 1.37E+02 | 1.37E-10 / 3.37E-10 | 7.58E+03 / 1.51E+02 | 4.93E-11 / 2.87E-12 |
| F5 | 100 | 3.45E-03 / 4.88E-03 | 3.07E-14 / 7.86E-15 | 3.41E-14 / 1.16E-14 | 3.72E-01 / 5.60E-02 | 0.00E+00 / 0.00E+00 |
| F5 | 500 | 1.18E-03 / 4.61E-03 | 6.90E-04 / 2.41E-03 | 2.12E-13 / 2.47E-14 | 1.64E+00 / 4.69E-02 | 2.16E-16 / 7.56E-17 |
| F5 | 1000 | 1.18E-03 / 3.27E-03 | 3.58E-03 / 5.73E-03 | 4.18E-13 / 2.78E-14 | 5.89E+00 / 3.91E-01 | 5.49E-13 / 2.35E-14 |
| F6 | 100 | 1.44E-13 / 3.06E-14 | 1.13E-13 / 1.53E-14 | 1.11E-13 / 7.86E-15 | 2.06E+00 / 4.40E-01 | 3.21E-15 / 2.48E-15 |
| F6 | 500 | 5.34E-13 / 8.61E-14 | 4.80E-01 / 5.73E-01 | 5.34E-13 / 7.01E-14 | 6.64E+00 / 4.49E-01 | 6.74E-12 / 4.39E-12 |
| F6 | 1000 | 1.02E-12 / 1.68E-13 | 2.29E+00 / 2.98E-01 | 1.06E-12 / 7.68E-14 | 1.89E+01 / 2.49E+00 | 2.39E-07 / 2.13E-07 |
| F7 | 100 | -1.50E+03 / 1.04E+01 | -1.37E+03 / 2.46E+01 | -1.54E+03 / 2.52E+00 | -8.55E+02 / 1.35E+01 | -7.00E+02 / 6.64E-01 |
| F7 | 500 | -7.23E+03 / 4.16E+01 | -5.74E+03 / 1.83E+02 | -7.43E+03 / 8.03E+00 | -3.51E+03 / 2.10E+01 | -8.83E+03 / 3.54E+00 |
| F7 | 1000 | -1.43E+04 / 8.27E+01 | -1.05E+04 / 4.18E+02 | -1.47E+04 / 1.51E+01 | -6.62E+03 / 3.18E+01 | -3.69E+04 / 2.16E+02 |


In addition, at first glance it might seem that HGGWA and MLCC have similar performance. However, HGGWA achieves better convergence accuracy than MLCC under the same number of iterations. In particular, the optimization results of the HGGWA algorithm are superior to those of DEwSAcc and EPUS-PSO in all dimensions. The results of HGGWA scaled very well from 100-D to 1000-D on F1, F4, F5, and F6, outperforming DEwSAcc and EPUS-PSO in all cases.

5.4. Sensitivity Analysis of the Nonlinear Adjustment Coefficient k. This section discusses the effect of the nonlinear adjustment coefficient on the convergence performance of the algorithm. In the HGGWA algorithm, the role of the parameter a is to balance the global exploration and local exploitation capabilities. In (12), the nonlinear adjustment coefficient k is a key parameter, mainly used to control the range of variation of the convergence factor. Therefore, we selected four different values for numerical experiments to analyze the effect of the nonlinear adjustment coefficient on the performance of the algorithm. The four values are 0.5, 1, 1.5, and 2, and the results of the comparison are shown in Table 6; bold entries represent the best result among the four settings.

In general, the value of the nonlinear adjustment coefficient k has no significant effect on the performance of the HGGWA algorithm. On closer inspection, one can see that, except for the function f3, the optimization performance of the algorithm is best when the nonlinear adjustment coefficient k is 0.5, while for function f3, k = 1.5 is the best choice. For functions f7 and f9, the HGGWA algorithm obtains the optimal solution 0 for all of the tested values of the coefficient k. Therefore, for most test functions, the optimal value of the nonlinear adjustment coefficient k is 0.5.

5.5. Global Convergence Analysis of HGGWA. In the HGGWA algorithm, the following four improved strategies are used to ensure the global convergence of the algorithm: (1) an initial population strategy based on Opposition-Based Learning; (2) a selection operation based on optimal retention; (3) a crossover operation based on the population partitioning mechanism; and (4) a mutation operation for elite individuals.

The idea of solving high-dimensional function optimization problems with the HGGWA algorithm is as follows. Firstly, consider the initialization strategy: the basic GWO generates the initial population in a random manner, which can greatly affect the search efficiency of the algorithm on high-dimensional function optimization problems, because the algorithm cannot effectively search the entire solution space if the initial grey wolf population is clustered in a small region. Therefore, the Opposition-Based Learning strategy is used to initialize the population, which makes the initial population evenly distributed in the solution space and lays a good foundation for the global search of the algorithm. Secondly, the optimal retention strategy is used to preserve the optimal individual of the parent population during population evolution. As a result, the next generation of the population can evolve in the optimal direction. In addition, the individuals with higher fitness are selected by the roulette-wheel operation, which maintains the dominance relationship between individuals in the population; this dominance relationship gives the algorithm good global convergence. Thirdly, the crossover operation is performed under population partitioning, in order to achieve dimensionality reduction and maintain the diversity of the population. Finally, the search direction of the grey wolf population is guided by the mutation operation on the elite individuals, which effectively prevents the algorithm from falling into a locally optimal solution. All in all, through these four improved strategies, the HGGWA algorithm shows good global convergence in solving high-dimensional function optimization problems.

6. Conclusion

In order to overcome the shortcoming of the GWO algorithm that it easily falls into local optima when solving large-scale global optimization problems, this paper proposes a Hybrid Genetic Grey Wolf Algorithm (HGGWA) by integrating three genetic operators into the algorithm. The improved algorithm initializes the population based on the Opposition-Based Learning strategy to improve the search efficiency of the algorithm. The strategy of population division based on cooperative coevolution reduces the scale of the problem, and the mutation operation on the elite individuals effectively prevents the algorithm from falling into local optima. The performance of HGGWA has been evaluated using 10 classical benchmark functions and 7 CEC'2008 high-dimensional functions. From our experimental results, several conclusions can be drawn.

The results have shown that it is capable of grouping interacting variables with great accuracy for the majority of the benchmark functions. A comparative study between HGGWA and other state-of-the-art algorithms was conducted, and the experimental results showed that the HGGWA algorithm exhibits better global convergence, whether solving separable or nonseparable functions.

(1) In the tests on 10 classical benchmark functions, compared with the results of WOA, SSA, and ALO, the HGGWA algorithm greatly improves the convergence accuracy. Apart from the Schwefel problem and the Rosenbrock function, the HGGWA algorithm achieves an 80%-100% success ratio on all benchmark functions with 100 dimensions, which proves the effectiveness of HGGWA for solving high-dimensional functions.

(2) Comparing the optimization results on the seven CEC'2008 large-scale global optimization problems, HGGWA achieves the global optimal value for the separable functions F1, F4, and F6, but the effect on the other, nonseparable problems is not very satisfactory. However, the HGGWA algorithm still exhibits better global convergence than the other algorithms, CCPSO2, DEwSAcc, MLCC, and EPUS-PSO, on most of the functions.


Table 6: Comparison of the optimization performance of different k values for HGGWA (each cell gives Mean / Std; the best results were highlighted in bold in the original).

| Function | k = 0.5 | k = 1 | k = 1.5 | k = 2 |
|---|---|---|---|---|
| f1 | 8.43E-60 / 5.36E-60 | 2.25E-54 / 2.03E-54 | 5.58E-53 / 4.31E-53 | 2.31E-51 / 3.16E-51 |
| f2 | 2.89E-34 / 2.31E-34 | 2.06E-33 / 1.89E-33 | 4.70E-32 / 3.65E-32 | 5.76E-30 / 4.22E-30 |
| f3 | 1.11E+01 / 1.02E+00 | 3.14E+01 / 2.19E+01 | 4.03E-01 / 2.38E-01 | 3.60E+01 / 2.16E+01 |
| f4 | 9.78E+01 / 8.36E+01 | 9.82E+01 / 7.43E+01 | 9.81E+01 / 7.34E+01 | 9.83E+01 / 4.61E+01 |
| f5 | 2.18E-03 / 1.34E-03 | 8.87E-02 / 6.67E-02 | 7.81E+00 / 6.13E+00 | 1.74E-02 / 2.34E-02 |
| f6 | 1.31E-03 / 1.13E-03 | 1.62E-03 / 2.43E-04 | 2.99E-03 / 1.04E-03 | 1.84E-03 / 1.34E-03 |
| f7 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00 |
| f8 | 1.58E-14 / 1.23E-15 | 2.93E-14 / 1.37E-14 | 2.22E-14 / 1.49E-14 | 2.51E-14 / 1.08E-14 |
| f9 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00 |
| f10 | 2.99E-02 / 2.49E-02 | 3.38E-02 / 1.24E-03 | 8.42E-02 / 6.14E-02 | 7.29E-02 / 8.13E-03 |


(3) An analysis of the running times of the several algorithms shows that the convergence time of the HGGWA algorithm has obvious advantages over recently proposed algorithms such as SSA and WOA. For the functions f1, f2, f6, f7, f8, f9, and f10, the running time of the HGGWA algorithm stays between 1 s and 3 s under the specified convergence accuracy, which shows the excellent stability of HGGWA.

(4) In the future, we plan to investigate more efficient population partitioning mechanisms to adapt the crossover operation to different nonseparable problems. We are also interested in applying HGGWA to real-world problems to ascertain its true potential as a valuable optimization technique for large-scale optimization, such as the setup of large-scale multilayer sensor networks [73] in the Internet of Things.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Authors' Contributions

Qinghua Gu and Xuexian Li contributed equally to this work.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Grant Nos. 51774228 and 51404182), the Natural Science Foundation of Shaanxi Province (2017JM5043), and the Foundation of the Shaanxi Educational Committee (17JK0425).

References

[1] R. Zhang and C. Wu, "A hybrid approach to large-scale job shop scheduling," Applied Intelligence, vol. 32, no. 1, pp. 47-59, 2010.
[2] M. Gendreau and C. D. Tarantilis, Solving Large-Scale Vehicle Routing Problems with Time Windows: The State-of-the-Art, Cirrelt, Montreal, 2010.
[3] M. B. Liu, S. K. Tso, and Y. Cheng, "An extended nonlinear primal-dual interior-point algorithm for reactive-power optimization of large-scale power systems with discrete control variables," IEEE Power Engineering Review, vol. 22, no. 9, p. 56, 2002.
[4] I. Fister, I. Fister Jr., J. Brest, and V. Zumer, "Memetic artificial bee colony algorithm for large-scale global optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '12), pp. 1-8, IEEE, Brisbane, Australia, June 2012.
[5] A. Zamuda, J. Brest, B. Bosovic, and V. Zumer, "Large scale global optimization using differential evolution with self-adaptation and cooperative co-evolution," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), IEEE, June 2008.
[6] T. Weise, R. Chiong, and K. Tang, "Evolutionary optimization: pitfalls and booby traps," Journal of Computer Science and Technology, vol. 27, no. 5, pp. 907-936, 2012.
[7] L. Lafleur, Descartes: Discourse on Method, Pearson Schweiz AG, Zug, Switzerland, 1956.
[8] Y.-S. Ong, M. H. Lim, and X. Chen, "Memetic computation: past, present & future [Research Frontier]," IEEE Computational Intelligence Magazine, vol. 5, no. 2, pp. 24-31, 2010.
[9] N. Krasnogor and J. Smith, "A tutorial for competent memetic algorithms: model, taxonomy and design issues," IEEE Transactions on Evolutionary Computation, vol. 9, no. 5, pp. 474-488, 2005.
[10] S. Jiang, M. Lian, C. Lu, Q. Gu, S. Ruan, and X. Xie, "Ensemble prediction algorithm of anomaly monitoring based on big data analysis platform of open-pit mine slope," Complexity, vol. 2018, Article ID 1048756, 13 pages, 2018.
[11] R. S. Rahnamayan, H. R. Tizhoosh, and M. M. A. Salama, "Opposition-based differential evolution," IEEE Transactions on Evolutionary Computation, vol. 12, no. 1, pp. 64-79, 2008.
[12] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46-61, 2014.
[13] E. Daniel, J. Anitha, K. K. Kamaleshwaran, and I. Rani, "Optimum spectrum mask based medical image fusion using gray wolf optimization," Biomedical Signal Processing and Control, vol. 34, pp. 36-43, 2017.
[14] A. A. M. El-Gaafary, Y. S. Mohamed, A. M. Hemeida, and A.-A. A. Mohamed, "Grey wolf optimization for multi input multi output system," Universal Journal of Communications and Network, vol. 3, no. 1, pp. 1-6, 2015.
[15] G. M. Komaki and V. Kayvanfar, "Grey wolf optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109-120, 2015.
[16] S. Jiang, M. Lian, C. Lu et al., "SVM-DS fusion based soft fault detection and diagnosis in solar water heaters," Energy Exploration & Exploitation, 2018.
[17] Q. Gu, X. Li, C. Lu et al., "Hybrid genetic grey wolf algorithm for high dimensional complex function optimization," Control and Decision, pp. 1-8, 2019.
[18] M. A. Potter and K. A. D. Jong, "A cooperative coevolutionary approach to function optimization," in Parallel Problem Solving from Nature - PPSN III, vol. 866 of Lecture Notes in Computer Science, pp. 249-257, Springer, Berlin, Germany, 1994.
[19] M. A. Potter, The Design and Analysis of a Computational Model of Cooperative Coevolution, George Mason University, Fairfax, Va, USA, 1997.
[20] F. van den Bergh and A. P. Engelbrecht, "A cooperative approach to particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225-239, 2004.
[21] M. El-Abd, "A cooperative approach to the artificial bee colony algorithm," in Proceedings of the 2010 IEEE World Congress on Computational Intelligence (WCCI 2010) / IEEE Congress on Evolutionary Computation (CEC 2010), IEEE, Spain, July 2010.
[22] Y. Shi, H. Teng, and Z. Li, "Cooperative co-evolutionary differential evolution for function optimization," in Proceedings of the 1st International Conference on Natural Computation (ICNC '05), Lecture Notes in Computer Science, pp. 1080-1088, August 2005.
[23] Z. Yang, K. Tang, and X. Yao, "Large scale evolutionary optimization using cooperative coevolution," Information Sciences, vol. 178, no. 15, pp. 2985-2999, 2008.
[24] X. Li and X. Yao, "Tackling high dimensional nonseparable optimization problems by cooperatively coevolving particle swarms," in Proceedings of the Congress on Evolutionary Computation (CEC '09), pp. 1546-1553, IEEE, Trondheim, Norway, May 2009.
[25] M. N. Omidvar, X. Li, Z. Yang, and X. Yao, "Cooperative co-evolution for large scale optimization through more frequent random grouping," in Proceedings of the 2010 IEEE Congress on Evolutionary Computation (CEC 2010), Spain, July 2010.
[26] T. Ray and X. Yao, "A cooperative coevolutionary algorithm with correlation based adaptive variable partitioning," in Proceedings of the 2009 IEEE Congress on Evolutionary Computation (CEC), pp. 983-989, IEEE, Trondheim, Norway, May 2009.
[27] W. Chen, T. Weise, Z. Yang, and K. Tang, "Large-scale global optimization using cooperative coevolution with variable interaction learning," in Lecture Notes in Computer Science, vol. 6239, pp. 300-309, 2010.
[28] M. N. Omidvar, X. Li, Y. Mei, and X. Yao, "Cooperative co-evolution with differential grouping for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 18, no. 3, pp. 378-393, 2014.
[29] M. N. Omidvar, X. Li, and X. Yao, "Smart use of computational resources based on contribution for cooperative co-evolutionary algorithms," in Proceedings of the 13th Annual Genetic and Evolutionary Computation Conference (GECCO '11), pp. 1115-1122, IEEE, Ireland, July 2011.
[30] R. Storn, "On the usage of differential evolution for function optimization," in Proceedings of the Biennial Conference of the North American Fuzzy Information Processing Society (NAFIPS '96), pp. 519-523, June 1996.
[31] X. Li and X. Yao, "Cooperatively coevolving particle swarms for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 16, no. 2, pp. 210-224, 2012.
[32] K. Weicker and N. Weicker, "On the improvement of coevolutionary optimizers by learning variable interdependencies," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC 1999), pp. 1627-1632, IEEE, July 1999.
[33] E. Sayed, D. Essam, and R. Sarker, "Using hybrid dependency identification with a memetic algorithm for large scale optimization problems," in Lecture Notes in Computer Science, vol. 7673, pp. 168-177, 2012.
[34] Y. Liu, X. Yao, Q. Zhao, and T. Higuchi, "Scaling up fast evolutionary programming with cooperative coevolution," in Proceedings of the 2001 Congress on Evolutionary Computation, pp. 1101-1108, IEEE, Seoul, South Korea, 2001.
[35] E. Sayed, D. Essam, and R. Sarker, "Dependency identification technique for large scale optimization problems," in Proceedings of the 2012 IEEE Congress on Evolutionary Computation (CEC 2012), Australia, June 2012.
[36] Y. Ren and Y. Wu, "An efficient algorithm for high-dimensional function optimization," Soft Computing, vol. 17, no. 6, pp. 995-1004, 2013.
[37] J. Liu and K. Tang, "Scaling up covariance matrix adaptation evolution strategy using cooperative coevolution," in Intelligent Data Engineering and Automated Learning - IDEAL 2013, vol. 8206 of Lecture Notes in Computer Science, pp. 350-357, Springer, Berlin, Germany, 2013.
[38] Z. Yang, K. Tang, and X. Yao, "Multilevel cooperative coevolution for large scale optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), pp. 1663-1670, June 2008.
[39] M. N. Omidvar, Y. Mei, and X. Li, "Effective decomposition of large-scale separable continuous functions for cooperative co-evolutionary algorithms," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC 2014), pp. 1305-1312, China, July 2014.
[40] S. Mahdavi, E. M. Shiri, and S. Rahnamayan, "Cooperative co-evolution with a new decomposition method for large-scale optimization," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC), 2014.
[41] B. Kazimipour, M. N. Omidvar, X. Li, and A. K. Qin, "A sensitivity analysis of contribution-based cooperative co-evolutionary algorithms," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2015), pp. 417-424, Japan, May 2015.
[42] Z. Cao, L. Wang, Y. Shi et al., "An effective cooperative coevolution framework integrating global and local search for large scale optimization problems," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2015), pp. 1986-1993, Japan, May 2015.
[43] X. Peng, K. Liu, and Y. Jin, "A dynamic optimization approach to the design of cooperative co-evolutionary algorithms," Knowledge-Based Systems, vol. 109, pp. 174-186, 2016.
[44] H. K. Singh and T. Ray, "Divide and conquer in coevolution: a difficult balancing act," in Agent-Based Evolutionary Search, vol. 5 of Adaptation, Learning, and Optimization, pp. 117-138, Springer, Berlin, Germany, 2010.
[45] X. Peng and Y. Wu, "Large-scale cooperative co-evolution using niching-based multi-modal optimization and adaptive fast clustering," Swarm and Evolutionary Computation, vol. 35, 2017.
[46] M. Yang, M. N. Omidvar, C. Li et al., "Efficient resource allocation in cooperative co-evolution for large-scale global optimization," IEEE Transactions on Evolutionary Computation, vol. 21, no. 4, pp. 493-505, 2017.
[47] M. N. Omidvar, X. Li, and X. Yao, "Cooperative co-evolution with delta grouping for large scale non-separable function optimization," in Proceedings of the 2010 IEEE Congress on Evolutionary Computation (CEC 2010), pp. 1-8, Barcelona, Spain, July 2010.
[48] X. Peng and Y. Wu, "Enhancing cooperative coevolution with selective multiple populations for large-scale global optimization," Complexity, vol. 2018, Article ID 9267054, 15 pages, 2018.
[49] S.-T. Hsieh, T.-Y. Sun, C.-C. Liu, and S.-J. Tsai, "Solving large scale global optimization using improved particle swarm optimizer," in Proceedings of the 2008 IEEE Congress on Evolutionary Computation (CEC 2008), pp. 1777-1784, China, June 2008.
[50] T. Hatanaka, T. Korenaga, N. Kondo et al., Search Performance Improvement for PSO in High Dimensional Space, InTech, 2009.
[51] J. Fan, J. Wang, and M. Han, "Cooperative coevolution for large-scale optimization based on kernel fuzzy clustering and variable trust region methods," IEEE Transactions on Fuzzy Systems, vol. 22, no. 4, pp. 829-839, 2014.
[52] A.-R. Hedar and A. F. Ali, "Genetic algorithm with population partitioning and space reduction for high dimensional problems," in Proceedings of the 2009 International Conference on Computer Engineering and Systems (ICCES '09), pp. 151-156, Egypt, December 2009.
[53] J. Q. Zhang and A. C. Sanderson, "JADE: adaptive differential evolution with optional external archive," IEEE Transactions on Evolutionary Computation, vol. 13, no. 5, pp. 945-958, 2009.
[54] L. M. Hvattum and F. Glover, "Finding local optima of high-dimensional functions using direct search methods," European Journal of Operational Research, vol. 195, no. 1, pp. 31-45, 2009.
[55] C. Liu and B. Li, "Memetic algorithm with adaptive local search depth for large scale global optimization," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC 2014), pp. 82-88, China, July 2014.
[56] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257-1263, 2015.
[57] A. Zhu, C. Xu, Z. Li, J. Wu, and Z. Liu, "Hybridizing grey wolf optimization with differential evolution for global optimization and test scheduling for 3D stacked SoC," Journal of Systems Engineering and Electronics, vol. 26, no. 2, pp. 317-328, 2015.
[58] V. Chahar and D. Kumar, "An astrophysics-inspired grey wolf algorithm for numerical optimization and its application to engineering design problems," Advances in Engineering Software, vol. 112, pp. 231-254, 2017.
[59] D. Guha, P. K. Roy, and S. Banerjee, "Load frequency control of large scale power system using quasi-oppositional grey wolf optimization algorithm," Engineering Science and Technology, an International Journal, vol. 19, no. 4, pp. 1693-1713, 2016.
[60] C. Lu, S. Xiao, X. Li, and L. Gao, "An effective multi-objective discrete grey wolf optimizer for a real-world scheduling problem in welding production," Advances in Engineering Software, vol. 99, pp. 161-176, 2016.
[61] E. Emary, H. M. Zawbaa, and A. E. Hassanien, "Binary grey wolf optimization approaches for feature selection," Neurocomputing, vol. 172, pp. 371-381, 2016.
[62] W. Long, J. Jiao, X. Liang, and M. Tang, "Inspired grey wolf optimizer for solving large-scale function optimization problems," Applied Mathematical Modelling, vol. 60, pp. 112-126, 2018.
[63] S. Gupta and K. Deep, "Performance of grey wolf optimizer on large scale problems," in Proceedings of Mathematical Sciences and Its Applications, American Institute of Physics Conference Series, Uttar Pradesh, India, 2017.
[64] R. L. Haupt and S. E. Haupt, "Practical genetic algorithms," Journal of the American Statistical Association, vol. 100, no. 100, p. 253, 2005.
[65] H. R. Tizhoosh, "Opposition-based learning: a new scheme for machine intelligence," in Proceedings of the International Conference on Computational Intelligence for Modelling, Control and Automation (CIMCA) and International Conference on Intelligent Agents, Web Technologies and Internet Commerce (IAWTIC '05), pp. 695-701, Vienna, Austria, November 2005.
[66] H. Wang, Z. Wu, S. Rahnamayan, Y. Liu, and M. Ventresca, "Enhancing particle swarm optimization using generalized opposition-based learning," Information Sciences, vol. 181, no. 20, pp. 4699-4714, 2011.
[67] H. Wang, S. Rahnamayan, and Z. Wu, "Parallel differential evolution with self-adapting control parameters and generalized opposition-based learning for solving high-dimensional optimization problems," Journal of Parallel and Distributed Computing, vol. 73, no. 1, pp. 62-73, 2013.
[68] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82-102, 1999.
[69] A. P. Engelbrecht, Fundamentals of Computational Swarm Intelligence, John Wiley & Sons, 2006.
[70] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51-67, 2016.
[71] S. Mirjalili, A. H. Gandomi, S. Z. Mirjalili, S. Saremi, H. Faris, and S. M. Mirjalili, "Salp swarm algorithm: a bio-inspired optimizer for engineering design problems," Advances in Engineering Software, vol. 114, pp. 163-191, 2017.
[72] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80-98, 2015.
[73] Q. Gu, S. Jiang, M. Lian, and C. Lu, "Health and safety situation awareness model and emergency management based on multi-sensor signal fusion," IEEE Access, vol. 7, pp. 958-968, 2019.


[Figure 1 appears here in the original: a schematic showing a search agent moving toward the estimated position of the prey (or other wolves) under the guidance of the coefficient pairs C1/a1, C2/a2, and C3/a3.]

Figure 1: Position updating in GWO.

Firstly, the population initialization strategy is described in detail in Section 4.1. Secondly, Section 4.2 introduces the nonlinear adjustment strategy for the parameter a. Then, the three operations of selection, crossover, and mutation are presented in Sections 4.3-4.5, respectively. Finally, the time complexity of HGGWA is analyzed in Section 4.6.

4.1. Initial Population Strategy Based on Opposition-Based Learning. For swarm intelligence optimization algorithms based on population iteration, the diversity of the initial population lays the foundation for the efficiency of the algorithm: better population diversity reduces the computation time and improves the global convergence of the algorithm [64]. However, like other algorithms, the GWO algorithm uses random initialization when generating the population, which affects the search efficiency of the algorithm. Opposition-Based Learning is a strategy proposed by Tizhoosh [65] to improve search efficiency. It uses the opposite points of known individual positions to generate new individual positions, thereby increasing the diversity of the search population. The Opposition-Based Learning strategy has already been successfully applied to several swarm optimization algorithms [66, 67]. The opposite point is defined as follows.

Assume that there is an individual $X = (x_1, x_2, \ldots, x_D)$ in the population P. Then the opposite point of the element $x_i$ $(i = 1, 2, \ldots, D)$ in each dimension is $x_i' = l_i + u_i - x_i$, where $l_i$ and $u_i$ are the lower and upper bounds of the $i$th dimension, respectively. According to this definition, the Opposition-Based Learning strategy initializes the population as follows (see the sketch after this list):

(a) Randomly initialize the population P, calculate the opposite point $x_i'$ of each individual position $x_i$, and let all the opposite points constitute the opposition population P'.

(b) Calculate the fitness of each individual in the initial population P and the opposition population P', and arrange them in descending order.

(c) Select the top N grey wolf individuals with the highest fitness as the final initial population.
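As a minimal sketch of steps (a)-(c), the following Python fragment initializes a population with Opposition-Based Learning; it assumes a minimization problem, so "highest fitness" is read as "smallest objective value", and the function names are illustrative rather than taken from the paper.

```python
import numpy as np

def obl_initialize(objective, n_agents, dim, lower, upper, rng):
    """Opposition-Based Learning initialization, steps (a)-(c) above."""
    P = rng.uniform(lower, upper, size=(n_agents, dim))  # (a) random population P
    P_opp = lower + upper - P                            # opposite points: x' = l + u - x
    union = np.vstack([P, P_opp])                        # P together with P'
    fitness = np.apply_along_axis(objective, 1, union)   # (b) evaluate both populations
    order = np.argsort(fitness)                          # ascending objective = best first
    return union[order[:n_agents]]                       # (c) keep the best N individuals

rng = np.random.default_rng(42)
pop = obl_initialize(lambda x: float(np.sum(x ** 2)), 50, 100, -100.0, 100.0, rng)
```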

4.2. Parameter Nonlinear Adjustment Strategy. How to balance exploration and exploitation is significantly important for swarm intelligence algorithms. In the early iterations of the algorithm, a powerful exploration capability helps the algorithm expand the search range so that it can find the optimal solution with greater probability. In the later iterations, an effective exploitation ability speeds up the optimization process and improves the convergence accuracy of the final solution. Therefore, only when a swarm optimization algorithm properly coordinates its global exploration and local exploitation capabilities can it achieve strong robustness and fast convergence.

It can be seen from (2) and (3) that the dynamic change of the parameter a plays a crucial role in the algorithm. In the basic GWO, the linear decrement strategy of parameter a does not reflect the actual convergence process of the algorithm, so the algorithm does not balance the exploration and exploitation capabilities well, especially when solving large-scale multimodal problems. In order to dynamically adjust the global exploration and local exploitation process, this paper proposes a nonlinear adjustment strategy for the parameter a, which helps HGGWA search for the optimal solution. The improved nonlinear adjustment equation is as follows:

$$a = a_{\max} + (a_{\min} - a_{\max}) \cdot \left( \frac{1}{1 + e^{t/Max_{iter}}} \right)^{k} \qquad (12)$$

where $a_{\max}$ and $a_{\min}$ are the initial and end values of the parameter $a$, $t$ is the current iteration index, $Max_{iter}$ is the maximum number of iterations, and $k$ is the nonlinear adjustment coefficient.

Thus, the variation range of the convergence factor A is controlled by the nonlinear adjustment of the parameter a. When |A| > 1, the grey wolf population expands the search range to find better prey, which corresponds to the global exploration of the algorithm; when |A| < 1,


the whole population shrinks the search range, forming an encirclement around the prey to complete the final attack, which corresponds to the local exploitation process of the algorithm. The whole process has a positive effect on balancing global search and local search, which is beneficial to improving the accuracy of the solution and further accelerating the convergence speed of the algorithm.
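The following sketch evaluates the schedule of (12) for the four values of k examined later in Section 5.4. The boundary values a_max = 2 and a_min = 0 are assumptions taken from the standard GWO schedule; the paper only identifies them as the initial and end values of a.

```python
import numpy as np

def convergence_parameter(t, max_iter, k, a_max=2.0, a_min=0.0):
    """Nonlinear adjustment of the parameter a according to (12).

    a_max and a_min (assumed here to be 2 and 0) are the initial and end
    values, t is the current iteration, and k is the nonlinear adjustment
    coefficient that shapes the curve.
    """
    return a_max + (a_min - a_max) * (1.0 / (1.0 + np.exp(t / max_iter))) ** k

# How k reshapes the schedule (cf. the sensitivity analysis in Section 5.4):
for k in (0.5, 1.0, 1.5, 2.0):
    a_curve = [convergence_parameter(t, 1000, k) for t in (0, 500, 1000)]
    print(k, np.round(a_curve, 3))
```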

4.3. Selection Operation Based on Optimal Retention. For intelligent optimization algorithms based on population evolution, the evolution of each generation directly affects the optimization effect of the algorithm. In order to pass the superior individuals of the parent population on to the next generation without destroying them, it is necessary to preserve the good individuals directly. The optimal retention selection strategy is an effective way to preserve good individuals in genetic algorithms, and this paper integrates it into the GWO to improve the efficiency of the algorithm. Assuming that the current population is $P(X_1, X_2, \ldots, X_D)$ and the fitness of the individual $X_i$ is $F_i$, the specific operation is as follows.

(a) Calculate the fitness $F_i$ (i = 1, 2, ..., D) of each individual and arrange them in descending order.

(b) Copy the individual with the highest fitness directly into the next generation of the population.

(c) Calculate the total fitness S of the remaining individuals and the probability $p_i$ that each individual is selected:

$$S = \sum_{i=1}^{D-1} F_i \quad (i = 1, 2, \ldots, D-1) \qquad (13)$$

$$p_i = \frac{F_i}{\sum_{i=1}^{D-1} F_i} \quad (i = 1, 2, \ldots, D-1) \qquad (14)$$

(d) Calculate the cumulative fitness value $s_i$ for each individual; the selection operation is then performed in the manner of the roulette wheel until the number of individuals in the child population matches that of the parent population:

$$s_i = \frac{\sum_{j=1}^{i} F_j}{S} \quad (i = 1, 2, \ldots, D-1) \qquad (15)$$
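A sketch of this elitist roulette-wheel selection, implementing (13)-(15), is given below. It assumes the fitness values F_i are nonnegative with larger values meaning better, as the roulette probabilities of (14) require; for a minimization problem a suitable fitness transformation of the raw objective is presumed.

```python
import numpy as np

def select_with_elitism(pop, fitness, rng):
    """Selection of Section 4.3: optimal retention plus roulette wheel."""
    n = len(pop)
    best = int(np.argmax(fitness))
    children = [pop[best]]                    # (b) elite copied directly
    rest = np.delete(np.arange(n), best)      # remaining individuals
    p = fitness[rest] / fitness[rest].sum()   # (13)-(14) selection probabilities
    s = np.cumsum(p)                          # (15) cumulative fitness values
    for _ in range(n - 1):                    # spin the wheel N-1 times
        children.append(pop[rest[np.searchsorted(s, rng.random())]])
    return np.array(children)
```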

4.4. Crossover Operation Based on the Population Partitioning Mechanism. Due to the decline of population diversity in late evolution, the GWO algorithm easily falls into local optima when solving large-scale, high-dimensional optimization problems. In order to overcome the problems caused by large scale and complexity, and to ensure that the algorithm searches the entire solution space,

the HGGWA algorithm divides the whole population P into $\nu \times \eta$ subpopulations $P_{ij}$ $(i = 1, \ldots, \nu;\ j = 1, \ldots, \eta)$. The optimal size of each subpopulation $P_{ij}$ was found by testing to be $5 \times 5$. The individuals of each subpopulation are then cross-operated to increase the diversity of the population. The specific division method is shown below.

The initial population P:

$$P = \begin{bmatrix} x_{11} & x_{12} & \cdots & x_{1d} \\ x_{21} & x_{22} & \cdots & x_{2d} \\ \vdots & & \ddots & \vdots \\ x_{P1} & x_{P2} & \cdots & x_{Pd} \end{bmatrix}$$

The subpopulations $P_{ij}$:

$$P_{11}, \ldots, P_{1\eta}: \quad \begin{bmatrix} x_{11} & \cdots & x_{15} \\ \vdots & \ddots & \vdots \\ x_{51} & \cdots & x_{55} \end{bmatrix} \cdots \begin{bmatrix} x_{1(d-4)} & \cdots & x_{1d} \\ \vdots & \ddots & \vdots \\ x_{5(d-4)} & \cdots & x_{5d} \end{bmatrix}$$

$$P_{\nu 1}, \ldots, P_{\nu\eta}: \quad \begin{bmatrix} x_{(P-4)1} & \cdots & x_{(P-4)5} \\ \vdots & \ddots & \vdots \\ x_{P1} & \cdots & x_{P5} \end{bmatrix} \cdots \begin{bmatrix} x_{(P-4)(d-4)} & \cdots & x_{(P-4)d} \\ \vdots & \ddots & \vdots \\ x_{P(d-4)} & \cdots & x_{Pd} \end{bmatrix} \qquad (16)$$

In genetic algorithms, the crossover operation plays a very important role and is the main way to generate new individuals. In the improved algorithm of this paper, each individual in a subpopulation is cross-operated in a linear crossover manner: a corresponding random number $r_i \in (0, 1)$ is generated for each individual $x_i$ in the subpopulation, and when the random number $r_i$ is less than the crossover probability $P_c$, the corresponding individual $x_i$ is paired for crossover.

An example, Crossover(p1, p2), is as follows: (a) generate a random number $\lambda \in (0, 1)$; (b) two children $c_1 = (c_{11}, \ldots, c_{1D})$ and $c_2 = (c_{21}, \ldots, c_{2D})$ are generated from the two parents $p_1 = (p_{11}, \ldots, p_{1D})$ and $p_2 = (p_{21}, \ldots, p_{2D})$:

$$c_{1i} = \lambda p_{1i} + (1 - \lambda) p_{2i}, \quad i = 1, 2, \ldots, D \qquad (17)$$

$$c_{2i} = \lambda p_{2i} + (1 - \lambda) p_{1i}, \quad i = 1, 2, \ldots, D \qquad (18)$$
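The sketch below combines the 5 × 5 population partitioning of (16) with the arithmetic crossover of (17) and (18). How an odd leftover individual within a block is paired is not specified in the paper, so it is simply left unchanged here.

```python
import numpy as np

def crossover_in_subpopulations(pop, pc, rng, block=5):
    """Crossover of Section 4.4 on the 5x5 subpopulation blocks of (16)."""
    n, d = pop.shape
    for r in range(0, n, block):                  # nu = N/5 row blocks
        for c in range(0, d, block):              # eta = dim/5 column blocks
            rows = np.arange(r, min(r + block, n))
            chosen = rows[rng.random(len(rows)) < pc]      # selected with prob. Pc
            for i, j in zip(chosen[::2], chosen[1::2]):    # pair the chosen ones
                lam = rng.random()                         # lambda in (0, 1)
                p1 = pop[i, c:c + block].copy()
                p2 = pop[j, c:c + block].copy()
                pop[i, c:c + block] = lam * p1 + (1 - lam) * p2   # (17)
                pop[j, c:c + block] = lam * p2 + (1 - lam) * p1   # (18)
    return pop
```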

4.5. Mutation Operation for Elite Individuals. Due to the existence of the selection operation based on the optimal preservation strategy, the grey wolf individuals of the whole population concentrate in a small optimal region in the later stage of the iterative process, which easily leads to the loss of population diversity. If the current optimal individual is a locally optimal solution, the algorithm easily falls into a local optimum, especially when solving high-dimensional multimodal functions. To this end, this paper introduces a mutation operator into the HGGWA algorithm to perform mutation operations on the elite individuals of the population.

Assume that the optimal individual is $x_i = (x_1, x_2, \ldots, x_d)$ and that the mutation operation is performed on $x_i$ with the mutation probability $P_m$. That is, a gene $x_k$ is selected from the optimal individual with probability $P_m$ and replaced by a random number between the lower and upper bounds, generating a new individual $x_i' = (x_1', x_2', \ldots, x_d')$. The specific operation is as follows:


$$x_i' = \begin{cases} l + \lambda (u - l), & i = k \\ x_i, & i \neq k \end{cases} \qquad (19)$$

where $\lambda$ is a random number in [0, 1], and $l$ and $u$ are the lower and upper bounds of the individual $x_i$, respectively.
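A short sketch of this elite mutation follows. It reads (19) as an independent per-gene replacement with probability P_m, which is one natural interpretation of "select a gene x_k with probability P_m"; the uniform resampling l + λ(u − l) is exactly the first case of (19).

```python
import numpy as np

def mutate_elite(x, pm, lower, upper, rng):
    """Mutation of Section 4.5: replace genes of an elite individual by (19)."""
    x = x.copy()
    for k in range(len(x)):
        if rng.random() < pm:                             # gene selected with prob. Pm
            x[k] = lower + rng.random() * (upper - lower)  # l + lambda*(u - l)
    return x
```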

4.6. Pseudocode of the HGGWA and Time Complexity Analysis. This section presents the overall procedure and describes how to calculate an upper bound on the total number of fitness evaluations (FE) required by HGGWA.

(1) Initialize parameters a, A, C, population size N, Pc, and Pm
(2) Initialize population P using the OBL strategy
(3) t = 0
(4) while t < Max_iter
(5)     Calculate the fitness of all search agents
(6)     X_α = the best search agent
(7)     X_β = the second best search agent
(8)     X_δ = the third best search agent
(9)     for i = 1 : N
(10)        Update the position of all search agents by equation (11)
(11)    end for
(12)    newP1 ← P except the X_α
(13)    for i = 1 : N-1
(14)        Generate newP2 by the roulette-wheel selection on newP1
(15)    end for
(16)    P ← newP2 and X_α
(17)    Generate subPs by the population partitioning mechanism on P
(18)    Select the individuals by crossover probability Pc
(19)    for i = 1 : N×Pc
(20)        Crossover each search agent by equations (17) and (18)
(21)    end for
(22)    Generate X'_α, X'_β, and X'_δ by equation (19) with mutation probability Pm
(23)    t = t + 1
(24) end while
(25) Return X_α

Algorithm 1: Pseudocode of the HGGWA.

As shown in Algorithm 1, the computational complexity of HGGWA in one generation is mainly dominated by the position updating of all search agents in line (10). The position updating requires a time complexity of O(N × dim × Max_iter) (N is the population size, dim is the dimension of each benchmark function, and Max_iter is the maximum number of iterations of HGGWA) to obtain the needed parameters and to finish the whole position-updating progress of all search agents. In addition, the crossover operation in line (20) is another time-consuming step. It needs a time complexity of O(2 × ν × η × Max_iter) (ν × η is the total number of subpopulations; ν = N/5, η = dim/5) to complete the crossing of individuals in each subpopulation according to the crossover probability Pc. It should be noted that this is only the worst-case computational complexity; if only two individuals in each subpopulation need to be crossed, the minimum amount of computation is O(ν × η × Max_iter). Therefore, the maximum computational complexity caused by the two dominant processes of the algorithm in one generation is max{O(N × dim × Max_iter), O(2 × ν × η × Max_iter)} (ν = N/5, η = dim/5), i.e., O(N × dim × Max_iter).
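For completeness, the position update invoked in line (10) of Algorithm 1 is sketched below, assuming the standard GWO update rule of Mirjalili et al. [12] (the equation (11) referenced there): each agent moves to the mean of three positions guided by the α, β, and δ wolves.

```python
import numpy as np

def gwo_position_update(pop, x_alpha, x_beta, x_delta, a, rng):
    """Standard GWO position update assumed for line (10) of Algorithm 1."""
    new_pop = np.empty_like(pop)
    for i, x in enumerate(pop):
        guided = []
        for leader in (x_alpha, x_beta, x_delta):
            A = 2.0 * a * rng.random(x.shape) - a   # A = 2a*r1 - a
            C = 2.0 * rng.random(x.shape)           # C = 2*r2
            D = np.abs(C * leader - x)              # distance to the leader
            guided.append(leader - A * D)
        new_pop[i] = np.mean(guided, axis=0)        # X(t+1) = (X1 + X2 + X3)/3
    return new_pop
```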

5. Numerical Experiments and Analysis

In this section, the proposed HGGWA algorithm is evaluated on both classical benchmark functions [68] and the test suite provided by the CEC'2008 Special Session on large-scale global optimization. The algorithms used for comparison include not only conventional EAs but also other CC optimization algorithms. Experimental results are provided to analyze the performance of HGGWA in the context of large-scale optimization problems. In addition, the sensitivity analysis of the nonlinear adjustment coefficient and the global convergence of HGGWA are discussed in Sections 5.4 and 5.5, respectively.

5.1. Benchmark Functions and Performance Measures. In order to verify the ability of the HGGWA algorithm to solve high-dimensional complex functions, 10 general high-dimensional benchmark functions (100D, 500D, 1000D) were selected for the optimization tests. Among them, f1-f6 are unimodal functions, used to test the local search ability of the algorithm, and f7-f10 are multimodal functions, used to test the global search ability. These test functions have multiple, unevenly distributed local optima, nonconvexity, and strong oscillation, which make it very difficult to converge to the global optimal solution, especially in the high-dimensional case. The specific characteristics of the 10 benchmark functions are shown in Table 2.


Table 2: 10 high-dimensional benchmark functions.

| Function | Formula | Range | $f_{min}$ | Accuracy |
|---|---|---|---|---|
| Sphere | $f_1(x) = \sum_{i=1}^{d} x_i^2$ | [-100, 100] | 0 | 1E-8 |
| Schwefel's 2.22 | $f_2(x) = \sum_{i=1}^{d} \lvert x_i \rvert + \prod_{i=1}^{d} \lvert x_i \rvert$ | [-10, 10] | 0 | 1E-8 |
| Schwefel's 2.21 | $f_3(x) = \max_i \{\lvert x_i \rvert, 1 \le i \le d\}$ | [-100, 100] | 0 | 1E-2 |
| Rosenbrock | $f_4(x) = \sum_{i=1}^{d-1} [100(x_{i+1} - x_i^2)^2 + (x_i - 1)^2]$ | [-30, 30] | 0 | 1E0 |
| Schwefel's 1.2 | $f_5(x) = \sum_{i=1}^{d} (\sum_{j=1}^{i} x_j)^2$ | [-100, 100] | 0 | 1E-3 |
| Quartic | $f_6(x) = \sum_{i=1}^{d} i x_i^4 + \mathrm{random}[0, 1)$ | [-1.28, 1.28] | 0 | 1E-4 |
| Rastrigin | $f_7(x) = \sum_{i=1}^{d} [x_i^2 - 10\cos(2\pi x_i) + 10]$ | [-5.12, 5.12] | 0 | 1E-8 |
| Ackley | $f_8 = -20\exp(-0.2\sqrt{\frac{1}{d}\sum_{i=1}^{d} x_i^2}) - \exp(\frac{1}{d}\sum_{i=1}^{d} \cos(2\pi x_i)) + 20 + e$ | [-32, 32] | 0 | 1E-5 |
| Griewank | $f_9 = \frac{1}{4000}\sum_{i=1}^{d} x_i^2 - \prod_{i=1}^{d} \cos(x_i/\sqrt{i}) + 1$ | [-600, 600] | 0 | 1E-5 |
| Penalized 1 | $f_{10} = \frac{\pi}{d}\{10\sin^2(\pi y_1) + \sum_{i=1}^{d-1}(y_i - 1)^2[1 + \sin^2(\pi y_{i+1})] + (y_n - 1)^2\} + \sum_{i=1}^{d} u(x_i, 10, 100, 4)$, with $y_i = 1 + \frac{x_i + 1}{4}$ and $u(x_i, a, k, m) = k(x_i - a)^m$ if $x_i > a$; $0$ if $-a \le x_i \le a$; $k(-x_i - a)^m$ if $x_i < -a$ | [-50, 50] | 0 | 1E-2 |

In order to show the optimization effect of the algorithmmore clearly this paper selects two indicators to evaluate theperformance of the algorithm [69] The first is the solutionaccuracy (AC) which reflects the difference between theoptimal results of the algorithm and the theoretical optimalvalue In an experiment if the final convergence result ofthe algorithm is less than the AC then the optimization isconsidered successful Assuming that the optimal value of acertain optimization is119883119900119901119905 and the theoretical optimal valueis119883119898119894119899 then the AC is calculated as follows

$$AC = \lvert X_{opt} - X_{min} \rvert \qquad (20)$$

The second is the success ratio (SR), which reflects the proportion of successful optimizations to the total number of runs, given the required accuracy. Assuming that, out of T runs of the algorithm, the number of successful optimizations is t, the SR is calculated as follows:

$$SR = \frac{t}{T} \times 100\% \qquad (21)$$
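The two measures can be computed as in the minimal sketch below; the tolerance is the per-function accuracy threshold from Table 2.

```python
def accuracy(x_opt, x_min):
    """Solution accuracy (20): AC = |X_opt - X_min|."""
    return abs(x_opt - x_min)

def success_ratio(final_values, x_min, tol):
    """Success ratio (21): SR = t/T * 100%, where t counts the runs whose
    final accuracy is below the per-function threshold of Table 2."""
    t = sum(1 for v in final_values if accuracy(v, x_min) < tol)
    return 100.0 * t / len(final_values)

# e.g. Sphere (f1): theoretical optimum 0, accuracy threshold 1e-8
print(success_ratio([3e-9, 2e-7, 0.0], x_min=0.0, tol=1e-8))  # ~66.7
```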

5.2. Results on Classical Benchmark Functions. In this section, a set of simulation tests was performed to verify the effectiveness of HGGWA. The 10 high-dimensional benchmark functions in Table 2 are optimized by the HGGWA algorithm proposed in this paper, and the results are compared with WOA, SSA, and ALO. In addition, the running times of the several algorithms are compared under a given convergence accuracy.

5.2.1. Parameter Settings for the Compared Algorithms. In all experiments, the values of the common parameters, such as the maximum number of iterations (Max_iter), the dimension of the functions (D), and the population size (N), were chosen the same. For all test problems, we focus on investigating the optimization performance of the proposed method on problems with D = 100, 500, and 1000. The maximum number of iterations is set to 1000, and the population size is set to 50. The parameter settings of WOA, SSA, and ALO are taken from the original literature [70-72]. The optimal parameter settings of the HGGWA algorithm proposed in this paper are shown in Table 3.

For each experiment of an algorithm on a benchmark function, 30 independent runs are performed to obtain a fair comparison among the different algorithms.


Table 3: Parameter settings of the HGGWA.

| Parameter | Definition | Value |
|---|---|---|
| N | Population size | 50 |
| Pc | Crossover probability | 0.8 |
| Pm | Mutation probability | 0.01 |
| Max_iter | Maximum number of iterations | 1000 |
| k | Nonlinear adjustment coefficient | 0.5 |

The optimal value, the worst value, the average value, and the standard deviation of each algorithm's optimization results are recorded, and the optimization success ratio (SR) is then calculated. All programs were coded in Matlab 2017b (Win64) and executed on a Lenovo computer with an Intel(R) Core i5-6300HQ, 8 GB RAM, 2.30 GHz, under the Windows 10 operating system.

5.2.2. Comparison with State-of-the-Art Metaheuristic Algorithms. In order to verify the efficiency of the HGGWA algorithm, several recently proposed state-of-the-art metaheuristic algorithms are compared with it: the Whale Optimization Algorithm (WOA), the Salp Swarm Algorithm (SSA), and the Ant Lion Optimization algorithm (ALO). The tests were performed using the same 10 high-dimensional functions in Table 2, and the results of the comparison are shown in Table 4.

From Table 4, it can be seen that the convergence accuracy and optimization success ratio of the HGGWA algorithm are significantly higher than those of the other three algorithms on most of the test functions. In the 100-dimensional case, apart from function f3, the HGGWA algorithm converges to the global optimal solution for the other nine test functions, achieving a 100% success ratio on them under the given convergence accuracy. The convergence results of the WOA algorithm are better than those of the other three algorithms for functions f1 and f2, and its robustness is excellent. As the function dimension increases, the convergence precision of all the algorithms decreases slightly, but the optimization results of the HGGWA algorithm remain better than those of WOA, SSA, and ALO. In general, the HGGWA algorithm exhibits better optimization effects than the WOA, SSA, and ALO algorithms on the 100-, 500-, and 1000-dimensional test functions, which proves the effectiveness of the HGGWA algorithm for solving high-dimensional complex functions.

Generally speaking, the HGGWA algorithm performs better than the other algorithms on most test functions. In order to compare the convergence performance of the four algorithms more clearly, the convergence curves of the four algorithms on the Sphere, Schwefel 2.22, Rastrigin, Griewank, Quartic, and Ackley functions with D = 100 are shown in Figure 2. Sphere and Schwefel 2.22 are the two most basic unimodal functions, while Rastrigin and Griewank are the two most basic multimodal functions. It can be seen that HGGWA has higher precision and faster convergence speed than the other algorithms.

5.2.3. Comparative Analysis of Running Time. To evaluate the actual runtime of the several compared algorithms, including SSA, WOA, GWO, and HGGWA, their average running times (in seconds) over 30 runs on functions f1, f2, f6, f7, f8, f9, and f10 are plotted in Figure 3. The convergence accuracy of each test function is given in Table 2.

Figure 3 shows the convergence behavior of differentalgorithms Each point on the plot was calculated by takingthe average of 30 independent runs As can be seen fromFigure 3 several algorithms show different effects under thesame calculation accuracy For the functions 1198911 1198912 1198917 11989181198919 the GWO algorithm has the fastest convergence rateas a result of the advantages of the GWO algorithm itselfHowever for the test functions 1198916 and 11989110 the runningtime of the HGGWA algorithm is the shortest Throughfurther analysis it is known that HGGWA algorithm adds alittle computing time compared to GWO algorithm becauseof several improved strategies such as selection crossoverand mutation However the convergence speed of HGGWAalgorithm is still better than SSA and WOA algorithmsOverall the convergence speed of the HGGWA algorithmperforms better in several algorithms compared Moreoverthe running time of different test functions is very small andit is kept between 1s and 3s which shows that the HGGWAalgorithm has excellent stability

53 Results on CECrsquo2008 Benchmark Functions This sectionprovides an analysis of the effectiveness of HGGWA in termsof the CECrsquo2008 large-scale benchmark functions and acomparison with four powerful algorithms which have beenproven to be effective in solving large-scale optimizationproblems Experimental results are provided to analyze theperformance of HGGWA for large-scale optimization prob-lems

We tested our proposed algorithms with CECrsquo2008 func-tions for 100 500 and 1000 dimensions and the means ofthe best fitness value over 50 runs were recorded WhileF1 (Shifted Sphere) F4 (Shifted Rastrigin) and F6 (ShiftedAckley) are separable functions F2 (Schwefel Problem) F3(Shifted Rosenbrock) F5 (Shifted Griewank) and F7 (FastFractal) are nonseparable presenting a greater challenge toany algorithm that is prone to variable interactions Theperformance of HGGWA is compared with other algorithmsCCPSO2 [31] DEwSAcc [5] MLCC [38] and EPUS-PSO[49] for 100 500 and 1000 dimensions respectively Themaximum number of fitness evaluations (FEs) was calculatedby the following formula FEs = 5000timesD where D is thenumber of dimensions In order to reflect the fairness of thecomparison results the optimization results of the other fouralgorithms are directly derived from the original literatureThe specific comparison results are shown in Table 5

Table 5 shows the experimental results; the entries shown in bold are significantly better than those of the other algorithms. A general trend is that the HGGWA algorithm produces promising results on most of the 7 functions.

Generally speaking, HGGWA outperforms CCPSO2 on 6 out of 7 functions with 100 and 500 dimensions, respectively. Among them, the result of HGGWA is slightly worse than that of CCPSO2 on function F7 in the 100-dimensional case, and the two algorithms obtain similar optimization results on function F6 in the 500-dimensional case.

Table 4: Comparison results of 10 high-dimensional benchmark functions by WOA, SSA, ALO, and HGGWA (each cell gives Mean / Std / SR%).

Function | Dim | WOA | SSA | ALO | HGGWA
f1 | 100 | 5.77E-96 / 4.69E-96 / 100 | 1.01E-02 / 1.24E-02 / 0 | 3.04E-01 / 2.31E-01 / 0 | 4.85E-53 / 2.41E-52 / 100
f1 | 500 | 7.24E-93 / 2.98E-94 / 100 | 3.24E+04 / 2.18E+04 / 0 | 7.34E+04 / 1.32E+04 / 0 | 8.35E-25 / 8.86E-25 / 100
f1 | 1000 | 2.05E-89 / 3.16E-90 / 100 | 1.19E+05 / 1.04E+05 / 0 | 2.48E+05 / 2.12E+05 / 0 | 1.75E-17 / 2.73E-17 / 100
f2 | 100 | 1.89E-99 / 2.33E-99 / 100 | 8.85E+00 / 2.37E+00 / 0 | 4.29E+02 / 3.25E+02 / 0 | 4.01E-33 / 4.34E-33 / 100
f2 | 500 | 1.38E-99 / 2.16E-99 / 100 | 3.37E+02 / 2.17E+02 / 0 | 2.31E+03 / 2.03E+03 / 0 | 9.58E-18 / 2.36E-18 / 100
f2 | 1000 | 1.80E-99 / 1.37E-98 / 100 | 8.45E+02 / 3.43E+02 / 0 | 3.25E+04 / 4.13E+04 / 0 | 2.50E-11 / 1.66E-11 / 100
f3 | 100 | 3.84E+01 / 2.17E+00 / 0 | 2.22E+01 / 1.05E+01 / 0 | 2.47E+01 / 1.34E+01 / 0 | 4.55E+01 / 3.69E+01 / 0
f3 | 500 | 9.32E+01 / 5.32E+01 / 0 | 3.27E+01 / 2.33E+01 / 0 | 3.92E+01 / 2.34E+01 / 0 | 7.22E+01 / 5.01E+00 / 0
f3 | 1000 | 9.86E+01 / 3.24E+01 / 0 | 3.31E+01 / 1.28E+01 / 0 | 4.93E+01 / 3.99E+01 / 0 | 8.31E+01 / 4.36E+00 / 0
f4 | 100 | 9.67E+01 / 3.98E+01 / 0 | 9.00E+02 / 5.16E+02 / 0 | 1.40E+03 / 1.03E+03 / 0 | 6.82E+01 / 1.25E-01 / 100
f4 | 500 | 4.95E+02 / 3.14E+01 / 0 | 7.18E+06 / 1.46E+06 / 0 | 2.09E+07 / 1.37E+07 / 0 | 3.67E+02 / 8.23E-01 / 0
f4 | 1000 | 9.91E+02 / 2.48E+02 / 0 | 2.98E+07 / 1.35E+07 / 0 | 1.52E+08 / 1.00E+07 / 0 | 9.64E+02 / 3.74E+01 / 0
f5 | 100 | 7.83E+05 / 2.36E+05 / 0 | 2.17E+04 / 1.03E+04 / 0 | 3.60E+04 / 2.16E+04 / 0 | 3.45E-05 / 2.77E-05 / 100
f5 | 500 | 1.98E+07 / 1.32E+07 / 0 | 4.29E+05 / 3.16E+05 / 0 | 1.18E+06 / 1.03E+06 / 0 | 2.67E-03 / 4.38E-03 / 80
f5 | 1000 | 6.68E+07 / 3.44E+07 / 0 | 2.48E+06 / 1.23E+06 / 0 | 3.72E+06 / 3.49E+06 / 0 | 8.33E+00 / 5.12E+00 / 0
f6 | 100 | 2.65E-04 / 1.43E-04 / 100 | 1.12E+00 / 1.03E+00 / 0 | 8.28E-01 / 2.68E-01 / 0 | 2.34E-09 / 3.17E-10 / 100
f6 | 500 | 3.94E-04 / 3.22E-04 / 60 | 4.97E+01 / 3.66E+01 / 0 | 1.13E+02 / 1.01E+02 / 0 | 5.73E-05 / 4.33E-06 / 100
f6 | 1000 | 2.16E-03 / 1.36E-03 / 0 | 3.93E+02 / 2.78E+02 / 0 | 1.77E+03 / 2.33E+03 / 0 | 1.08E-04 / 1.62E-03 / 80
f7 | 100 | 0.00E+00 / 0.00E+00 / 100 | 6.27E+01 / 4.36E+01 / 0 | 3.54E+02 / 3.32E+02 / 0 | 0.00E+00 / 0.00E+00 / 100
f7 | 500 | 0.00E+00 / 0.00E+00 / 100 | 2.04E+03 / 1.97E+03 / 0 | 3.34E+03 / 4.56E+03 / 0 | 0.00E+00 / 0.00E+00 / 100
f7 | 1000 | 0.00E+00 / 0.00E+00 / 100 | 5.69E+03 / 1.66E+03 / 0 | 7.19E+03 / 2.14E+03 / 0 | 5.57E-11 / 3.15E-12 / 100
f8 | 100 | 4.44E-15 / 3.12E-15 / 100 | 5.15E+00 / 4.13E+00 / 0 | 4.93E+00 / 3.49E+00 / 0 | 2.15E-15 / 3.48E-15 / 100
f8 | 500 | 4.44E-15 / 3.05E-15 / 100 | 1.26E+01 / 1.00E+01 / 0 | 1.28E+01 / 2.28E+01 / 0 | 8.74E-12 / 5.39E-12 / 100
f8 | 1000 | 8.88E-16 / 2.88E-15 / 100 | 1.28E+01 / 1.44E+01 / 0 | 1.46E+01 / 2.03E+01 / 0 | 1.59E-07 / 1.63E-07 / 100
f9 | 100 | 0.00E+00 / 0.00E+00 / 100 | 1.52E-01 / 2.44E-01 / 0 | 4.54E-01 / 3.16E-01 / 0 | 0.00E+00 / 0.00E+00 / 100
f9 | 500 | 0.00E+00 / 0.00E+00 / 100 | 2.82E+02 / 2.13E+02 / 0 | 3.92E+02 / 1.08E+02 / 0 | 1.84E-16 / 8.44E-17 / 100
f9 | 1000 | 0.00E+00 / 0.00E+00 / 100 | 1.10E+03 / 1.34E+03 / 0 | 3.61E+03 / 1.17E+03 / 0 | 2.31E-13 / 3.29E-14 / 100
f10 | 100 | 8.16E-03 / 4.32E-03 / 100 | 1.10E+01 / 2.19E+01 / 0 | 1.74E+01 / 1.33E+01 / 0 | 2.73E-07 / 3.11E-07 / 100
f10 | 500 | 2.52E-02 / 1.99E-02 / 80 | 5.16E+01 / 3.33E+01 / 0 | 3.97E+05 / 2.96E+05 / 0 | 8.12E-05 / 4.98E-06 / 100
f10 | 1000 | 1.43E-02 / 1.03E-02 / 80 | 1.00E+05 / 2.13E+04 / 0 | 6.96E+06 / 1.34E+06 / 0 | 6.33E-04 / 3.77E-03 / 100

[Figure 2: Convergence curves of the WOA, SSA, ALO, and HGGWA algorithms (D = 100); each panel plots the objective function value (log scale) against the number of iterations (0-1000). Panels: (a) Sphere function, (b) Schwefel's 2.22 function, (c) Rastrigin function, (d) Griewank function, (e) Quartic function, (f) Ackley function.]

[Figure 3: The average running times ((a), in seconds) and numbers of iterations (b) of SSA, WOA, GWO, and HGGWA on benchmark functions f1, f2, f6, f7, f8, f9, and f10 (D = 100).]

Table 5: Comparison results of 7 CEC'2008 benchmark functions by five algorithms (each cell gives Mean / Std).

Function | Dim | CCPSO2 | DEwSAcc | MLCC | EPUS-PSO | HGGWA
F1 | 100 | 7.73E-14 / 3.23E-14 | 5.68E-14 / 0.00E+00 | 6.82E-14 / 2.32E-14 | 7.47E-01 / 1.70E-01 | 3.06E-17 / 2.93E-17
F1 | 500 | 3.00E-13 / 7.96E-14 | 2.09E-09 / 4.61E-09 | 4.29E-13 / 3.31E-14 | 8.45E+01 / 6.40E+00 | 2.75E-15 / 2.13E-15
F1 | 1000 | 5.18E-13 / 9.61E-14 | 8.78E-03 / 5.27E-03 | 8.45E-13 / 5.00E-14 | 5.53E+02 / 2.86E+01 | 1.06E-10 / 1.83E-11
F2 | 100 | 6.08E+00 / 7.83E+00 | 8.25E+00 / 5.32E+00 | 2.52E+01 / 8.72E+00 | 1.86E+01 / 2.26E+00 | 2.71E+00 / 2.39E+00
F2 | 500 | 5.79E+01 / 4.21E+01 | 7.57E+01 / 3.04E+00 | 6.66E+01 / 5.69E+00 | 4.35E+01 / 5.51E-01 | 3.12E+01 / 4.91E+00
F2 | 1000 | 7.82E+01 / 4.25E+01 | 9.60E+01 / 1.81E+00 | 1.08E+02 / 4.75E+00 | 4.66E+01 / 4.00E-01 | 2.71E+01 / 5.66E+00
F3 | 100 | 4.23E+02 / 8.65E+02 | 1.45E+02 / 5.84E+01 | 1.49E+02 / 5.72E+01 | 4.99E+03 / 5.35E+03 | 9.81E+01 / 2.25E-01
F3 | 500 | 7.24E+02 / 1.54E+02 | 1.81E+03 / 2.74E+02 | 9.24E+02 / 1.72E+02 | 5.77E+04 / 8.04E+03 | 4.96E+02 / 6.23E-01
F3 | 1000 | 1.33E+03 / 2.63E+02 | 9.14E+03 / 1.26E+03 | 1.79E+03 / 1.58E+02 | 8.37E+05 / 1.52E+05 | 9.97E+02 / 3.58E+01
F4 | 100 | 3.98E-02 / 1.99E-01 | 4.37E+00 / 7.65E+00 | 4.38E-13 / 9.21E-14 | 4.71E+02 / 5.94E+01 | 0.00E+00 / 0.00E+00
F4 | 500 | 3.98E-02 / 1.99E-01 | 3.64E+02 / 5.23E+01 | 1.79E-11 / 6.31E-11 | 3.49E+03 / 1.12E+02 | 0.00E+00 / 0.00E+00
F4 | 1000 | 1.99E-01 / 4.06E-01 | 1.82E+03 / 1.37E+02 | 1.37E-10 / 3.37E-10 | 7.58E+03 / 1.51E+02 | 4.93E-11 / 2.87E-12
F5 | 100 | 3.45E-03 / 4.88E-03 | 3.07E-14 / 7.86E-15 | 3.41E-14 / 1.16E-14 | 3.72E-01 / 5.60E-02 | 0.00E+00 / 0.00E+00
F5 | 500 | 1.18E-03 / 4.61E-03 | 6.90E-04 / 2.41E-03 | 2.12E-13 / 2.47E-14 | 1.64E+00 / 4.69E-02 | 2.16E-16 / 7.56E-17
F5 | 1000 | 1.18E-03 / 3.27E-03 | 3.58E-03 / 5.73E-03 | 4.18E-13 / 2.78E-14 | 5.89E+00 / 3.91E-01 | 5.49E-13 / 2.35E-14
F6 | 100 | 1.44E-13 / 3.06E-14 | 1.13E-13 / 1.53E-14 | 1.11E-13 / 7.86E-15 | 2.06E+00 / 4.40E-01 | 3.21E-15 / 2.48E-15
F6 | 500 | 5.34E-13 / 8.61E-14 | 4.80E-01 / 5.73E-01 | 5.34E-13 / 7.01E-14 | 6.64E+00 / 4.49E-01 | 6.74E-12 / 4.39E-12
F6 | 1000 | 1.02E-12 / 1.68E-13 | 2.29E+00 / 2.98E-01 | 1.06E-12 / 7.68E-14 | 1.89E+01 / 2.49E+00 | 2.39E-07 / 2.13E-07
F7 | 100 | -1.50E+03 / 1.04E+01 | -1.37E+03 / 2.46E+01 | -1.54E+03 / 2.52E+00 | -8.55E+02 / 1.35E+01 | -7.00E+02 / 6.64E-01
F7 | 500 | -7.23E+03 / 4.16E+01 | -5.74E+03 / 1.83E+02 | -7.43E+03 / 8.03E+00 | -3.51E+03 / 2.10E+01 | -8.83E+03 / 3.54E+00
F7 | 1000 | -1.43E+04 / 8.27E+01 | -1.05E+04 / 4.18E+02 | -1.47E+04 / 1.51E+01 | -6.62E+03 / 3.18E+01 | -3.69E+04 / 2.16E+02


In addition, at first glance it might seem that HGGWA and MLCC have similar performance; however, HGGWA achieves better convergence accuracy than MLCC under the same number of iterations. In particular, the optimization results of HGGWA are superior to those of DEwSAcc and EPUS-PSO in all dimensions. The results of HGGWA scaled very well from 100-D to 1000-D on F1, F4, F5, and F6, outperforming DEwSAcc and EPUS-PSO in all cases.

5.4. Sensitivity Analysis of the Nonlinear Adjustment Coefficient k. This section discusses the effect of the nonlinear adjustment coefficient on the convergence performance of the algorithm. In the HGGWA algorithm, the role of the parameter a is to balance global exploration and local exploitation. In (12), the nonlinear adjustment coefficient k is a key parameter that controls the range of variation of the convergence factor. We therefore selected four values, k = 0.5, 1, 1.5, and 2, for numerical experiments to analyze the effect of the coefficient on the performance of the algorithm; the comparison results are shown in Table 6, where bold entries mark the best result among the settings. An illustrative sketch of how k reshapes the convergence factor is given below.
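Since (12) is not reproduced in this section, the following Python sketch only illustrates the role of k, assuming a power-law decay of the convergence factor of the form a(t) = 2(1 - (t/T)^k); the exact expression used by HGGWA may differ.

    def convergence_factor(t, max_iter, k):
        # Assumed nonlinear decay of the convergence factor a from 2 to 0.
        # k reshapes the decay: k < 1 accelerates the early decrease of a
        # (earlier exploitation), while k > 1 keeps a large for longer
        # (longer global exploration).
        return 2.0 * (1.0 - (t / max_iter) ** k)

    for k in (0.5, 1.0, 1.5, 2.0):
        print(k, [round(convergence_factor(t, 1000, k), 2)
                  for t in (0, 250, 500, 750, 1000)])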

In general, the value of the nonlinear adjustment coefficient k has no significant effect on the performance of the HGGWA algorithm. On closer inspection, one can see that, except for function f3, the optimization performance is best when k = 0.5, whereas for f3 the best choice is k = 1.5. For functions f7 and f9, HGGWA obtains the optimal solution 0 for all tested values of the coefficient k. Therefore, for most test functions the optimal value of the nonlinear adjustment coefficient k is 0.5.

5.5. Global Convergence Analysis of HGGWA. In the HGGWA algorithm, the following four improved strategies are used to ensure the global convergence of the algorithm: (1) an initial population strategy based on Opposition-Based Learning; (2) a selection operation based on optimal retention; (3) a crossover operation based on the population partitioning mechanism; and (4) a mutation operation for elite individuals.

The idea of solving high-dimensional function optimization problems with HGGWA is as follows. First, regarding initialization, the basic GWO generates the initial population randomly, which can greatly harm search efficiency on high-dimensional problems: the algorithm cannot effectively search the entire solution space if the initial grey wolf population is clustered in a small region. The Opposition-Based Learning strategy is therefore used to initialize the population, which spreads the initial population evenly over the solution space and lays a good foundation for the global search. Second, the optimal retention strategy preserves the best individual of the parent population during evolution, so that each new generation can evolve in the optimal direction; in addition, individuals with higher fitness are selected by roulette-wheel selection, which maintains the dominance relationships between individuals in the population and gives the algorithm good global convergence. Third, the crossover operation is performed on partitioned subpopulations, which achieves the purpose of dimensionality reduction and maintains the diversity of the population. Finally, the search direction of the grey wolf population is guided by the mutation of elite individuals, which effectively prevents the algorithm from falling into a local optimum. All in all, through these four strategies, HGGWA shows good global convergence when solving high-dimensional function optimization problems. A compact sketch of the four operators is given below.
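As an illustration, the following Python sketch implements the four operators in minimal form, following the descriptions above: the opposite point l + u - x for Opposition-Based Learning, fitness-proportional roulette selection with the elite copied directly, the linear crossover c1 = lam*p1 + (1 - lam)*p2, and single-gene elite mutation. The GWO position update and the 5 x 5 subpopulation partitioning are omitted, fitness is treated as nonnegative and "larger is better", and all names are illustrative rather than taken from the paper's code.

    import numpy as np

    rng = np.random.default_rng()

    def obl_init(n, dim, lo, hi):
        # Opposition-Based Learning: sample a random population and its
        # opposite x' = lo + hi - x; the caller keeps the n fittest of the 2n.
        x = rng.uniform(lo, hi, (n, dim))
        return np.vstack([x, lo + hi - x])

    def select(pop, fitness):
        # Optimal retention: copy the best wolf directly, then fill the rest
        # of the next generation by fitness-proportional roulette selection
        # (fitness values are assumed nonnegative with a positive sum).
        elite = np.argmax(fitness)
        probs = fitness / fitness.sum()
        idx = rng.choice(len(pop), size=len(pop) - 1, p=probs)
        return np.vstack([pop[elite], pop[idx]])

    def crossover(p1, p2):
        # Linear crossover: c1 = lam*p1 + (1-lam)*p2, c2 = lam*p2 + (1-lam)*p1.
        lam = rng.random()
        return lam * p1 + (1 - lam) * p2, lam * p2 + (1 - lam) * p1

    def elite_mutation(x, lo, hi, pm=0.01):
        # With probability pm, replace one randomly chosen gene of an elite
        # individual by a uniform random value in [lo, hi].
        y = x.copy()
        if rng.random() < pm:
            y[rng.integers(len(y))] = rng.uniform(lo, hi)
        return y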

6. Conclusion

To overcome the tendency of the GWO algorithm to fall into local optima when solving large-scale global optimization problems, this paper proposes a Hybrid Genetic Grey Wolf Algorithm (HGGWA) that integrates three genetic operators into the algorithm. The improved algorithm initializes the population with the Opposition-Based Learning strategy to improve search efficiency; the population-division strategy based on cooperative coevolution reduces the scale of the problem; and the mutation of elite individuals effectively prevents the algorithm from falling into local optima. The performance of HGGWA has been evaluated on 10 classical benchmark functions and 7 CEC'2008 high-dimensional functions. From our experimental results, several conclusions can be drawn.

The results have shown that HGGWA is capable of grouping interacting variables with great accuracy for the majority of the benchmark functions. A comparative study between HGGWA and other state-of-the-art algorithms showed that HGGWA exhibits better global convergence whether solving separable or nonseparable functions.

(1) On the 10 classical benchmark functions, compared with the results of WOA, SSA, and ALO, the HGGWA algorithm greatly improves convergence accuracy. Except for the Schwefel problem and the Rosenbrock function, HGGWA achieves an 80%-100% success ratio on all benchmark functions with 100 dimensions, which proves its effectiveness for solving high-dimensional functions.

(2) On the seven CEC'2008 large-scale global optimization problems, the results show that HGGWA achieves the global optimal value on the separable functions F1, F4, and F6, while its results on the other, nonseparable problems are less satisfactory. Nevertheless, HGGWA still exhibits better global convergence than the other algorithms, CCPSO2, DEwSAcc, MLCC, and EPUS-PSO, on most of the functions.


Table 6: Comparison of the optimization performance of different k values for HGGWA (each cell gives Mean / Std).

Function | k = 0.5 | k = 1 | k = 1.5 | k = 2
f1 | 8.43E-60 / 5.36E-60 | 2.25E-54 / 2.03E-54 | 5.58E-53 / 4.31E-53 | 2.31E-51 / 3.16E-51
f2 | 2.89E-34 / 2.31E-34 | 2.06E-33 / 1.89E-33 | 4.70E-32 / 3.65E-32 | 5.76E-30 / 4.22E-30
f3 | 1.11E+01 / 1.02E+00 | 3.14E+01 / 2.19E+01 | 4.03E-01 / 2.38E-01 | 3.60E+01 / 2.16E+01
f4 | 9.78E+01 / 8.36E+01 | 9.82E+01 / 7.43E+01 | 9.81E+01 / 7.34E+01 | 9.83E+01 / 4.61E+01
f5 | 2.18E-03 / 1.34E-03 | 8.87E-02 / 6.67E-02 | 7.81E+00 / 6.13E+00 | 1.74E-02 / 2.34E-02
f6 | 1.31E-03 / 1.13E-03 | 1.62E-03 / 2.43E-04 | 2.99E-03 / 1.04E-03 | 1.84E-03 / 1.34E-03
f7 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00
f8 | 1.58E-14 / 1.23E-15 | 2.93E-14 / 1.37E-14 | 2.22E-14 / 1.49E-14 | 2.51E-14 / 1.08E-14
f9 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00
f10 | 2.99E-02 / 2.49E-02 | 3.38E-02 / 1.24E-03 | 8.42E-02 / 6.14E-02 | 7.29E-02 / 8.13E-03


(3) An analysis of the running times of several algorithms shows that the convergence time of HGGWA has clear advantages over recently proposed algorithms such as SSA and WOA. For functions f1, f2, f6, f7, f8, f9, and f10, the running time of HGGWA stays between 1 s and 3 s under the specified convergence accuracy, which shows the excellent stability of HGGWA.

(4) In future work, we plan to investigate more efficient population partitioning mechanisms to adapt to different nonseparable problems during the crossover operation. We are also interested in applying HGGWA to real-world problems to ascertain its true potential as a valuable optimization technique for large-scale optimization, such as the setup of large-scale multilayer sensor networks [73] in the Internet of Things.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Authorsrsquo Contributions

Qinghua Gu and Xuexian Li contributed equally to this work.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Grant Nos. 51774228 and 51404182), the Natural Science Foundation of Shaanxi Province (2017JM5043), and the Foundation of the Shaanxi Educational Committee (17JK0425).

References

[1] R. Zhang and C. Wu, "A hybrid approach to large-scale job shop scheduling," Applied Intelligence, vol. 32, no. 1, pp. 47–59, 2010.

[2] M. Gendreau and C. D. Tarantilis, Solving Large-Scale Vehicle Routing Problems with Time Windows: The State-of-the-Art, Cirrelt, Montreal, 2010.

[3] M. B. Liu, S. K. Tso, and Y. Cheng, "An extended nonlinear primal-dual interior-point algorithm for reactive-power optimization of large-scale power systems with discrete control variables," IEEE Power Engineering Review, vol. 22, no. 9, p. 56, 2002.

[4] I. Fister, I. Fister Jr., J. Brest, and V. Zumer, "Memetic artificial bee colony algorithm for large-scale global optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '12), pp. 1–8, IEEE, Brisbane, Australia, June 2012.

[5] A. Zamuda, J. Brest, B. Bosovic, and V. Zumer, "Large scale global optimization using differential evolution with self-adaptation and cooperative co-evolution," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), IEEE, June 2008.

[6] T. Weise, R. Chiong, and K. Tang, "Evolutionary optimization: pitfalls and booby traps," Journal of Computer Science and Technology, vol. 27, no. 5, pp. 907–936, 2012.

[7] L. Lafleur, Descartes' Discourse on Method, Pearson Schweiz Ag, Zug, Switzerland, 1956.

[8] Y.-S. Ong, M. H. Lim, and X. Chen, "Memetic computation - past, present & future [research frontier]," IEEE Computational Intelligence Magazine, vol. 5, no. 2, pp. 24–31, 2010.

[9] N. Krasnogor and J. Smith, "A tutorial for competent memetic algorithms: model, taxonomy, and design issues," IEEE Transactions on Evolutionary Computation, vol. 9, no. 5, pp. 474–488, 2005.

[10] S. Jiang, M. Lian, C. Lu, Q. Gu, S. Ruan, and X. Xie, "Ensemble prediction algorithm of anomaly monitoring based on big data analysis platform of open-pit mine slope," Complexity, vol. 2018, Article ID 1048756, 13 pages, 2018.

[11] R. S. Rahnamayan, H. R. Tizhoosh, and M. M. A. Salama, "Opposition-based differential evolution," IEEE Transactions on Evolutionary Computation, vol. 12, no. 1, pp. 64–79, 2008.

[12] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.

[13] E. Daniel, J. Anitha, K. K. Kamaleshwaran, and I. Rani, "Optimum spectrum mask based medical image fusion using gray wolf optimization," Biomedical Signal Processing and Control, vol. 34, pp. 36–43, 2017.

[14] A. A. M. El-Gaafary, Y. S. Mohamed, A. M. Hemeida, and A.-A. A. Mohamed, "Grey wolf optimization for multi input multi output system," Universal Journal of Communications and Network, vol. 3, no. 1, pp. 1–6, 2015.

[15] G. M. Komaki and V. Kayvanfar, "Grey wolf optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109–120, 2015.

[16] S. Jiang, M. Lian, C. Lu et al., "SVM-DS fusion based soft fault detection and diagnosis in solar water heaters," Energy Exploration & Exploitation, 2018.

[17] Q. Gu, X. Li, C. Lu et al., "Hybrid genetic grey wolf algorithm for high dimensional complex function optimization," Control and Decision, pp. 1–8, 2019.

[18] M. A. Potter and K. A. D. Jong, "A cooperative coevolutionary approach to function optimization," in Parallel Problem Solving from Nature - PPSN III, vol. 866 of Lecture Notes in Computer Science, pp. 249–257, Springer, Berlin, Germany, 1994.

[19] M. A. Potter, The Design and Analysis of a Computational Model of Cooperative Coevolution, George Mason University, Fairfax, Va, USA, 1997.

[20] F. van den Bergh and A. P. Engelbrecht, "A cooperative approach to particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225–239, 2004.

[21] M. El-Abd, "A cooperative approach to the artificial bee colony algorithm," in Proceedings of the 2010 IEEE Congress on Evolutionary Computation (CEC 2010), IEEE, Spain, July 2010.

[22] Y. Shi, H. Teng, and Z. Li, "Cooperative co-evolutionary differential evolution for function optimization," in Proceedings of the 1st International Conference on Natural Computation (ICNC '05), Lecture Notes in Computer Science, pp. 1080–1088, August 2005.

[23] Z. Yang, K. Tang, and X. Yao, "Large scale evolutionary optimization using cooperative coevolution," Information Sciences, vol. 178, no. 15, pp. 2985–2999, 2008.

[24] X. Li and X. Yao, "Tackling high dimensional nonseparable optimization problems by cooperatively coevolving particle swarms," in Proceedings of the Congress on Evolutionary Computation (CEC '09), pp. 1546–1553, IEEE, Trondheim, Norway, May 2009.

[25] M. N. Omidvar, X. Li, Z. Yang, and X. Yao, "Cooperative co-evolution for large scale optimization through more frequent random grouping," in Proceedings of the 2010 IEEE Congress on Evolutionary Computation (CEC 2010), Spain, July 2010.

[26] T. Ray and X. Yao, "A cooperative coevolutionary algorithm with correlation based adaptive variable partitioning," in Proceedings of the 2009 IEEE Congress on Evolutionary Computation (CEC), pp. 983–989, IEEE, Trondheim, Norway, May 2009.

[27] W. Chen, T. Weise, Z. Yang, and K. Tang, "Large-scale global optimization using cooperative coevolution with variable interaction learning," in Lecture Notes in Computer Science, vol. 6239, pp. 300–309, 2010.

[28] M. N. Omidvar, X. Li, Y. Mei, and X. Yao, "Cooperative co-evolution with differential grouping for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 18, no. 3, pp. 378–393, 2014.

[29] M. N. Omidvar, X. Li, and X. Yao, "Smart use of computational resources based on contribution for cooperative co-evolutionary algorithms," in Proceedings of the 13th Annual Genetic and Evolutionary Computation Conference (GECCO '11), pp. 1115–1122, IEEE, Ireland, July 2011.

[30] R. Storn, "On the usage of differential evolution for function optimization," in Proceedings of the Biennial Conference of the North American Fuzzy Information Processing Society (NAFIPS '96), pp. 519–523, June 1996.

[31] X. Li and X. Yao, "Cooperatively coevolving particle swarms for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 16, no. 2, pp. 210–224, 2012.

[32] K. Weicker and N. Weicker, "On the improvement of coevolutionary optimizers by learning variable interdependencies," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC 1999), pp. 1627–1632, IEEE, July 1999.

[33] E. Sayed, D. Essam, and R. Sarker, "Using hybrid dependency identification with a memetic algorithm for large scale optimization problems," in Lecture Notes in Computer Science, vol. 7673, pp. 168–177, 2012.

[34] Y. Liu, X. Yao, Q. Zhao, and T. Higuchi, "Scaling up fast evolutionary programming with cooperative coevolution," in Proceedings of the 2001 Congress on Evolutionary Computation, pp. 1101–1108, IEEE, Seoul, South Korea, 2001.

[35] E. Sayed, D. Essam, and R. Sarker, "Dependency identification technique for large scale optimization problems," in Proceedings of the 2012 IEEE Congress on Evolutionary Computation (CEC 2012), Australia, June 2012.

[36] Y. Ren and Y. Wu, "An efficient algorithm for high-dimensional function optimization," Soft Computing, vol. 17, no. 6, pp. 995–1004, 2013.

[37] J. Liu and K. Tang, "Scaling up covariance matrix adaptation evolution strategy using cooperative coevolution," in Proceedings of the International Conference on Intelligent Data Engineering and Automated Learning (IDEAL 2013), vol. 8206 of Lecture Notes in Computer Science, pp. 350–357, Springer, Berlin, Germany, 2013.

[38] Z. Yang, K. Tang, and X. Yao, "Multilevel cooperative coevolution for large scale optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), pp. 1663–1670, June 2008.

[39] M. N. Omidvar, Y. Mei, and X. Li, "Effective decomposition of large-scale separable continuous functions for cooperative co-evolutionary algorithms," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC 2014), pp. 1305–1312, China, July 2014.

[40] S. Mahdavi, E. M. Shiri, and S. Rahnamayan, "Cooperative co-evolution with a new decomposition method for large-scale optimization," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC), 2014.

[41] B. Kazimipour, M. N. Omidvar, X. Li, and A. K. Qin, "A sensitivity analysis of contribution-based cooperative co-evolutionary algorithms," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2015), pp. 417–424, Japan, May 2015.

[42] Z. Cao, L. Wang, Y. Shi et al., "An effective cooperative coevolution framework integrating global and local search for large scale optimization problems," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2015), pp. 1986–1993, Japan, May 2015.

[43] X. Peng, K. Liu, and Y. Jin, "A dynamic optimization approach to the design of cooperative co-evolutionary algorithms," Knowledge-Based Systems, vol. 109, pp. 174–186, 2016.

[44] H. K. Singh and T. Ray, "Divide and conquer in coevolution: a difficult balancing act," in Agent-Based Evolutionary Search, vol. 5 of Adaptation, Learning, and Optimization, pp. 117–138, Springer, Berlin, Germany, 2010.

[45] X. Peng and Y. Wu, "Large-scale cooperative co-evolution using niching-based multi-modal optimization and adaptive fast clustering," Swarm & Evolutionary Computation, vol. 35, 2017.

[46] M. Yang, M. N. Omidvar, C. Li et al., "Efficient resource allocation in cooperative co-evolution for large-scale global optimization," IEEE Transactions on Evolutionary Computation, vol. 21, no. 4, pp. 493–505, 2017.

[47] M. N. Omidvar, X. Li, and X. Yao, "Cooperative co-evolution with delta grouping for large scale non-separable function optimization," in Proceedings of the 2010 IEEE Congress on Evolutionary Computation (CEC 2010), pp. 1–8, Barcelona, Spain, July 2010.

[48] X. Peng and Y. Wu, "Enhancing cooperative coevolution with selective multiple populations for large-scale global optimization," Complexity, vol. 2018, Article ID 9267054, 15 pages, 2018.

[49] S.-T. Hsieh, T.-Y. Sun, C.-C. Liu, and S.-J. Tsai, "Solving large scale global optimization using improved particle swarm optimizer," in Proceedings of the 2008 IEEE Congress on Evolutionary Computation (CEC 2008), pp. 1777–1784, China, June 2008.

[50] T. Hatanaka, T. Korenaga, N. Kondo et al., Search Performance Improvement for PSO in High Dimensional Space, InTech, 2009.

[51] J. Fan, J. Wang, and M. Han, "Cooperative coevolution for large-scale optimization based on kernel fuzzy clustering and variable trust region methods," IEEE Transactions on Fuzzy Systems, vol. 22, no. 4, pp. 829–839, 2014.

[52] A.-R. Hedar and A. F. Ali, "Genetic algorithm with population partitioning and space reduction for high dimensional problems," in Proceedings of the 2009 International Conference on Computer Engineering and Systems (ICCES '09), pp. 151–156, Egypt, December 2009.

[53] J. Q. Zhang and A. C. Sanderson, "JADE: adaptive differential evolution with optional external archive," IEEE Transactions on Evolutionary Computation, vol. 13, no. 5, pp. 945–958, 2009.

[54] L. M. Hvattum and F. Glover, "Finding local optima of high-dimensional functions using direct search methods," European Journal of Operational Research, vol. 195, no. 1, pp. 31–45, 2009.

[55] C. Liu and B. Li, "Memetic algorithm with adaptive local search depth for large scale global optimization," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC 2014), pp. 82–88, China, July 2014.

[56] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257–1263, 2015.

[57] A. Zhu, C. Xu, Z. Li, J. Wu, and Z. Liu, "Hybridizing grey wolf optimization with differential evolution for global optimization and test scheduling for 3D stacked SoC," Journal of Systems Engineering and Electronics, vol. 26, no. 2, pp. 317–328, 2015.

[58] V. Chahar and D. Kumar, "An astrophysics-inspired grey wolf algorithm for numerical optimization and its application to engineering design problems," Advances in Engineering Software, vol. 112, pp. 231–254, 2017.

[59] D. Guha, P. K. Roy, and S. Banerjee, "Load frequency control of large scale power system using quasi-oppositional grey wolf optimization algorithm," Engineering Science and Technology, an International Journal, vol. 19, no. 4, pp. 1693–1713, 2016.

[60] C. Lu, S. Xiao, X. Li, and L. Gao, "An effective multi-objective discrete grey wolf optimizer for a real-world scheduling problem in welding production," Advances in Engineering Software, vol. 99, pp. 161–176, 2016.

[61] E. Emary, H. M. Zawbaa, and A. E. Hassanien, "Binary grey wolf optimization approaches for feature selection," Neurocomputing, vol. 172, pp. 371–381, 2016.

[62] W. Long, J. Jiao, X. Liang, and M. Tang, "Inspired grey wolf optimizer for solving large-scale function optimization problems," Applied Mathematical Modelling: Simulation and Computation for Engineering and Environmental Systems, vol. 60, pp. 112–126, 2018.

[63] S. Gupta and K. Deep, "Performance of grey wolf optimizer on large scale problems," in Proceedings of the Mathematical Sciences and Its Applications, American Institute of Physics Conference Series, Uttar Pradesh, India, 2017.

[64] R. L. Haupt and S. E. Haupt, "Practical genetic algorithms," Journal of the American Statistical Association, vol. 100, no. 100, p. 253, 2005.

[65] H. R. Tizhoosh, "Opposition-based learning: a new scheme for machine intelligence," in Proceedings of the International Conference on Computational Intelligence for Modelling, Control and Automation (CIMCA) and International Conference on Intelligent Agents, Web Technologies and Internet Commerce (IAWTIC '05), pp. 695–701, Vienna, Austria, November 2005.

[66] H. Wang, Z. Wu, S. Rahnamayan, Y. Liu, and M. Ventresca, "Enhancing particle swarm optimization using generalized opposition-based learning," Information Sciences, vol. 181, no. 20, pp. 4699–4714, 2011.

[67] H. Wang, S. Rahnamayan, and Z. Wu, "Parallel differential evolution with self-adapting control parameters and generalized opposition-based learning for solving high-dimensional optimization problems," Journal of Parallel and Distributed Computing, vol. 73, no. 1, pp. 62–73, 2013.

[68] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.

[69] A. P. Engelbrecht, Fundamentals of Computational Swarm Intelligence, John Wiley & Sons, 2006.

[70] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.

[71] S. Mirjalili, A. H. Gandomi, S. Z. Mirjalili, S. Saremi, H. Faris, and S. M. Mirjalili, "Salp Swarm Algorithm: a bio-inspired optimizer for engineering design problems," Advances in Engineering Software, vol. 114, pp. 163–191, 2017.

[72] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80–98, 2015.

[73] Q. Gu, S. Jiang, M. Lian, and C. Lu, "Health and safety situation awareness model and emergency management based on multi-sensor signal fusion," IEEE Access, vol. 7, pp. 958–968, 2019.


Page 7: Hybrid Genetic Grey Wolf Algorithm for Large-Scale Global ...downloads.hindawi.com/journals/complexity/2019/2653512.pdf · metaheuristic algorithms such as genetic algorithm (GA),

Complexity 7

the whole population shrinks the search range thus anencirclement is formed around the prey to complete the finalattacking against the prey which corresponds to the localexploitation process of the algorithm The whole process hasa positive effect on balancing global search and local searchwhich is beneficial to improve the accuracy of the solutionand further accelerate the convergence speed of the algo-rithm

43 Selection Operation Based on Optimal Retention Forthe intelligent optimization algorithm based on populationevolution the evolution of each generation of populationdirectly affects the optimization effect of the algorithm Inorder to inherit the superior individuals in the paternalpopulation to the next generation without being destroyedit is necessary to preserve the good individuals directly Theoptimal retention selection strategy is an effective way topreserve good individuals in genetic algorithms This paperintegrates them into the GWO to improve the efficiencyof the algorithm Assuming that the current population isP(1198831 1198832 119883119863) the fitness of the individual 119883119894 is 119865119894 Thespecific operation is as follows

(a) Calculate the fitness 119865119894 (i = 1 2 D) of eachindividual and arrange them in descending order

(b) Selecting the individuals with the highest fitnessto copy directly into the next generation of popula-tion

(c) Calculate the total fitness S of the remaining indi-viduals and the probability 119901119894 that each individual isselected

S = 119863minus1sum119894=1

119865119894 (119894 = 1 2 119863 minus 1) (13)

119901119894 = 119865119894sum119863minus1119894=1 119865119894 (119894 = 1 2 119863 minus 1) (14)

(d) Calculate the cumulative fitness value 119904119894 for eachindividual and then the selection operation is performed inthemanner of the bet roulette until the number of individualsin the children population is consistent with the parentpopulation

119904119894 = sum119894119894=1 119865119894S(119894 = 1 2 119863 minus 1) (15)

44 Crossover Operation Based on Population Partition-ing Mechanism Due to the decrease of population diver-sity in the late evolution the GWO algorithm is easy tofall into local optimum for solving the large-scale high-dimensional optimization problem in order to overcomethe problems caused by large-scale and complexity andto ensure that the algorithm searches for all solutionspaces

In the HGGWA algorithm the whole population P isdivided into 120592 times 120578 subpopulations 119875119894119895 (119894 = 1 sdot sdot sdot 120592 119895 =1 120578) The optimal size of each subpopulation 119875119894119895 wastested to be 5 times 5 Individuals of each subpopulation were

cross-operated to increase the diversity of the populationThespecific division method is shown below

The initial population 119875[[[[[[[

11990911 11990912 sdot sdot sdot 119909111988911990921 11990922 sdot sdot sdot 1199092119889 d1199091198751 1199091198752 sdot sdot sdot 119909119875119889

]]]]]]]Subpopulations 119875119894119895

[[[[[

11990911 sdot sdot sdot 11990915 d11990951 sdot sdot sdot 11990955]]]]]sdot sdot sdot [[[[[

1199091(119889minus4) sdot sdot sdot 1199091119889 d1199095(119889minus4) sdot sdot sdot 1199095119889]]]]]11987511 sdot sdot sdot 1198751120578

[[[[[

119909(119875minus4)1 sdot sdot sdot 119909(119875minus4)5 d1199091198751 sdot sdot sdot 1199091198755

]]]]]sdot sdot sdot [[[[[

119909(119875minus4)(119889minus4) sdot sdot sdot 119909(119875minus4)119889 d119909119875(119889minus4) sdot sdot sdot 119909119875119889

]]]]]119875]1 sdot sdot sdot 119875]120578

(16)

In genetic algorithms cross-operation has a very impor-tant role and is the main way to generate new individuals Inthe improved algorithm of this paper each individual in thesubpopulation is cross-operated in a linear crossover mannerGenerating a corresponding random number 119903119894 isin(01) foreach individual 119909119894 in the subpopulation When the randomnumber 119903119894 is less than the crossover probability 119875119888 thecorresponding individual 119909119894 is paired for cross-operation

An example is as follows Crossover (p1 p2)(a) Generate a random number 120582 isin (0 1)(b) Two children 1198881(11988811 1198881119863) 1198882(11988821 1198882119863) are gener-

ated by two parents 1199011(11990111 1199011119863) and 1199012(11990121 1199012119863)1198881119894 = 1205821199011119894 + (1 minus 120582) 1199012119894 119894 = 1 2 119863 (17)

1198882119894 = 1205821199012119894 + (1 minus 120582) 1199011119894 119894 = 1 2 119863 (18)

45 Mutation Operation for Elite Individuals Due to theexistence of the selection operation based on the optimalpreservation strategy the grey wolf individuals in the wholepopulation are concentrated in a small optimal region inthe later stage of the iterative process which easily leadsto the loss of population diversity If the current optimalindividual is a locally optimal solution then the algorithm iseasy to fall into local optimum especially when solving high-dimensional multimodal functions To this end this paperintroduces the mutation operator in the HGGWA algorithmto perform mutation operations on elite individuals in thepopulation The specific operations are as follows

Assume that the optimal individual is 119909119894(1199091 1199092 119909119889)and the mutation operation is performed on 119909119894 with themutation probability 119875119898 That is select a gene 119909119896 from theoptimal individual with probability 119875119898 instead of the gene 119909119896with a random number between upper and lower bounds to

8 Complexity

(1) Initialize parameters a A C population size N 119875119888 and 119875119898(2) Initialize population 119875 using OBL the strategy(3) t=0(4) While tltMa119909119894119905119890119903(5) Calculate the fitness of all search agents(6) 119883120572= the best search agent(7) 119883120573= the second best search agent(8) 119883120575= the third best search agent(9) for i = 1N(10) Update the position of all search agents by equation (11)(11) end for(12) newP1larr997888P except the119883120572(13) for i = 1N-1(14) Generate newP2 by the Roulette Wheel Selection on newP1(15) end for(16) Plarr997888newP2 and119883120572(17) Generate subPs by the population partitioning mechanism on P(18) Select the individuals by crossover probability 119875119888(19) for i = 1Nlowast119875119888(20) Crossover each search agent by equation (17) and (18)(21) end for(22) Generate the1198831015840120572 1198831015840120573 and1198831015840120575 by equation (19) on mutation probability 119875119898(23) t = t+1(24) end while(25) Return 119883120572Algorithm 1 Pseudo code of the HGGWA

generate a new individual 1199091015840119894 = (11990910158401 11990910158402 1199091015840119889) The specificoperation is as follows

1199091015840119894 = 119897 + 120582 lowast (119906 minus 119897) 119894 = 119896119909119894 119894 = 119896 (19)

where 120582 is a random number in [0 1] and 119897 and 119906 are thelower and upper bounds of the individual 119909119894 respectively46 Pseudo Code of the HGGWA and Time ComplexityAnalysis This section describes how to calculate an upperbound for the total number of fitness evaluations (FE)required by HGGWA As shown in Algorithm 1 the compu-tational complexity of HGGWA in one generation is mainlydominated by the position updating of all search agents inline (10) The position updating requires a time complexityof O(Nlowastdimlowast119872119886119909119894119905119890119903)(N is the population size dim is thedimensions of each benchmark function and 119872119886119909119894119905119890119903 isthe maximum number of iterations of HGGWA) to obtainthe needed parameters and to finish the whole positionupdating progress of all search agents In addition theprocess of crossover operations is another time-consumingstep in line (20) It needs a time complexity of O(2lowast120592 lowast120578 lowast 119872119886119909119894119905119890119903)(120592 lowast 120578 is the total number of subpopulations120592=N5 120578=dim5) to complete the crossing of individuals ineach subpopulations according to the crossover probability119875119888 It should be noted that this is only the worst casecomputational complexity If there are only two individualsin each subpopulation that need to be crossed then theminimum amount of computation is O(120592 lowast 120578 lowast 119872119886119909119894119905119890119903)

Therefore the maximum computational complexity causedby the twodominant processes in the algorithm ismax 119874(Nlowast119889119894119898lowast119872119886119909119894119905119890119903) 119874(2lowast120592lowast120578lowast119872119886119909119894119905119890119903) (120592 = N5 120578 = 1198891198941198985)in one generation ie 119874(N lowast 119889119894119898 lowast119872119886119909119894119905119890119903)5 Numerical Experiments and Analysis

In this section the proposed HGGWA algorithm will beevaluated on both classical benchmark functions [68] andthe suite of test functions provided by CEC2008 SpecialSession on large-scale global optimization The algorithmsused for comparison include not only conventional EAs butalso other CC optimization algorithms Experimental resultsare provided to analyze the performance of HGGWA in thecontext of large-scale optimization problems In addition thesensitivity analysis of nonlinear adjustment coefficient andthe global convergence of HGGWA are discussed in Sections54 and 55 respectively

51 Benchmark Functions and Performance Measures Inorder to verify the ability ofHGGWAalgorithm to solve high-dimensional complex functions 10 general high-dimensionalbenchmark functions (100D 500D 1000D) were selected foroptimization test Among them 1198911 sim 1198916 are unimodalfunctions which are used to test the local searchability ofthe algorithm 1198917 sim 11989110 are multimodal functions which areused to test the global searchability of the algorithm Thesetest functions have multiple local optima points with unevendistribution nonconvexity and strong oscillation which arevery difficult to converge to the global optimal solutionespecially in the case of being high-dimensional The specific

Complexity 9

Table 2 10 high-dimensional benchmark functions

Functions Function formula Range 119891119898119894119899 Accuracy

Sphere 1198911 (119909) = 119889sum119894=1

1199092119894 [minus100 100] 0 1 times 10minus8Schwefelrsquos 222 1198912 (119909) = 119889sum

119894=1

10038161003816100381610038161199091198941003816100381610038161003816 + 119889prod119894=1

10038161003816100381610038161199091198941003816100381610038161003816 [minus10 10] 0 1 times 10minus8Schwefelrsquos 221 1198913 (119909) = max11989410038161003816100381610038161199091198941003816100381610038161003816 1 le 119894 le 119889 [minus100 100] 0 1 times 10minus2Rosenbrock 1198914 (119909) = 119889minus1sum

119894=1

[100 (119909119894+1 minus 1199092119894 )2 + (119909119894 minus 1)2] [minus30 30] 0 1 times 100Schwefelrsquos 12 1198915 (119909) = 119889sum

119894=1

( 119894sum119895=1

119909119895)2 [minus100 100] 0 1 times 10minus3Quartic 1198916 (119909) = 119889sum

119894=1

1198941199094119894 + 119903119886119899119889119900119898[0 1) [minus128 128] 0 1 times 10minus4Rastrigin 1198917 (119909) = 119889sum

119894=1

[1199092119894 minus 10 cos (2120587119909119894) + 10] [minus512 512] 0 1 times 10minus8Ackley 1198918 = minus20 exp(minus02radic 1119889

119889sum119894=1

1199092119894) minus exp(1119889119889sum119894=1

cos (2120587119909119894)) + 20 + 119890 [minus32 32] 0 1 times 10minus5Griewank 1198919 = 14000

119889sum119894=1

1199092119894 minus 119889prod119894=1

cos( 119909119894radic119894) + 1 [minus600 600] 0 1 times 10minus5

Penalized 1

11989110 = 120587119889 10 sin (1205871199101) +119889minus1sum119894=1

(119910119894 minus 1)2 [1 + sin2 (120587119910119894+1)] + (119910119899 minus 1)2

[minus50 50] 0 1 times 10minus2+ 119889sum119894=1

119906 (119909119894 10 100 4)119910119894 = 1 + 119909119894 + 14

119906(119909119894 119886 119896119898)

119896(119909119894 minus 119886)119898 119909119894 gt 1198860119896(minus119909119894 minus 119886)119898 119909119894 lt minus119886

characteristics of the 10 benchmark functions are shown inTable 2

In order to show the optimization effect of the algorithmmore clearly this paper selects two indicators to evaluate theperformance of the algorithm [69] The first is the solutionaccuracy (AC) which reflects the difference between theoptimal results of the algorithm and the theoretical optimalvalue In an experiment if the final convergence result ofthe algorithm is less than the AC then the optimization isconsidered successful Assuming that the optimal value of acertain optimization is119883119900119901119905 and the theoretical optimal valueis119883119898119894119899 then the AC is calculated as follows

119860119862 = 10038161003816100381610038161003816119883119900119901119905 minus 11988311989811989411989910038161003816100381610038161003816 (20)

The second is the successful ratio (SR) which reflectsthe proportion of the number of successful optimizationsto the total number of optimizations under the conditionthat the algorithm has a certain accuracy Assume that inthe 119879 iterations of the algorithm the number of successfuloptimizations is 119905 then the SR is calculated as follows

119878119877 = 119905119879 times 100 (21)

52 Results on Classical Benchmark Functions In this sec-tion a set of simulation tests were performed in order toverify the effectiveness of HGGWA Firstly the 10 high-dimensional benchmark functions in Table 2 are optimizedbyHGGWAalgorithm proposed in this paper and the resultsare compared with WOA SSA and ALO In addition therunning time of several algorithms is compared under thegiven convergence accuracy

521 Parameters Settings for the Compared Algorithms In allexperiments the values of the common parameters such asthe maximum number of iterations (119872119886119909119894119905119890119903) the dimensionof the functions (D) and the population sizes (N) were cho-sen the same For all test problems we focus on investigatingthe optimization performance of the proposed method onproblemswithD= 100 500 and 1000Themaximumnumberof iterations is set to 1000 and the population size is set to50 The parameter setting of WOA SSA and ALO is derivedfrom the original literature [70ndash72] The optimal parametersettings of the HGGWA algorithm proposed in this paper areshown in Table 3

For each experiment of an algorithm on a benchmarkfunction 30 independent runs are performed to obtain a

10 Complexity

Table 3 Parameter settings of the HGGWA

Parameters Definition ValueN Population size 50119875119888 Cross probability 08119875119898 Mutation probability 001119872119886119909119894119905119890119903 Maximum number of iterations 1000119896 Nonlinear adjustment coefficient 05

fair comparison among the different algorithms The optimalvalue the worst value the average value and the standarddeviation of each algorithm optimization are recorded andthen the optimization successful ratio (SR) is calculatedAll programs were coded in Matlab 2017b (Win64) andexecuted on a Lenovo computer with Intel (R) Core I5-6300HQ 8G ROM 230GHz under Windows 10 operatingsystem

522 Comparison with State-of-the-Art Metaheuristic Algo-rithms In order to verify the efficiency of the HGGWAalgorithm several state-of-the-art metaheuristic algorithmsrecently proposed are compared with their optimizationresults These algorithms are Whale Optimization Algorithm(WOA) Salp Swarm Algorithm (SSA) and Ant Lion Opti-mization Algorithm (ALO) The test was performed usingthe same 10 high-dimensional functions in Table 2 and theresults of the comparison are shown in Table 4

From Table 4 it can be known that the convergenceaccuracy and optimization successful ratio of the HGGWAalgorithm are significantly higher than the other three algo-rithms for most of the test functions In the case of 100dimensions HGGWA algorithm can converge to the globaloptimal solution for the other nine test functions at one timein addition to function 3 And the algorithm has a successfulratio of 100 for the nine functions under the condition ofcertain convergence accuracy The convergence result of theWOA algorithm is better than the other three algorithms forfunctions 1198911 and 1198912 and its robustness is excellent With theincrease of function dimension the convergence precisionof several algorithms decreases slightly but the optimizationresults of HGGWA algorithm are better than those of WOASSA and ALO In general the HGGWA algorithm exhibitsbetter optimization effects than the WOA SSA and ALOalgorithms in the 100- 500- and 1000-dimensional testfunctions which proves the effectiveness of the HGGWAalgorithm for solving high-dimensional complex functions

Generally speaking the HGGWA algorithm performsbetter than other algorithms in most test functions Inorder to compare the convergence performance of the fouralgorithms more clearly the convergence curves of the fouralgorithms on Sphere Schwefelrsquos 222 Rastrigin GriewankQuartic and Ackley functions with D=100 are shown inFigure 2 Sphere and Schwefelrsquos 222 represent the two mostbasic unimodal functions Rastrigin and Griewank representthe two most basic multimodal functions It is seen thatHGGWA has higher precision and faster convergence speedthan other algorithms

523 Comparative Analysis of Running Time To evaluate theactual runtime of the several compared algorithms includingSSAWOAGWO andHGGWA their average running times(in seconds s) from 30 runs on function 1198911 1198912 1198916 1198917 1198918 1198919and 11989110 are plotted in Figure 3 The convergence accuracy ofeach test function has been described in Table 2

Figure 3 shows the convergence behavior of differentalgorithms Each point on the plot was calculated by takingthe average of 30 independent runs As can be seen fromFigure 3 several algorithms show different effects under thesame calculation accuracy For the functions 1198911 1198912 1198917 11989181198919 the GWO algorithm has the fastest convergence rateas a result of the advantages of the GWO algorithm itselfHowever for the test functions 1198916 and 11989110 the runningtime of the HGGWA algorithm is the shortest Throughfurther analysis it is known that HGGWA algorithm adds alittle computing time compared to GWO algorithm becauseof several improved strategies such as selection crossoverand mutation However the convergence speed of HGGWAalgorithm is still better than SSA and WOA algorithmsOverall the convergence speed of the HGGWA algorithmperforms better in several algorithms compared Moreoverthe running time of different test functions is very small andit is kept between 1s and 3s which shows that the HGGWAalgorithm has excellent stability

53 Results on CECrsquo2008 Benchmark Functions This sectionprovides an analysis of the effectiveness of HGGWA in termsof the CECrsquo2008 large-scale benchmark functions and acomparison with four powerful algorithms which have beenproven to be effective in solving large-scale optimizationproblems Experimental results are provided to analyze theperformance of HGGWA for large-scale optimization prob-lems

We tested our proposed algorithms with CECrsquo2008 func-tions for 100 500 and 1000 dimensions and the means ofthe best fitness value over 50 runs were recorded WhileF1 (Shifted Sphere) F4 (Shifted Rastrigin) and F6 (ShiftedAckley) are separable functions F2 (Schwefel Problem) F3(Shifted Rosenbrock) F5 (Shifted Griewank) and F7 (FastFractal) are nonseparable presenting a greater challenge toany algorithm that is prone to variable interactions Theperformance of HGGWA is compared with other algorithmsCCPSO2 [31] DEwSAcc [5] MLCC [38] and EPUS-PSO[49] for 100 500 and 1000 dimensions respectively Themaximum number of fitness evaluations (FEs) was calculatedby the following formula FEs = 5000timesD where D is thenumber of dimensions In order to reflect the fairness of thecomparison results the optimization results of the other fouralgorithms are directly derived from the original literatureThe specific comparison results are shown in Table 5

Table 5 shows the experimental results and the entriesshown in bold are significantly better than other algorithmsA general trend that can be seen is that HGGWA algorithmhas a promising optimization effect for most of the 7 func-tions

Generally speaking HGGWAoutperforms CCPSO2 on 6out of 7 functions with 100 and 500 dimensions respectivelyAmong them the result of HGGWA is slightly worse than

Table 4: Comparison results of 10 high-dimensional benchmark functions by WOA, SSA, ALO, and HGGWA. Each cell gives Mean (Std), SR, where SR is the success ratio (%).

Function | Dim  | WOA                      | SSA                      | ALO                      | HGGWA
f1       | 100  | 5.77E-96 (4.69E-96), 100 | 1.01E-02 (1.24E-02), 0   | 3.04E-01 (2.31E-01), 0   | 4.85E-53 (2.41E-52), 100
f1       | 500  | 7.24E-93 (2.98E-94), 100 | 3.24E+04 (2.18E+04), 0   | 7.34E+04 (1.32E+04), 0   | 8.35E-25 (8.86E-25), 100
f1       | 1000 | 2.05E-89 (3.16E-90), 100 | 1.19E+05 (1.04E+05), 0   | 2.48E+05 (2.12E+05), 0   | 1.75E-17 (2.73E-17), 100
f2       | 100  | 1.89E-99 (2.33E-99), 100 | 8.85E+00 (2.37E+00), 0   | 4.29E+02 (3.25E+02), 0   | 4.01E-33 (4.34E-33), 100
f2       | 500  | 1.38E-99 (2.16E-99), 100 | 3.37E+02 (2.17E+02), 0   | 2.31E+03 (2.03E+03), 0   | 9.58E-18 (2.36E-18), 100
f2       | 1000 | 1.80E-99 (1.37E-98), 100 | 8.45E+02 (3.43E+02), 0   | 3.25E+04 (4.13E+04), 0   | 2.50E-11 (1.66E-11), 100
f3       | 100  | 3.84E+01 (2.17E+00), 0   | 2.22E+01 (1.05E+01), 0   | 2.47E+01 (1.34E+01), 0   | 4.55E+01 (3.69E+01), 0
f3       | 500  | 9.32E+01 (5.32E+01), 0   | 3.27E+01 (2.33E+01), 0   | 3.92E+01 (2.34E+01), 0   | 7.22E+01 (5.01E+00), 0
f3       | 1000 | 9.86E+01 (3.24E+01), 0   | 3.31E+01 (1.28E+01), 0   | 4.93E+01 (3.99E+01), 0   | 8.31E+01 (4.36E+00), 0
f4       | 100  | 9.67E+01 (3.98E+01), 0   | 9.00E+02 (5.16E+02), 0   | 1.40E+03 (1.03E+03), 0   | 6.82E+01 (1.25E-01), 100
f4       | 500  | 4.95E+02 (3.14E+01), 0   | 7.18E+06 (1.46E+06), 0   | 2.09E+07 (1.37E+07), 0   | 3.67E+02 (8.23E-01), 0
f4       | 1000 | 9.91E+02 (2.48E+02), 0   | 2.98E+07 (1.35E+07), 0   | 1.52E+08 (1.00E+07), 0   | 9.64E+02 (3.74E+01), 0
f5       | 100  | 7.83E+05 (2.36E+05), 0   | 2.17E+04 (1.03E+04), 0   | 3.60E+04 (2.16E+04), 0   | 3.45E-05 (2.77E-05), 100
f5       | 500  | 1.98E+07 (1.32E+07), 0   | 4.29E+05 (3.16E+05), 0   | 1.18E+06 (1.03E+06), 0   | 2.67E-03 (4.38E-03), 80
f5       | 1000 | 6.68E+07 (3.44E+07), 0   | 2.48E+06 (1.23E+06), 0   | 3.72E+06 (3.49E+06), 0   | 8.33E+00 (5.12E+00), 0
f6       | 100  | 2.65E-04 (1.43E-04), 100 | 1.12E+00 (1.03E+00), 0   | 8.28E-01 (2.68E-01), 0   | 2.34E-09 (3.17E-10), 100
f6       | 500  | 3.94E-04 (3.22E-04), 60  | 4.97E+01 (3.66E+01), 0   | 1.13E+02 (1.01E+02), 0   | 5.73E-05 (4.33E-06), 100
f6       | 1000 | 2.16E-03 (1.36E-03), 0   | 3.93E+02 (2.78E+02), 0   | 1.77E+03 (2.33E+03), 0   | 1.08E-04 (1.62E-03), 80
f7       | 100  | 0.00E+00 (0.00E+00), 100 | 6.27E+01 (4.36E+01), 0   | 3.54E+02 (3.32E+02), 0   | 0.00E+00 (0.00E+00), 100
f7       | 500  | 0.00E+00 (0.00E+00), 100 | 2.04E+03 (1.97E+03), 0   | 3.34E+03 (4.56E+03), 0   | 0.00E+00 (0.00E+00), 100
f7       | 1000 | 0.00E+00 (0.00E+00), 100 | 5.69E+03 (1.66E+03), 0   | 7.19E+03 (2.14E+03), 0   | 5.57E-11 (3.15E-12), 100
f8       | 100  | 4.44E-15 (3.12E-15), 100 | 5.15E+00 (4.13E+00), 0   | 4.93E+00 (3.49E+00), 0   | 2.15E-15 (3.48E-15), 100
f8       | 500  | 4.44E-15 (3.05E-15), 100 | 1.26E+01 (1.00E+01), 0   | 1.28E+01 (2.28E+01), 0   | 8.74E-12 (5.39E-12), 100
f8       | 1000 | 8.88E-16 (2.88E-15), 100 | 1.28E+01 (1.44E+01), 0   | 1.46E+01 (2.03E+01), 0   | 1.59E-07 (1.63E-07), 100
f9       | 100  | 0.00E+00 (0.00E+00), 100 | 1.52E-01 (2.44E-01), 0   | 4.54E-01 (3.16E-01), 0   | 0.00E+00 (0.00E+00), 100
f9       | 500  | 0.00E+00 (0.00E+00), 100 | 2.82E+02 (2.13E+02), 0   | 3.92E+02 (1.08E+02), 0   | 1.84E-16 (8.44E-17), 100
f9       | 1000 | 0.00E+00 (0.00E+00), 100 | 1.10E+03 (1.34E+03), 0   | 3.61E+03 (1.17E+03), 0   | 2.31E-13 (3.29E-14), 100
f10      | 100  | 8.16E-03 (4.32E-03), 100 | 1.10E+01 (2.19E+01), 0   | 1.74E+01 (1.33E+01), 0   | 2.73E-07 (3.11E-07), 100
f10      | 500  | 2.52E-02 (1.99E-02), 80  | 5.16E+01 (3.33E+01), 0   | 3.97E+05 (2.96E+05), 0   | 8.12E-05 (4.98E-06), 100
f10      | 1000 | 1.43E-02 (1.03E-02), 80  | 1.00E+05 (2.13E+04), 0   | 6.96E+06 (1.34E+06), 0   | 6.33E-04 (3.77E-03), 100

[Figure 2: Convergence curves of the WOA, SSA, ALO, and HGGWA algorithms (objective function value on a log scale vs. number of iterations; D = 100). Panels: (a) Sphere function, (b) Schwefel 2.22 function, (c) Rastrigin function, (d) Griewank function, (e) Quartic function, (f) Ackley function.]

[Figure 3: The average running times and iterations of SSA, WOA, GWO, and HGGWA on the benchmark functions f1, f2, f6, f7, f8, f9, and f10 (D = 100). Panels: (a) comparative analysis of running time (s), (b) comparative analysis of iterations.]

Table 5: Comparison results of 7 CEC'2008 benchmark functions by five algorithms. Each cell gives Mean (Std).

Function | Dim  | CCPSO2                | DEwSAcc               | MLCC                  | EPUS-PSO              | HGGWA
F1       | 100  | 7.73E-14 (3.23E-14)   | 5.68E-14 (0.00E+00)   | 6.82E-14 (2.32E-14)   | 7.47E-01 (1.70E-01)   | 3.06E-17 (2.93E-17)
F1       | 500  | 3.00E-13 (7.96E-14)   | 2.09E-09 (4.61E-09)   | 4.29E-13 (3.31E-14)   | 8.45E+01 (6.40E+00)   | 2.75E-15 (2.13E-15)
F1       | 1000 | 5.18E-13 (9.61E-14)   | 8.78E-03 (5.27E-03)   | 8.45E-13 (5.00E-14)   | 5.53E+02 (2.86E+01)   | 1.06E-10 (1.83E-11)
F2       | 100  | 6.08E+00 (7.83E+00)   | 8.25E+00 (5.32E+00)   | 2.52E+01 (8.72E+00)   | 1.86E+01 (2.26E+00)   | 2.71E+00 (2.39E+00)
F2       | 500  | 5.79E+01 (4.21E+01)   | 7.57E+01 (3.04E+00)   | 6.66E+01 (5.69E+00)   | 4.35E+01 (5.51E-01)   | 3.12E+01 (4.91E+00)
F2       | 1000 | 7.82E+01 (4.25E+01)   | 9.60E+01 (1.81E+00)   | 1.08E+02 (4.75E+00)   | 4.66E+01 (4.00E-01)   | 2.71E+01 (5.66E+00)
F3       | 100  | 4.23E+02 (8.65E+02)   | 1.45E+02 (5.84E+01)   | 1.49E+02 (5.72E+01)   | 4.99E+03 (5.35E+03)   | 9.81E+01 (2.25E-01)
F3       | 500  | 7.24E+02 (1.54E+02)   | 1.81E+03 (2.74E+02)   | 9.24E+02 (1.72E+02)   | 5.77E+04 (8.04E+03)   | 4.96E+02 (6.23E-01)
F3       | 1000 | 1.33E+03 (2.63E+02)   | 9.14E+03 (1.26E+03)   | 1.79E+03 (1.58E+02)   | 8.37E+05 (1.52E+05)   | 9.97E+02 (3.58E+01)
F4       | 100  | 3.98E-02 (1.99E-01)   | 4.37E+00 (7.65E+00)   | 4.38E-13 (9.21E-14)   | 4.71E+02 (5.94E+01)   | 0.00E+00 (0.00E+00)
F4       | 500  | 3.98E-02 (1.99E-01)   | 3.64E+02 (5.23E+01)   | 1.79E-11 (6.31E-11)   | 3.49E+03 (1.12E+02)   | 0.00E+00 (0.00E+00)
F4       | 1000 | 1.99E-01 (4.06E-01)   | 1.82E+03 (1.37E+02)   | 1.37E-10 (3.37E-10)   | 7.58E+03 (1.51E+02)   | 4.93E-11 (2.87E-12)
F5       | 100  | 3.45E-03 (4.88E-03)   | 3.07E-14 (7.86E-15)   | 3.41E-14 (1.16E-14)   | 3.72E-01 (5.60E-02)   | 0.00E+00 (0.00E+00)
F5       | 500  | 1.18E-03 (4.61E-03)   | 6.90E-04 (2.41E-03)   | 2.12E-13 (2.47E-14)   | 1.64E+00 (4.69E-02)   | 2.16E-16 (7.56E-17)
F5       | 1000 | 1.18E-03 (3.27E-03)   | 3.58E-03 (5.73E-03)   | 4.18E-13 (2.78E-14)   | 5.89E+00 (3.91E-01)   | 5.49E-13 (2.35E-14)
F6       | 100  | 1.44E-13 (3.06E-14)   | 1.13E-13 (1.53E-14)   | 1.11E-13 (7.86E-15)   | 2.06E+00 (4.40E-01)   | 3.21E-15 (2.48E-15)
F6       | 500  | 5.34E-13 (8.61E-14)   | 4.80E-01 (5.73E-01)   | 5.34E-13 (7.01E-14)   | 6.64E+00 (4.49E-01)   | 6.74E-12 (4.39E-12)
F6       | 1000 | 1.02E-12 (1.68E-13)   | 2.29E+00 (2.98E-01)   | 1.06E-12 (7.68E-14)   | 1.89E+01 (2.49E+00)   | 2.39E-07 (2.13E-07)
F7       | 100  | -1.50E+03 (1.04E+01)  | -1.37E+03 (2.46E+01)  | -1.54E+03 (2.52E+00)  | -8.55E+02 (1.35E+01)  | -7.00E+02 (6.64E-01)
F7       | 500  | -7.23E+03 (4.16E+01)  | -5.74E+03 (1.83E+02)  | -7.43E+03 (8.03E+00)  | -3.51E+03 (2.10E+01)  | -8.83E+03 (3.54E+00)
F7       | 1000 | -1.43E+04 (8.27E+01)  | -1.05E+04 (4.18E+02)  | -1.47E+04 (1.51E+01)  | -6.62E+03 (3.18E+01)  | -3.69E+04 (2.16E+02)

Among them, the result of HGGWA is slightly worse than that of CCPSO2 for function F7 in the 100-dimensional case, and the two algorithms obtain similar optimization results for function F6 in the 500-dimensional case. In addition, at first glance it might seem that HGGWA and MLCC have similar performance; however, HGGWA achieves better convergence accuracy than MLCC under the same number of iterations. In particular, the optimization results of the HGGWA algorithm are superior to those of DEwSAcc and EPUS-PSO in all dimensions. The results of HGGWA scaled very well from 100-D to 1000-D on F1, F4, F5, and F6, outperforming DEwSAcc and EPUS-PSO in all cases.

5.4. Sensitivity Analysis of the Nonlinear Adjustment Coefficient k

This section discusses the effect of the nonlinear adjustment coefficient on the convergence performance of the algorithm. In the HGGWA algorithm, the role of the parameter a is to balance the global exploration and local exploitation capabilities. In (12), the nonlinear adjustment coefficient k is a key parameter, mainly used to control the range of variation of the convergence factor. Therefore, we selected four different values, 0.5, 1, 1.5, and 2, for numerical experiments to analyze the effect of the nonlinear adjustment coefficient on the performance of the algorithm. The results of the comparison are shown in Table 6, where the best result for each function is marked with an asterisk.

Table 6: Comparison of the optimization performance of different k values for HGGWA. Each cell gives Mean (Std); for each function, the best result is marked with *.

Function | k = 0.5                 | k = 1                   | k = 1.5                 | k = 2
f1       | 8.43E-60 (5.36E-60) *   | 2.25E-54 (2.03E-54)     | 5.58E-53 (4.31E-53)     | 2.31E-51 (3.16E-51)
f2       | 2.89E-34 (2.31E-34) *   | 2.06E-33 (1.89E-33)     | 4.70E-32 (3.65E-32)     | 5.76E-30 (4.22E-30)
f3       | 1.11E+00 (1.02E+00)     | 3.14E+01 (2.19E+01)     | 4.03E-01 (2.38E-01) *   | 3.60E+01 (2.16E+01)
f4       | 9.78E+01 (8.36E+01) *   | 9.82E+01 (7.43E+01)     | 9.81E+01 (7.34E+01)     | 9.83E+01 (4.61E+01)
f5       | 2.18E-03 (1.34E-03) *   | 8.87E-02 (6.67E-02)     | 7.81E+00 (6.13E+00)     | 1.74E-02 (2.34E-02)
f6       | 1.31E-03 (1.13E-03) *   | 1.62E-03 (2.43E-04)     | 2.99E-03 (1.04E-03)     | 1.84E-03 (1.34E-03)
f7       | 0.00E+00 (0.00E+00) *   | 0.00E+00 (0.00E+00) *   | 0.00E+00 (0.00E+00) *   | 0.00E+00 (0.00E+00) *
f8       | 1.58E-14 (1.23E-15) *   | 2.93E-14 (1.37E-14)     | 2.22E-14 (1.49E-14)     | 2.51E-14 (1.08E-14)
f9       | 0.00E+00 (0.00E+00) *   | 0.00E+00 (0.00E+00) *   | 0.00E+00 (0.00E+00) *   | 0.00E+00 (0.00E+00) *
f10      | 2.99E-02 (2.49E-02) *   | 3.38E-02 (1.24E-03)     | 8.42E-02 (6.14E-02)     | 7.29E-02 (8.13E-03)
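To visualize what k does, the short Python sketch below plots a nonlinear decay of the convergence factor a from 2 to 0 for the four tested values of k. The exact expression is given by (12) earlier in the paper; the form a(t) = 2(1 - (t/Maxiter)^k) used here is only an assumed stand-in with the same qualitative behavior.

import numpy as np
import matplotlib.pyplot as plt

# Illustrative (assumed) nonlinear schedule for the convergence factor a:
# a(t) = 2 * (1 - (t / T)**k). Eq. (12) in the paper defines the actual
# form; this stand-in only mimics its qualitative behavior (a: 2 -> 0).
T = 1000                          # Maxiter, as in the parameter settings
t = np.arange(T + 1)
for k in (0.5, 1.0, 1.5, 2.0):    # the four values tested in Table 6
    a = 2.0 * (1.0 - (t / T) ** k)
    plt.plot(t, a, label=f"k = {k}")
# k < 1 makes a shrink quickly (earlier emphasis on exploitation), while
# k > 1 keeps a large for longer (prolonged exploration); k = 1 is linear.
plt.xlabel("iteration t")
plt.ylabel("convergence factor a")
plt.legend()
plt.show()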

In general, the value of the nonlinear adjustment coefficient k has no dramatic effect on the performance of the HGGWA algorithm. On closer inspection, one can see that, except for function f3, the optimization performance of the algorithm is optimal when the nonlinear adjustment coefficient k is 0.5; for function f3, k = 1.5 is the best choice. For functions f7 and f9, the HGGWA algorithm obtains the optimal solution 0 for several different values of the coefficient k. Therefore, for most test functions, the optimal value of the nonlinear adjustment coefficient k is 0.5.

5.5. Global Convergence Analysis of HGGWA

In the HGGWA algorithm, the following four improved strategies are used to ensure the global convergence of the algorithm: (1) an initial population strategy based on Opposition-Based Learning; (2) a selection operation based on optimal retention; (3) a crossover operation based on the population partitioning mechanism; and (4) a mutation operation for elite individuals.

The idea behind solving high-dimensional function optimization problems with the HGGWA algorithm is as follows (a sketch of three of these strategies follows this paragraph). Firstly, consider the initial population strategy: the basic GWO generates the initial population in a random manner, which can strongly affect the search efficiency on high-dimensional function optimization problems, because the algorithm cannot effectively search the entire solution space if the initial grey wolf population is clustered in a small region. Therefore, the Opposition-Based Learning strategy is used to initialize the population, which makes the initial population evenly distributed in the solution space, laying a good foundation for the global search of the algorithm. Secondly, the optimal retention strategy is used to preserve the optimal individual of the parent population during population evolution, so that the next generation can evolve in the optimal direction. In addition, the individuals with higher fitness are selected by the roulette wheel operation, which maintains the dominance relationship between individuals in the population; this dominance relationship gives the algorithm good global convergence. Thirdly, the crossover operation is performed under population partitioning, in order to achieve dimension reduction and maintain the diversity of the population. Finally, the search direction of the grey wolf population is guided by the mutation operation on the elite individuals, which effectively prevents the algorithm from falling into a local optimal solution. All in all, through these four improved strategies, the HGGWA algorithm shows good global convergence in solving high-dimensional function optimization problems.
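The first, second, and fourth strategies can be summarized in a short sketch. The Python code below is a minimal illustration, not the authors' implementation: the opposite-point formula follows standard Opposition-Based Learning [65]; the roulette wheel uses an assumed score transform 1/(1 + f - f_min) to handle minimization; and the elite mutation follows the form of (19), resetting one randomly chosen dimension to l + lambda*(u - l). The partition-based crossover is omitted here because it depends on the subpopulation bookkeeping described earlier in the paper.

import numpy as np

def obl_initialize(n, dim, l, u, fitness):
    """Opposition-Based Learning initialization: generate a random
    population, form the opposite population x_opp = l + u - x, and keep
    the n fittest of the 2n candidates (minimization assumed)."""
    x = l + np.random.rand(n, dim) * (u - l)
    x_opp = l + u - x                        # opposite points
    pool = np.vstack((x, x_opp))
    f = np.apply_along_axis(fitness, 1, pool)
    return pool[np.argsort(f)[:n]]           # best-first population

def select_with_elite(pop, fitness):
    """Optimal retention + roulette wheel: the best individual survives
    unchanged, and the remaining N-1 slots are filled in proportion to a
    minimization-friendly score (1/(1 + f - f_min), an assumed transform)."""
    f = np.apply_along_axis(fitness, 1, pop)
    elite = pop[np.argmin(f)]
    score = 1.0 / (1.0 + f - f.min())        # higher score = better fitness
    prob = score / score.sum()
    idx = np.random.choice(len(pop), size=len(pop) - 1, p=prob)
    return np.vstack((elite[None, :], pop[idx]))

def elite_mutation(x, l, u, pm=0.01):
    """Eq. (19)-style mutation: with probability pm, reset one randomly
    chosen dimension k of an elite individual to l + lambda * (u - l)."""
    y = x.copy()
    if np.random.rand() < pm:
        k = np.random.randint(y.size)
        y[k] = l + np.random.rand() * (u - l)
    return y

# Example on the 100-D Sphere function with the paper's settings
# (N = 50, Pm = 0.01, search bounds [-100, 100]).
sphere = lambda z: float(np.sum(z ** 2))
pop = obl_initialize(50, 100, -100.0, 100.0, sphere)
pop = select_with_elite(pop, sphere)
alpha = elite_mutation(pop[0], -100.0, 100.0, pm=0.01)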

6. Conclusion

In order to overcome the main shortcoming of the GWO algorithm when solving large-scale global optimization problems, namely its tendency to fall into local optima, this paper proposes a Hybrid Genetic Grey Wolf Algorithm (HGGWA) that integrates three genetic operators into the algorithm. The improved algorithm initializes the population based on the Opposition-Based Learning strategy to improve the search efficiency of the algorithm. The strategy of population division based on cooperative coevolution reduces the scale of the problem, and the mutation operation on the elite individuals effectively prevents the algorithm from falling into local optima. The performance of HGGWA has been evaluated using 10 classical benchmark functions and 7 CEC'2008 high-dimensional functions. From our experimental results, several conclusions can be drawn.

The results have shown that HGGWA is capable of grouping interacting variables with great accuracy for the majority of the benchmark functions. A comparative study between HGGWA and other state-of-the-art algorithms was conducted, and the experimental results showed that the HGGWA algorithm exhibits better global convergence whether it is solving separable or nonseparable functions.

(1) In tests on 10 classical benchmark functions, compared with the results of WOA, SSA, and ALO, the HGGWA algorithm is greatly improved in convergence accuracy. Apart from the Schwefel problem and the Rosenbrock function, the HGGWA algorithm achieves an 80-100% success ratio on all benchmark functions with 100 dimensions, which proves the effectiveness of HGGWA for solving high-dimensional functions.

(2) In the comparison on the seven CEC'2008 large-scale global optimization problems, the results show that the HGGWA algorithm achieves the global optimal value for the separable functions F1, F4, and F6, while the effect on the other, nonseparable problems is not fully satisfactory. However, the HGGWA algorithm still exhibits better global convergence than the other algorithms (CCPSO2, DEwSAcc, MLCC, and EPUS-PSO) on most of the functions.


(3) The analysis of running times shows that the convergence time of the HGGWA algorithm has obvious advantages over recently proposed algorithms such as SSA and WOA. For the functions f1, f2, f6, f7, f8, f9, and f10, the running time of the HGGWA algorithm stays between 1 s and 3 s at the specified convergence accuracy, which shows the excellent stability of HGGWA.

(4) In the future, we plan to investigate more efficient population partition mechanisms to adapt to different nonseparable problems in the crossover operation. We are also interested in applying HGGWA to real-world problems to ascertain its true potential as a valuable optimization technique for large-scale optimization, such as the setup of large-scale multilayer sensor networks [73] in the Internet of Things.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Authors' Contributions

Qinghua Gu and Xuexian Li contributed equally to this work.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Grant Nos. 51774228 and 51404182), the Natural Science Foundation of Shaanxi Province (2017JM5043), and the Foundation of Shaanxi Educational Committee (17JK0425).

References

[1] R. Zhang and C. Wu, "A hybrid approach to large-scale job shop scheduling," Applied Intelligence, vol. 32, no. 1, pp. 47–59, 2010.

[2] M. Gendreau and C. D. Tarantilis, Solving Large-Scale Vehicle Routing Problems with Time Windows: The State-of-the-Art, Cirrelt, Montreal, 2010.

[3] M. B. Liu, S. K. Tso, and Y. Cheng, "An extended nonlinear primal-dual interior-point algorithm for reactive-power optimization of large-scale power systems with discrete control variables," IEEE Power Engineering Review, vol. 22, no. 9, p. 56, 2002.

[4] I. Fister, I. Fister Jr., J. Brest, and V. Zumer, "Memetic artificial bee colony algorithm for large-scale global optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '12), pp. 1–8, IEEE, Brisbane, Australia, June 2012.

[5] A. Zamuda, J. Brest, B. Bosovic, and V. Zumer, "Large scale global optimization using differential evolution with self-adaptation and cooperative co-evolution," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), IEEE, June 2008.

[6] T. Weise, R. Chiong, and K. Tang, "Evolutionary optimization: pitfalls and booby traps," Journal of Computer Science and Technology, vol. 27, no. 5, pp. 907–936, 2012.

[7] L. Lafleur, Descartes' Discourse on Method, Pearson Schweiz AG, Zug, Switzerland, 1956.

[8] Y.-S. Ong, M. H. Lim, and X. Chen, "Memetic computation - past, present & future [Research Frontier]," IEEE Computational Intelligence Magazine, vol. 5, no. 2, pp. 24–31, 2010.

[9] N. Krasnogor and J. Smith, "A tutorial for competent memetic algorithms: model, taxonomy and design issues," IEEE Transactions on Evolutionary Computation, vol. 9, no. 5, pp. 474–488, 2005.

[10] S. Jiang, M. Lian, C. Lu, Q. Gu, S. Ruan, and X. Xie, "Ensemble prediction algorithm of anomaly monitoring based on big data analysis platform of open-pit mine slope," Complexity, vol. 2018, Article ID 1048756, 13 pages, 2018.

[11] R. S. Rahnamayan, H. R. Tizhoosh, and M. M. A. Salama, "Opposition-based differential evolution," IEEE Transactions on Evolutionary Computation, vol. 12, no. 1, pp. 64–79, 2008.

[12] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.

[13] E. Daniel, J. Anitha, K. K. Kamaleshwaran, and I. Rani, "Optimum spectrum mask based medical image fusion using gray wolf optimization," Biomedical Signal Processing and Control, vol. 34, pp. 36–43, 2017.

[14] A. A. M. El-Gaafary, Y. S. Mohamed, A. M. Hemeida, and A.-A. A. Mohamed, "Grey wolf optimization for multi input multi output system," Universal Journal of Communications and Network, vol. 3, no. 1, pp. 1–6, 2015.

[15] G. M. Komaki and V. Kayvanfar, "Grey wolf optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109–120, 2015.

[16] S. Jiang, M. Lian, C. Lu et al., "SVM-DS fusion based soft fault detection and diagnosis in solar water heaters," Energy Exploration & Exploitation, 2018.

[17] Q. Gu, X. Li, C. Lu et al., "Hybrid genetic grey wolf algorithm for high dimensional complex function optimization," Control and Decision, pp. 1–8, 2019.

[18] M. A. Potter and K. A. De Jong, "A cooperative coevolutionary approach to function optimization," in Parallel Problem Solving from Nature - PPSN III, vol. 866 of Lecture Notes in Computer Science, pp. 249–257, Springer, Berlin, Germany, 1994.

[19] M. A. Potter, The Design and Analysis of a Computational Model of Cooperative Coevolution, George Mason University, Fairfax, Va, USA, 1997.

[20] F. van den Bergh and A. P. Engelbrecht, "A cooperative approach to particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225–239, 2004.

[21] M. El-Abd, "A cooperative approach to the artificial bee colony algorithm," in Proceedings of the 2010 6th IEEE World Congress on Computational Intelligence (WCCI 2010) - 2010 IEEE Congress on Evolutionary Computation (CEC 2010), IEEE, Spain, July 2010.

[22] Y. Shi, H. Teng, and Z. Li, "Cooperative co-evolutionary differential evolution for function optimization," in Proceedings of the 1st International Conference on Natural Computation (ICNC '05), Lecture Notes in Computer Science, pp. 1080–1088, August 2005.

[23] Z. Yang, K. Tang, and X. Yao, "Large scale evolutionary optimization using cooperative coevolution," Information Sciences, vol. 178, no. 15, pp. 2985–2999, 2008.

[24] X. Li and X. Yao, "Tackling high dimensional nonseparable optimization problems by cooperatively coevolving particle swarms," in Proceedings of the Congress on Evolutionary Computation (CEC '09), pp. 1546–1553, IEEE, Trondheim, Norway, May 2009.

[25] M. N. Omidvar, X. Li, Z. Yang, and X. Yao, "Cooperative co-evolution for large scale optimization through more frequent random grouping," in Proceedings of the 2010 6th IEEE World Congress on Computational Intelligence (WCCI 2010) - 2010 IEEE Congress on Evolutionary Computation (CEC 2010), Spain, July 2010.

[26] T. Ray and X. Yao, "A cooperative coevolutionary algorithm with correlation based adaptive variable partitioning," in Proceedings of the 2009 IEEE Congress on Evolutionary Computation (CEC), pp. 983–989, IEEE, Trondheim, Norway, May 2009.

[27] W. Chen, T. Weise, Z. Yang, and K. Tang, "Large-scale global optimization using cooperative coevolution with variable interaction learning," in Lecture Notes in Computer Science, vol. 6239, pp. 300–309, 2010.

[28] M. N. Omidvar, X. Li, Y. Mei, and X. Yao, "Cooperative co-evolution with differential grouping for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 18, no. 3, pp. 378–393, 2014.

[29] M. N. Omidvar, X. Li, and X. Yao, "Smart use of computational resources based on contribution for cooperative co-evolutionary algorithms," in Proceedings of the 13th Annual Genetic and Evolutionary Computation Conference (GECCO '11), pp. 1115–1122, IEEE, Ireland, July 2011.

[30] R. Storn, "On the usage of differential evolution for function optimization," in Proceedings of the Biennial Conference of the North American Fuzzy Information Processing Society (NAFIPS '96), pp. 519–523, June 1996.

[31] X. Li and X. Yao, "Cooperatively coevolving particle swarms for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 16, no. 2, pp. 210–224, 2012.

[32] K. Weicker and N. Weicker, "On the improvement of coevolutionary optimizers by learning variable interdependencies," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC 1999), pp. 1627–1632, IEEE, July 1999.

[33] E. Sayed, D. Essam, and R. Sarker, "Using hybrid dependency identification with a memetic algorithm for large scale optimization problems," in Lecture Notes in Computer Science, vol. 7673, pp. 168–177, 2012.

[34] Y. Liu, X. Yao, Q. Zhao, and T. Higuchi, "Scaling up fast evolutionary programming with cooperative coevolution," in Proceedings of the 2001 Congress on Evolutionary Computation, pp. 1101–1108, IEEE, Seoul, South Korea, 2001.

[35] E. Sayed, D. Essam, and R. Sarker, "Dependency identification technique for large scale optimization problems," in Proceedings of the 2012 IEEE Congress on Evolutionary Computation (CEC 2012), Australia, June 2012.

[36] Y. Ren and Y. Wu, "An efficient algorithm for high-dimensional function optimization," Soft Computing, vol. 17, no. 6, pp. 995–1004, 2013.

[37] J. Liu and K. Tang, "Scaling up covariance matrix adaptation evolution strategy using cooperative coevolution," in Proceedings of the International Conference on Intelligent Data Engineering and Automated Learning (IDEAL 2013), vol. 8206 of Lecture Notes in Computer Science, pp. 350–357, Springer, Berlin, Germany, 2013.

[38] Z. Yang, K. Tang, and X. Yao, "Multilevel cooperative coevolution for large scale optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), pp. 1663–1670, June 2008.

[39] M. N. Omidvar, Y. Mei, and X. Li, "Effective decomposition of large-scale separable continuous functions for cooperative co-evolutionary algorithms," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC 2014), pp. 1305–1312, China, July 2014.

[40] S. Mahdavi, E. M. Shiri, and S. Rahnamayan, "Cooperative co-evolution with a new decomposition method for large-scale optimization," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC), 2014.

[41] B. Kazimipour, M. N. Omidvar, X. Li, and A. K. Qin, "A sensitivity analysis of contribution-based cooperative co-evolutionary algorithms," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2015), pp. 417–424, Japan, May 2015.

[42] Z. Cao, L. Wang, Y. Shi et al., "An effective cooperative coevolution framework integrating global and local search for large scale optimization problems," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2015), pp. 1986–1993, Japan, May 2015.

[43] X. Peng, K. Liu, and Y. Jin, "A dynamic optimization approach to the design of cooperative co-evolutionary algorithms," Knowledge-Based Systems, vol. 109, pp. 174–186, 2016.

[44] H. K. Singh and T. Ray, "Divide and conquer in coevolution: a difficult balancing act," in Agent-Based Evolutionary Search, vol. 5 of Adaptation, Learning, and Optimization, pp. 117–138, Springer, Berlin, Germany, 2010.

[45] X. Peng and Y. Wu, "Large-scale cooperative co-evolution using niching-based multi-modal optimization and adaptive fast clustering," Swarm & Evolutionary Computation, vol. 35, 2017.

[46] M. Yang, M. N. Omidvar, C. Li et al., "Efficient resource allocation in cooperative co-evolution for large-scale global optimization," IEEE Transactions on Evolutionary Computation, vol. 21, no. 4, pp. 493–505, 2017.

[47] M. N. Omidvar, X. Li, and X. Yao, "Cooperative co-evolution with delta grouping for large scale non-separable function optimization," in Proceedings of the 2010 IEEE Congress on Evolutionary Computation (CEC 2010), pp. 1–8, Barcelona, Spain, July 2010.

[48] X. Peng and Y. Wu, "Enhancing cooperative coevolution with selective multiple populations for large-scale global optimization," Complexity, vol. 2018, Article ID 9267054, 15 pages, 2018.

[49] S.-T. Hsieh, T.-Y. Sun, C.-C. Liu, and S.-J. Tsai, "Solving large scale global optimization using improved particle swarm optimizer," in Proceedings of the 2008 IEEE Congress on Evolutionary Computation (CEC 2008), pp. 1777–1784, China, June 2008.

[50] T. Hatanaka, T. Korenaga, N. Kondo et al., Search Performance Improvement for PSO in High Dimensional Space, InTech, 2009.

[51] J. Fan, J. Wang, and M. Han, "Cooperative coevolution for large-scale optimization based on kernel fuzzy clustering and variable trust region methods," IEEE Transactions on Fuzzy Systems, vol. 22, no. 4, pp. 829–839, 2014.

[52] A.-R. Hedar and A. F. Ali, "Genetic algorithm with population partitioning and space reduction for high dimensional problems," in Proceedings of the 2009 International Conference on Computer Engineering and Systems (ICCES '09), pp. 151–156, Egypt, December 2009.

[53] J. Q. Zhang and A. C. Sanderson, "JADE: adaptive differential evolution with optional external archive," IEEE Transactions on Evolutionary Computation, vol. 13, no. 5, pp. 945–958, 2009.

[54] L. M. Hvattum and F. Glover, "Finding local optima of high-dimensional functions using direct search methods," European Journal of Operational Research, vol. 195, no. 1, pp. 31–45, 2009.

[55] C. Liu and B. Li, "Memetic algorithm with adaptive local search depth for large scale global optimization," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC 2014), pp. 82–88, China, July 2014.

[56] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257–1263, 2015.

[57] A. Zhu, C. Xu, Z. Li, J. Wu, and Z. Liu, "Hybridizing grey wolf optimization with differential evolution for global optimization and test scheduling for 3D stacked SoC," Journal of Systems Engineering and Electronics, vol. 26, no. 2, pp. 317–328, 2015.

[58] V. Chahar and D. Kumar, "An astrophysics-inspired grey wolf algorithm for numerical optimization and its application to engineering design problems," Advances in Engineering Software, vol. 112, pp. 231–254, 2017.

[59] D. Guha, P. K. Roy, and S. Banerjee, "Load frequency control of large scale power system using quasi-oppositional grey wolf optimization algorithm," Engineering Science and Technology, an International Journal, vol. 19, no. 4, pp. 1693–1713, 2016.

[60] C. Lu, S. Xiao, X. Li, and L. Gao, "An effective multi-objective discrete grey wolf optimizer for a real-world scheduling problem in welding production," Advances in Engineering Software, vol. 99, pp. 161–176, 2016.

[61] E. Emary, H. M. Zawbaa, and A. E. Hassanien, "Binary grey wolf optimization approaches for feature selection," Neurocomputing, vol. 172, pp. 371–381, 2016.

[62] W. Long, J. Jiao, X. Liang, and M. Tang, "Inspired grey wolf optimizer for solving large-scale function optimization problems," Applied Mathematical Modelling, vol. 60, pp. 112–126, 2018.

[63] S. Gupta and K. Deep, "Performance of grey wolf optimizer on large scale problems," in Proceedings of the Mathematical Sciences and Its Applications, American Institute of Physics Conference Series, Uttar Pradesh, India, 2017.

[64] R. L. Haupt and S. E. Haupt, "Practical genetic algorithms," Journal of the American Statistical Association, vol. 100, no. 100, p. 253, 2005.

[65] H. R. Tizhoosh, "Opposition-based learning: a new scheme for machine intelligence," in Proceedings of the International Conference on Computational Intelligence for Modelling, Control and Automation (CIMCA) and International Conference on Intelligent Agents, Web Technologies and Internet Commerce (IAWTIC '05), pp. 695–701, Vienna, Austria, November 2005.

[66] H. Wang, Z. Wu, S. Rahnamayan, Y. Liu, and M. Ventresca, "Enhancing particle swarm optimization using generalized opposition-based learning," Information Sciences, vol. 181, no. 20, pp. 4699–4714, 2011.

[67] H. Wang, S. Rahnamayan, and Z. Wu, "Parallel differential evolution with self-adapting control parameters and generalized opposition-based learning for solving high-dimensional optimization problems," Journal of Parallel and Distributed Computing, vol. 73, no. 1, pp. 62–73, 2013.

[68] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.

[69] A. P. Engelbrecht, Fundamentals of Computational Swarm Intelligence, John Wiley & Sons, 2006.

[70] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.

[71] S. Mirjalili, A. H. Gandomi, S. Z. Mirjalili, S. Saremi, H. Faris, and S. M. Mirjalili, "Salp Swarm Algorithm: a bio-inspired optimizer for engineering design problems," Advances in Engineering Software, vol. 114, pp. 163–191, 2017.

[72] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80–98, 2015.

[73] Q. Gu, S. Jiang, M. Lian, and C. Lu, "Health and safety situation awareness model and emergency management based on multi-sensor signal fusion," IEEE Access, vol. 7, pp. 958–968, 2019.



We tested our proposed algorithms with CECrsquo2008 func-tions for 100 500 and 1000 dimensions and the means ofthe best fitness value over 50 runs were recorded WhileF1 (Shifted Sphere) F4 (Shifted Rastrigin) and F6 (ShiftedAckley) are separable functions F2 (Schwefel Problem) F3(Shifted Rosenbrock) F5 (Shifted Griewank) and F7 (FastFractal) are nonseparable presenting a greater challenge toany algorithm that is prone to variable interactions Theperformance of HGGWA is compared with other algorithmsCCPSO2 [31] DEwSAcc [5] MLCC [38] and EPUS-PSO[49] for 100 500 and 1000 dimensions respectively Themaximum number of fitness evaluations (FEs) was calculatedby the following formula FEs = 5000timesD where D is thenumber of dimensions In order to reflect the fairness of thecomparison results the optimization results of the other fouralgorithms are directly derived from the original literatureThe specific comparison results are shown in Table 5

Table 5 shows the experimental results and the entriesshown in bold are significantly better than other algorithmsA general trend that can be seen is that HGGWA algorithmhas a promising optimization effect for most of the 7 func-tions

Generally speaking HGGWAoutperforms CCPSO2 on 6out of 7 functions with 100 and 500 dimensions respectivelyAmong them the result of HGGWA is slightly worse than

Complexity 11

Table4Com

paris

onresults

of10

high

-dim

ensio

nalb

enchmarkfunctio

nsby

WOASSA

ALO

and

HGGWA

Functio

nDim

WOA

SSA

ALO

HGGWA

Mean

Std

SRMean

Std

SRMean

Std

SRMean

Std

SR

119891 1100

577

E-96

469

E-96

100

101E-02

124E

-02

0304

E-01

231E-01

0485E-53

241E-52

100

500

724E

-93

298

E-94

100

324E+

04218E+

040

734E

+04

132E

+04

0835E-25

886E-25

100

1000

205

E-89

316E-90

100

119E

+05

104E

+05

0248E+

05212E+

050

175E

-17

273E-17

100

119891 2100

189E

-99

233E-99

100

885E+

00237E+

000

429E+

02325E+

020

401E-33

434E-33

100

500

138E

-99

216E-99

100

337E+

02217E+

020

231E+

03203E+

030

958E

-18

236E-18

100

1000

180E

-99

137E

-98

100

845E+

02343E+

020

325E+

04413E+

040

250E-11

166E

-11

100

119891 3100

384E+

01217E+

000

222E+

0110

5E+0

10

247E+

0113

4E+0

10

455E+

01369E+

010

500

932E

+01

532E+

010

327E+

01233E+

010

392E+

01234E+

010

722E

+01

501E+

000

1000

986E

+01

324E+

010

331E+

0112

8E+0

10

493E+

01399E+

010

831E+

01436E+

000

119891 4100

967E

+01

398E+

010

900E

+02

516E+

020

140E

+03

103E

+03

0682

E+01

125E

-01

100

500

495E+

02314E+

010

718E

+06

146E

+06

0209E+

0713

7E+0

70

367

E+02

823E-01

01000

991E+0

2248E+

020

298E+

0713

5E+0

70

152E

+08

100E

+07

0964

E+02

374E+

010

119891 5100

783E

+05

236E+

050

217E+

0410

3E+0

40

360

E+04

216E+

040

345

E-05

277

E-05

100

500

198E

+07

132E

+07

0429E+

05316E+

050

118E

+06

103E

+06

0267

E-03

438

E-03

80

1000

668E+

07344

E+07

0248E+

0612

3E+0

60

372E+

06349E+

060

833E+

00512E+

000

119891 6100

265E-04

143E

-04

100

112E

+00

103E

+00

0828E-01

268E-01

0234

E-09

317E-10

100

500

394E-04

322E-04

60

497E+

01366

E+01

0113E

+02

101E+0

20

573E-05

433E-06

100

1000

216E-03

136E

-03

0393E+

02278E+

020

177E

+03

233E+

030

108E

-04

162E

-03

80

119891 7100

000

E+00

000

E+00

100

627E+

01436E+

010

354E+

02332E+

020

000

E+00

000

E+00

100

500

000

E+00

000

E+00

100

204

E+03

197E

+03

0334E+

03456E+

030

000

E+00

000

E+00

100

1000

000

E+00

000

E+00

100

569E+

0316

6E+0

30

719E

+03

214E+

030

557E-11

315E-12

100

119891 8100

444

E-15

312E-15

100

515E+

00413E+

000

493E+

00349E+

000

215E-15

348

E-15

100

500

444

E-15

305

E-15

100

126E

+01

100E

+01

012

8E+0

1228E+

010

874E-12

539E-12

100

1000

888

E-16

288

E-15

100

128E

+01

144E

+01

014

6E+0

1203E+

010

159E

-07

163E

-07

100

119891 9100

000

E+00

000

E+00

100

152E

-01

244

E-01

0454E-01

316E-01

0000

E+00

000

E+00

100

500

000

E+00

000

E+00

100

282E+

02213E+

020

392E+

0210

8E+0

20

184E

-16

844

E-17

100

1000

000

E+00

000

E+00

100

110E

+03

134E

+03

0361E+

03117E

+03

0231E-13

329E-14

100

119891 10100

816E-03

432E-03

100

110E

+01

219E+

010

174E

+01

133E

+01

0273E-07

311E-07

100

500

252E-02

199E

-02

80

516E+

01333E+

010

397E+

05296E+

050

812E-05

498

E-06

100

1000

143E

-02

103E

-02

80

100E

+05

213E+

040

696E+

0613

4E+0

60

633E-04

377

E-03

100

12 Complexity

WOASSAALOHGGWA

500 10000Number of iterations

10minus100

10minus50

100

1050

Obj

ectiv

e fun

ctio

n va

lue

(a) Sphere function

WOASSAALOHGGWA

500 10000Number of iterations

10minus50

100

1050

Obj

ectiv

e fun

ctio

n va

lue

(b) Schwefel 222 function

WOASSAALOHGGWA

500 10000Number of iterations

10minus20

10minus10

100

1010

Obj

ectiv

e fun

ctio

n va

lue

(c) Rastrigin function

WOASSAALOHGGWA

500 10000Number of iterations

10minus20

10minus10

100

1010

Obj

ectiv

e fun

ctio

n va

lue

(d) Griewank function

WOASSAALOHGGWA

200 400 600 800 10000Number of iterations

10minus5

100

105

Obj

ectiv

e fun

ctio

n va

lue

(e) Quartic function

WOASSAALOHGGWA

10minus20

10minus10

100

1010

Obj

ectiv

e fun

ctio

n va

lue

500 10000Number of iterations

(f) Ackley function

Figure 2 Convergence curves of WOA SSA ALO and HGGWA algorithms

Comparative analysis of running time

459

344

053

13

388

239

072

186

537

366

317

285

423

233

122

216

399

217

057

139

259

218

069

121

463

333352

195

0

1

2

3

4

5

6

Runn

ing

Tim

es (s

)

f2 f6 f7 f8 f9 f10f1Benchmark Functions (D=100)

SSAWOAGWOHGGWA(a) Comparative analysis of running time

Comparative analysis of iterations

302

233

198221

334290

263237

723

614

491

407 398

258

365312 263

234201

175

214185

166149

186

85

298

132

SSAWOAGWOHGGWA

f2 f6 f7 f8 f9 f10f1Benchmark Functions (D=100)

100

200

300

400

500

600

700

Num

ber o

f ite

ratio

ns

(b) Comparative analysis of iterations

Figure 3 The average running times and iterations of SSA WOA GWO and HGGWA on different test functions

Complexity 13

Table5Com

paris

onresults

of7CE

Crsquo200

8benchm

arkfunctio

nsby

fivea

lgorith

ms

Functio

nDim

CCP

SO2

DEw

SAcc

MLC

CEP

US-PS

OHGGWA

Mean

Std

Mean

Std

Mean

Std

Mean

Std

Mean

Std

F1100

773E

-14

323E-14

568E-14

000

E+00

682E-14

232E-14

747E

-01

170E

-01

306

E-17

293E-17

500

300

E-13

796E

-14

209E-09

461E-09

429E-13

331E-14

845E+

01640

E+00

275E-15

213E-15

1000

518E-13

961E-14

878E-03

527E-03

845E-13

500

E-14

553E+

02286E+

0110

6E-10

183E

-11

F2100

608E+

0078

3E+0

0825E+

00532E+

00252E+

01872E+

0018

6E+0

1226E+

00271E+

00239

E+00

500

579E+

01421E+

0175

7E+0

1304

E+00

666

E+01

569E+

00435E+

01551E-01

312E+

01491E+

001000

782E

+01

425E+

0196

0E+0

118

1E+0

010

8E+0

2475E+

00466

E+01

400

E-01

271E+

01566

E+00

F3100

423E+

02865E+

0214

5E+0

2584E+

0114

9E+0

2572E+

01499E+

03535E+

03981E+

01225E-01

500

724E

+02

154E

+02

181E+0

3274E+

0292

4E+0

217

2E+0

2577E+

04804

E+03

496

E+02

623E-01

1000

133E

+03

263E+

02914E

+03

126E

+03

179E

+03

158E

+02

837E+

0515

2E+0

5997

E+02

358

E+01

F4100

398E-02

199E

-01

437E+

0076

5E+0

0438E-13

921E-14

471E+

02594E+

01000

E+00

000

E+00

500

398E-02

199E

-01

364

E+02

523E+

0117

9E-11

631E-11

349E+

03112E

+02

000

E+00

000

E+00

1000

199E

-01

406

E-01

182E

+03

137E

+02

137E

-10

337E-10

758E

+03

151E+0

2493E-11

287

E-12

F5100

345E-03

488E-03

307E-14

786E

-15

341E-14

116E

-14

372E-01

560

E-02

000

E+00

000

E+00

500

118E

-03

461E-03

690E-04

241E-03

212E-13

247E-14

164E

+00

469E-02

216E-16

756E

-17

1000

118E

-03

327E-03

358E-03

573E-03

418E-13

278

E-14

589E+

00391E-01

549E-13

235E-14

F6100

144E

-13

306

E-14

113E

-13

153E

-14

111E-13

786E

-15

206

E+00

440

E-01

321E-15

248

E-15

500

534E-13

861E-14

480E-01

573E-01

534

E-13

701E-14

664

E+00

449E-01

674E-12

439E-12

1000

102E

-12

168E

-13

229E+

00298E-01

106E

-12

768E

-14

189E

+01

249E+

00239E-07

213E-07

F7100

-150E

+03

104E

+01

-137

E+03

246

E+01

-154E

+03

252E+

00-855E

+02

135E

+01

-700e+0

2664

E-01

500

-723E

+03

416E+

01-574

E+03

183E

+02

-743E

+03

803E+

00-351E+0

3210E+

01-883e+0

3354

E+00

1000

-143E

+04

827E+

01-105E

+04

418E+

02-147E

+04

151E+0

1-662E

+03

318E+

01-369E

+04

216E+

02

14 Complexity

CCPSO2 for function F7 in the case of 100-dimensionalthe two algorithms obtain similar optimization results forthe function F6 in the case of 500-dimensional In additionat first glance it might seem that HGGWA and MLCChave similar performance However the HGGWA achievesbetter convergence accuracy than MLCC under the samenumber of iterations In particular the HGGWA algorithmrsquosoptimization results are superior toDEwSAcc andEPUS-PSOin all dimensions The result of HGGWA scaled very wellfrom 100-D to 1000-D on F1 F4 F5 and F6 outperformingDEwSAcc and EPUS-PSO on all the cases

54 Sensitivity Analysis of Nonlinear Adjustment Coefficient 119896This section mainly discusses the effect of nonlinear adjust-ment coefficients on the convergence performance of thealgorithm In the HGGWA algorithm the role of parameter119886 is to balance global exploration capabilities and localexploitation capabilities In (12) the nonlinear adjustmentcoefficient 119896 is a key parameter and is mainly used to controlthe range of variation of the convergence factor Thereforewe selected four different values for numerical experimentsto analyze the effect of nonlinear adjustment coefficients onthe performance of the algorithm The four different valuesare 051152 and the results of the comparison are shown inTable 6 Among them black bold represents the best result ofsolving in several algorithms

In general the value of the nonlinear adjustment coef-ficient 119896 has no significant effect on the performance ofthe HGGWA algorithm On closer inspection one can seethat besides the function 1198913 the optimization performanceof the algorithm is optimal when the nonlinear adjustmentcoefficient 119896 is 05 But for function 1198913 119896 = 15 is the bestchoice For functions 1198917 and 1198919 the HGGWA algorithmobtains the optimal solution 0 in the case of several differentvalues of the coefficient 119896 Therefore for most test functionsthe optimal value of the nonlinear adjustment factor 119896 is 0555 Global Convergence Analysis of HGGWA In theHGGWA algorithm the following four improved strategiesare used to ensure the global convergence of the algorithm(1) initial population strategy based on Opposition-BasedLearning (2) selection operation based on optimal retention(3) crossover operation based on population partitioningmechanism and (4) mutation operation for elite individ-uals

The idea of solving the high-dimensional function opti-mization problem byHGGWAalgorithm is as follows Firstlyfor the initial population strategy the basic GWO generatesthe initial population in a random manner However it willhave a great impact on the search efficiency of the algo-rithm for high-dimensional function optimization problemsThe algorithm cannot effectively search the entire problemsolution space if the initial grey wolf population is clusteredin a small range Therefore the Opposition-Based Learningstrategy is used to initialize the population which can makethe initial population evenly distributed in the solution spacethus laying a good foundation for the global search of thealgorithm Secondly the optimal retention strategy is usedto preserve the optimal individual of the parent population

in the process of population evolution As a result the nextgeneration of population can evolve in the optimal directionIn addition the individuals with higher fitness are selectedby the gambling roulette operation which maintains thedominance relationship between individuals in the popula-tion This dominance relationship makes the algorithm withgood global convergence Thirdly the crossover operation iscompleted under the condition of population partitioningin order to achieve the purpose of dimension reductionand maintain the diversity of the population Finally thesearch direction of the grey wolf population is guided by themutation operation of the elite individuals which effectivelyprevents the algorithm from falling into the local optimalsolution All in all through the improvement of these fourdifferent strategies the HGGWA algorithm shows goodglobal convergence in solving high-dimensional functionoptimization problems

6. Conclusion

In order to overcome the shortcoming of the GWO algorithm in solving large-scale global optimization problems, namely its tendency to fall into local optima, this paper proposes a Hybrid Genetic Grey Wolf Algorithm (HGGWA) that integrates three genetic operators into the algorithm. The improved algorithm initializes the population based on the Opposition-Based Learning strategy to improve the search efficiency of the algorithm. The strategy of population division based on cooperative coevolution reduces the scale of the problem, and the mutation operation of the elite individuals effectively prevents the algorithm from falling into local optima. The performance of HGGWA has been evaluated using 10 classical benchmark functions and 7 CEC'2008 high-dimensional functions. From our experimental results, several conclusions can be drawn.

The results have shown that the algorithm is capable of grouping interacting variables with great accuracy for the majority of the benchmark functions. A comparative study between HGGWA and other state-of-the-art algorithms was conducted, and the experimental results showed that the HGGWA algorithm exhibits better global convergence whether it is solving separable or nonseparable functions.

(1) Tested on 10 classical benchmark functions and compared with the results of WOA, SSA, and ALO, the HGGWA algorithm is greatly improved in convergence accuracy. Except for the Schwefel problem and the Rosenbrock function, the HGGWA algorithm achieves an 80%-100% success ratio on all benchmark functions with 100 dimensions, which proves the effectiveness of HGGWA for solving high-dimensional functions.

(2) On the seven CEC'2008 large-scale global optimization problems, the results show that the HGGWA algorithm achieves the global optimal value for the separable functions F1, F4, and F6, while its effect on the other, nonseparable problems is not fully satisfactory. However, the HGGWA algorithm still exhibits better global convergence than the other algorithms, CCPSO2, DEwSAcc, MLCC, and EPUS-PSO, on most of the functions.

Table 6: Comparison of optimization performance of different k values for HGGWA (Mean / Std; * marks the best mean for each function; f7 and f9 reach the optimum for all k).

Function   k=0.5                    k=1                      k=1.5                    k=2
f1         8.43E-60 / 5.36E-60 *    2.25E-54 / 2.03E-54      5.58E-53 / 4.31E-53      2.31E-51 / 3.16E-51
f2         2.89E-34 / 2.31E-34 *    2.06E-33 / 1.89E-33      4.70E-32 / 3.65E-32      5.76E-30 / 4.22E-30
f3         1.11E+00 / 1.02E+00      3.14E+01 / 2.19E+01      4.03E-01 / 2.38E-01 *    3.60E+01 / 2.16E+01
f4         9.78E+01 / 8.36E+01 *    9.82E+01 / 7.43E+01      9.81E+01 / 7.34E+01      9.83E+01 / 4.61E+01
f5         2.18E-03 / 1.34E-03 *    8.87E-02 / 6.67E-02      7.81E+00 / 6.13E+00      1.74E-02 / 2.34E-02
f6         1.31E-03 / 1.13E-03 *    1.62E-03 / 2.43E-04      2.99E-03 / 1.04E-03      1.84E-03 / 1.34E-03
f7         0.00E+00 / 0.00E+00 *    0.00E+00 / 0.00E+00 *    0.00E+00 / 0.00E+00 *    0.00E+00 / 0.00E+00 *
f8         1.58E-14 / 1.23E-15 *    2.93E-14 / 1.37E-14      2.22E-14 / 1.49E-14      2.51E-14 / 1.08E-14
f9         0.00E+00 / 0.00E+00 *    0.00E+00 / 0.00E+00 *    0.00E+00 / 0.00E+00 *    0.00E+00 / 0.00E+00 *
f10        2.99E-02 / 2.49E-02 *    3.38E-02 / 1.24E-03      8.42E-02 / 6.14E-02      7.29E-02 / 8.13E-03


(3) The analysis of the running time of the several algorithms shows that the convergence time of the HGGWA algorithm has obvious advantages over recently proposed algorithms such as SSA and WOA. For the functions f1, f2, f6, f7, f8, f9, and f10, the running time of the HGGWA algorithm stays between 1 s and 3 s under the specified convergence accuracy, which shows the excellent stability of HGGWA.

(4) In the future, we plan to investigate more efficient population partition mechanisms for the crossover operation, so as to adapt to different nonseparable problems. We are also interested in applying HGGWA to real-world problems to ascertain its true potential as a valuable optimization technique for large-scale optimization, such as the setup of large-scale multilayer sensor networks [73] in the Internet of Things.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Authors' Contributions

Qinghua Gu and Xuexian Li contributed equally to this work

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Grant Nos. 51774228 and 51404182), the Natural Science Foundation of Shaanxi Province (2017JM5043), and the Foundation of Shaanxi Educational Committee (17JK0425).

References

[1] R. Zhang and C. Wu, "A hybrid approach to large-scale job shop scheduling," Applied Intelligence, vol. 32, no. 1, pp. 47-59, 2010.

[2] M. Gendreau and C. D. Tarantilis, Solving Large-Scale Vehicle Routing Problems with Time Windows: The State-of-the-Art, Cirrelt, Montreal, 2010.

[3] M. B. Liu, S. K. Tso, and Y. Cheng, "An extended nonlinear primal-dual interior-point algorithm for reactive-power optimization of large-scale power systems with discrete control variables," IEEE Power Engineering Review, vol. 22, no. 9, p. 56, 2002.

[4] I. Fister, I. Fister Jr., J. Brest, and V. Zumer, "Memetic artificial bee colony algorithm for large-scale global optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '12), pp. 1-8, IEEE, Brisbane, Australia, June 2012.

[5] A. Zamuda, J. Brest, B. Bosovic, and V. Zumer, "Large scale global optimization using differential evolution with self-adaptation and cooperative co-evolution," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), IEEE, June 2008.

[6] T. Weise, R. Chiong, and K. Tang, "Evolutionary optimization: pitfalls and booby traps," Journal of Computer Science and Technology, vol. 27, no. 5, pp. 907-936, 2012.

[7] L. Lafleur, Descartes' Discourse On Method, Pearson Schweiz Ag, Zug, Switzerland, 1956.

[8] Y.-S. Ong, M. H. Lim, and X. Chen, "Memetic computation—past, present & future [Research Frontier]," IEEE Computational Intelligence Magazine, vol. 5, no. 2, pp. 24-31, 2010.

[9] N. Krasnogor and J. Smith, "A tutorial for competent memetic algorithms: model, taxonomy, and design issues," IEEE Transactions on Evolutionary Computation, vol. 9, no. 5, pp. 474-488, 2005.

[10] S. Jiang, M. Lian, C. Lu, Q. Gu, S. Ruan, and X. Xie, "Ensemble prediction algorithm of anomaly monitoring based on big data analysis platform of open-pit mine slope," Complexity, vol. 2018, Article ID 1048756, 13 pages, 2018.

[11] R. S. Rahnamayan, H. R. Tizhoosh, and M. M. A. Salama, "Opposition-based differential evolution," IEEE Transactions on Evolutionary Computation, vol. 12, no. 1, pp. 64-79, 2008.

[12] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46-61, 2014.

[13] E. Daniel, J. Anitha, K. K. Kamaleshwaran, and I. Rani, "Optimum spectrum mask based medical image fusion using gray wolf optimization," Biomedical Signal Processing and Control, vol. 34, pp. 36-43, 2017.

[14] A. A. M. El-Gaafary, Y. S. Mohamed, A. M. Hemeida, and A.-A. A. Mohamed, "Grey wolf optimization for multi input multi output system," Universal Journal of Communications and Network, vol. 3, no. 1, pp. 1-6, 2015.

[15] G. M. Komaki and V. Kayvanfar, "Grey wolf optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109-120, 2015.

[16] S. Jiang, M. Lian, C. Lu et al., "SVM-DS fusion based soft fault detection and diagnosis in solar water heaters," Energy Exploration & Exploitation, 2018.

[17] Q. Gu, X. Li, C. Lu et al., "Hybrid genetic grey wolf algorithm for high dimensional complex function optimization," Control and Decision, pp. 1-8, 2019.

[18] M. A. Potter and K. A. D. Jong, "A cooperative coevolutionary approach to function optimization," in Parallel Problem Solving from Nature—PPSN III, vol. 866 of Lecture Notes in Computer Science, pp. 249-257, Springer, Berlin, Germany, 1994.

[19] M. A. Potter, The Design and Analysis of a Computational Model of Cooperative Coevolution, George Mason University, Fairfax, Va, USA, 1997.

[20] F. van den Bergh and A. P. Engelbrecht, "A cooperative approach to particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225-239, 2004.

[21] M. El-Abd, "A cooperative approach to the artificial bee colony algorithm," in Proceedings of the 2010 IEEE Congress on Evolutionary Computation (CEC 2010), IEEE, Spain, July 2010.

[22] Y. Shi, H. Teng, and Z. Li, "Cooperative co-evolutionary differential evolution for function optimization," in Proceedings of the 1st International Conference on Natural Computation (ICNC '05), Lecture Notes in Computer Science, pp. 1080-1088, August 2005.

[23] Z. Yang, K. Tang, and X. Yao, "Large scale evolutionary optimization using cooperative coevolution," Information Sciences, vol. 178, no. 15, pp. 2985-2999, 2008.

[24] X. Li and X. Yao, "Tackling high dimensional nonseparable optimization problems by cooperatively coevolving particle swarms," in Proceedings of the Congress on Evolutionary Computation (CEC '09), pp. 1546-1553, IEEE, Trondheim, Norway, May 2009.

[25] M. N. Omidvar, X. Li, Z. Yang, and X. Yao, "Cooperative co-evolution for large scale optimization through more frequent random grouping," in Proceedings of the 2010 IEEE Congress on Evolutionary Computation (CEC 2010), Spain, July 2010.

[26] T. Ray and X. Yao, "A cooperative coevolutionary algorithm with correlation based adaptive variable partitioning," in Proceedings of the 2009 IEEE Congress on Evolutionary Computation (CEC), pp. 983-989, IEEE, Trondheim, Norway, May 2009.

[27] W. Chen, T. Weise, Z. Yang, and K. Tang, "Large-scale global optimization using cooperative coevolution with variable interaction learning," in Lecture Notes in Computer Science, vol. 6239, pp. 300-309, 2010.

[28] M. N. Omidvar, X. Li, Y. Mei, and X. Yao, "Cooperative co-evolution with differential grouping for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 18, no. 3, pp. 378-393, 2014.

[29] M. N. Omidvar, X. Li, and X. Yao, "Smart use of computational resources based on contribution for cooperative co-evolutionary algorithms," in Proceedings of the 13th Annual Genetic and Evolutionary Computation Conference (GECCO '11), pp. 1115-1122, IEEE, Ireland, July 2011.

[30] R. Storn, "On the usage of differential evolution for function optimization," in Proceedings of the Biennial Conference of the North American Fuzzy Information Processing Society (NAFIPS '96), pp. 519-523, June 1996.

[31] X. Li and X. Yao, "Cooperatively coevolving particle swarms for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 16, no. 2, pp. 210-224, 2012.

[32] K. Weicker and N. Weicker, "On the improvement of coevolutionary optimizers by learning variable interdependencies," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC 1999), pp. 1627-1632, IEEE, July 1999.

[33] E. Sayed, D. Essam, and R. Sarker, "Using hybrid dependency identification with a memetic algorithm for large scale optimization problems," in Lecture Notes in Computer Science, vol. 7673, pp. 168-177, 2012.

[34] Y. Liu, X. Yao, Q. Zhao, and T. Higuchi, "Scaling up fast evolutionary programming with cooperative coevolution," in Proceedings of the 2001 Congress on Evolutionary Computation, pp. 1101-1108, IEEE, Seoul, South Korea, 2001.

[35] E. Sayed, D. Essam, and R. Sarker, "Dependency identification technique for large scale optimization problems," in Proceedings of the 2012 IEEE Congress on Evolutionary Computation (CEC 2012), Australia, June 2012.

[36] Y. Ren and Y. Wu, "An efficient algorithm for high-dimensional function optimization," Soft Computing, vol. 17, no. 6, pp. 995-1004, 2013.

[37] J. Liu and K. Tang, "Scaling up covariance matrix adaptation evolution strategy using cooperative coevolution," in Proceedings of the International Conference on Intelligent Data Engineering and Automated Learning (IDEAL 2013), vol. 8206 of Lecture Notes in Computer Science, pp. 350-357, Springer, Berlin, Germany, 2013.

[38] Z. Yang, K. Tang, and X. Yao, "Multilevel cooperative coevolution for large scale optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), pp. 1663-1670, June 2008.

[39] M. N. Omidvar, Y. Mei, and X. Li, "Effective decomposition of large-scale separable continuous functions for cooperative co-evolutionary algorithms," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC 2014), pp. 1305-1312, China, July 2014.

[40] S. Mahdavi, E. M. Shiri, and S. Rahnamayan, "Cooperative co-evolution with a new decomposition method for large-scale optimization," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC), 2014.

[41] B. Kazimipour, M. N. Omidvar, X. Li, and A. K. Qin, "A sensitivity analysis of contribution-based cooperative co-evolutionary algorithms," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2015), pp. 417-424, Japan, May 2015.

[42] Z. Cao, L. Wang, Y. Shi et al., "An effective cooperative coevolution framework integrating global and local search for large scale optimization problems," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2015), pp. 1986-1993, Japan, May 2015.

[43] X. Peng, K. Liu, and Y. Jin, "A dynamic optimization approach to the design of cooperative co-evolutionary algorithms," Knowledge-Based Systems, vol. 109, pp. 174-186, 2016.

[44] H. K. Singh and T. Ray, "Divide and conquer in coevolution: a difficult balancing act," in Agent-Based Evolutionary Search, vol. 5 of Adaptation, Learning, and Optimization, pp. 117-138, Springer, Berlin, Germany, 2010.

[45] X. Peng and Y. Wu, "Large-scale cooperative co-evolution using niching-based multi-modal optimization and adaptive fast clustering," Swarm & Evolutionary Computation, vol. 35, 2017.

[46] M. Yang, M. N. Omidvar, C. Li et al., "Efficient resource allocation in cooperative co-evolution for large-scale global optimization," IEEE Transactions on Evolutionary Computation, vol. 21, no. 4, pp. 493-505, 2017.

[47] M. N. Omidvar, X. Li, and X. Yao, "Cooperative co-evolution with delta grouping for large scale non-separable function optimization," in Proceedings of the 2010 IEEE Congress on Evolutionary Computation (CEC 2010), pp. 1-8, Barcelona, Spain, July 2010.

[48] X. Peng and Y. Wu, "Enhancing cooperative coevolution with selective multiple populations for large-scale global optimization," Complexity, vol. 2018, Article ID 9267054, 15 pages, 2018.

[49] S.-T. Hsieh, T.-Y. Sun, C.-C. Liu, and S.-J. Tsai, "Solving large scale global optimization using improved particle swarm optimizer," in Proceedings of the 2008 IEEE Congress on Evolutionary Computation (CEC 2008), pp. 1777-1784, China, June 2008.

[50] T. Hatanaka, T. Korenaga, N. Kondo et al., Search Performance Improvement for PSO in High Dimensional Space, InTech, 2009.

[51] J. Fan, J. Wang, and M. Han, "Cooperative coevolution for large-scale optimization based on kernel fuzzy clustering and variable trust region methods," IEEE Transactions on Fuzzy Systems, vol. 22, no. 4, pp. 829-839, 2014.

[52] A.-R. Hedar and A. F. Ali, "Genetic algorithm with population partitioning and space reduction for high dimensional problems," in Proceedings of the 2009 International Conference on Computer Engineering and Systems (ICCES '09), pp. 151-156, Egypt, December 2009.

[53] J. Q. Zhang and A. C. Sanderson, "JADE: adaptive differential evolution with optional external archive," IEEE Transactions on Evolutionary Computation, vol. 13, no. 5, pp. 945-958, 2009.

[54] L. M. Hvattum and F. Glover, "Finding local optima of high-dimensional functions using direct search methods," European Journal of Operational Research, vol. 195, no. 1, pp. 31-45, 2009.

[55] C. Liu and B. Li, "Memetic algorithm with adaptive local search depth for large scale global optimization," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC 2014), pp. 82-88, China, July 2014.

[56] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257-1263, 2015.

[57] A. Zhu, C. Xu, Z. Li, J. Wu, and Z. Liu, "Hybridizing grey wolf optimization with differential evolution for global optimization and test scheduling for 3D stacked SoC," Journal of Systems Engineering and Electronics, vol. 26, no. 2, pp. 317-328, 2015.

[58] V. Chahar and D. Kumar, "An astrophysics-inspired grey wolf algorithm for numerical optimization and its application to engineering design problems," Advances in Engineering Software, vol. 112, pp. 231-254, 2017.

[59] D. Guha, P. K. Roy, and S. Banerjee, "Load frequency control of large scale power system using quasi-oppositional grey wolf optimization algorithm," Engineering Science and Technology, an International Journal, vol. 19, no. 4, pp. 1693-1713, 2016.

[60] C. Lu, S. Xiao, X. Li, and L. Gao, "An effective multi-objective discrete grey wolf optimizer for a real-world scheduling problem in welding production," Advances in Engineering Software, vol. 99, pp. 161-176, 2016.

[61] E. Emary, H. M. Zawbaa, and A. E. Hassanien, "Binary grey wolf optimization approaches for feature selection," Neurocomputing, vol. 172, pp. 371-381, 2016.

[62] W. Long, J. Jiao, X. Liang, and M. Tang, "Inspired grey wolf optimizer for solving large-scale function optimization problems," Applied Mathematical Modelling, vol. 60, pp. 112-126, 2018.

[63] S. Gupta and K. Deep, "Performance of grey wolf optimizer on large scale problems," in Proceedings of the Mathematical Sciences and Its Applications, American Institute of Physics Conference Series, Uttar Pradesh, India, 2017.

[64] R. L. Haupt and S. E. Haupt, "Practical genetic algorithms," Journal of the American Statistical Association, vol. 100, no. 100, p. 253, 2005.

[65] H. R. Tizhoosh, "Opposition-based learning: a new scheme for machine intelligence," in Proceedings of the International Conference on Computational Intelligence for Modelling, Control and Automation (CIMCA) and International Conference on Intelligent Agents, Web Technologies and Internet Commerce (IAWTIC '05), pp. 695-701, Vienna, Austria, November 2005.

[66] H. Wang, Z. Wu, S. Rahnamayan, Y. Liu, and M. Ventresca, "Enhancing particle swarm optimization using generalized opposition-based learning," Information Sciences, vol. 181, no. 20, pp. 4699-4714, 2011.

[67] H. Wang, S. Rahnamayan, and Z. Wu, "Parallel differential evolution with self-adapting control parameters and generalized opposition-based learning for solving high-dimensional optimization problems," Journal of Parallel and Distributed Computing, vol. 73, no. 1, pp. 62-73, 2013.

[68] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82-102, 1999.

[69] A. P. Engelbrecht, Fundamentals of Computational Swarm Intelligence, John Wiley & Sons, 2006.

[70] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51-67, 2016.

[71] S. Mirjalili, A. H. Gandomi, S. Z. Mirjalili, S. Saremi, H. Faris, and S. M. Mirjalili, "Salp Swarm Algorithm: a bio-inspired optimizer for engineering design problems," Advances in Engineering Software, vol. 114, pp. 163-191, 2017.

[72] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80-98, 2015.

[73] Q. Gu, S. Jiang, M. Lian, and C. Lu, "Health and safety situation awareness model and emergency management based on multi-sensor signal fusion," IEEE Access, vol. 7, pp. 958-968, 2019.



[58] V Chahar and D Kumar ldquoAn astrophysics-inspired grey wolfalgorithm for numerical optimization and its application toengineering design problemsrdquo Advances in Engineering Soft-ware vol 112 pp 231ndash254 2017

[59] D Guha P K Roy and S Banerjee ldquoLoad frequency controlof large scale power system using quasi-oppositional grey wolfoptimization algorithmrdquoEngineering Science andTechnology anInternational Journal vol 19 no 4 pp 1693ndash1713 2016

[60] C Lu S Xiao X Li and L Gao ldquoAn effective multi-objectivediscrete grey wolf optimizer for a real-world scheduling prob-lem in welding productionrdquo Advances in Engineering Softwarevol 99 pp 161ndash176 2016

[61] E Emary H M Zawbaa and A E Hassanien ldquoBinary greywolf optimization approaches for feature selectionrdquoNeurocom-puting vol 172 pp 371ndash381 2016

[62] W Long J Jiao X Liang andM Tang ldquoInspired grey wolf opti-mizer for solving large-scale function optimization problemsrdquoApplied Mathematical Modelling Simulation and Computationfor Engineering and Environmental Systems vol 60 pp 112ndash1262018

[63] S Gupta and K Deep ldquoPerformance of grey wolf optimizer onlarge scale problemsrdquo in Proceedings of the Mathematical Sci-ences and Its Applications American Institute of Physics Con-ference Series Uttar Pradesh India 2017

[64] R L Haupt and S E Haupt ldquoPractical genetic algorithmsrdquoJournal of the American Statistical Association vol 100 no 100p 253 2005

[65] H R Tizhoosh ldquoOpposition-based learning a new scheme formachine intelligencerdquo in Proceedings of the International Con-ference on Computational Intelligence for Modelling Control andAutomation CIMCA and International Conference on IntelligentAgents Web Technologies and Internet Commerce (IAWTIC rsquo05)pp 695ndash701 Vienna Austria November 2005

[66] H Wang Z Wu S Rahnamayan Y Liu and M VentrescaldquoEnhancing particle swarm optimization using generalizedopposition-based learningrdquo Information Sciences vol 181 no20 pp 4699ndash4714 2011

[67] H Wang S Rahnamayan and ZWu ldquoParallel differential evo-lution with self-adapting control parameters and generalizedopposition-based learning for solving high-dimensional opti-mization problemsrdquo Journal of Parallel andDistributed Comput-ing vol 73 no 1 pp 62ndash73 2013

[68] X Yao Y Liu and G Lin ldquoEvolutionary programming madefasterrdquo IEEE Transactions on Evolutionary Computation vol 3no 2 pp 82ndash102 1999

[69] A P Engelbrecht Fundamentals of Computational SwarmIntelligence John Wiley amp Sons 2006

[70] S Mirjalili and A Lewis ldquoThe whale optimization algorithmrdquoAdvances in Engineering Software vol 95 pp 51ndash67 2016

[71] S Mirjalili A H Gandomi S Z Mirjalili S Saremi H Farisand S M Mirjalili ldquoSalp Swarm Algorithm A bio-inspiredoptimizer for engineering design problemsrdquo Advances in Engi-neering Software vol 114 pp 163ndash191 2017

[72] S Mirjalili ldquoThe ant lion optimizerrdquo Advances in EngineeringSoftware vol 83 pp 80ndash98 2015

[73] Q Gu S Jiang M Lian and C Lu ldquoHealth and safety situationawareness model and emergency management based on multi-sensor signal fusionrdquo IEEE Access vol 7 pp 958ndash968 2019


Table 3: Parameter settings of the HGGWA.

Parameter | Definition | Value
N | Population size | 50
Pc | Crossover probability | 0.8
Pm | Mutation probability | 0.01
Maxiter | Maximum number of iterations | 1000
k | Nonlinear adjustment coefficient | 0.5
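For readers implementing the algorithm, the settings in Table 3 translate directly into a small configuration block. The sketch below is illustrative only: the parameter names mirror Table 3, but the surrounding HGGWA implementation is not part of this paper's published code.

```python
# Hypothetical configuration mirroring Table 3 (names are illustrative).
hggwa_params = {
    "pop_size": 50,      # N: population size
    "p_crossover": 0.8,  # Pc: crossover probability
    "p_mutation": 0.01,  # Pm: mutation probability
    "max_iter": 1000,    # Maxiter: maximum number of iterations
    "k": 0.5,            # nonlinear adjustment coefficient (see Section 5.4)
}
```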

fair comparison among the different algorithms. The optimal value, the worst value, the average value, and the standard deviation of each algorithm's results are recorded, and then the optimization success ratio (SR) is calculated. All programs were coded in Matlab 2017b (Win64) and executed on a Lenovo computer with an Intel(R) Core i5-6300HQ processor (2.30 GHz) and 8 GB of RAM under the Windows 10 operating system.

5.2.2. Comparison with State-of-the-Art Metaheuristic Algorithms. In order to verify the efficiency of the HGGWA algorithm, its optimization results are compared with those of several recently proposed state-of-the-art metaheuristic algorithms: the Whale Optimization Algorithm (WOA), the Salp Swarm Algorithm (SSA), and the Ant Lion Optimization Algorithm (ALO). The tests were performed using the same 10 high-dimensional functions in Table 2, and the results of the comparison are shown in Table 4.

From Table 4 it can be seen that, for most of the test functions, the convergence accuracy and optimization success ratio of the HGGWA algorithm are significantly higher than those of the other three algorithms. In the 100-dimensional case, the HGGWA algorithm converges to the global optimal solution on the nine test functions other than f3, and under the given convergence accuracy it achieves a success ratio of 100% on those nine functions. The convergence result of the WOA algorithm is better than that of the other three algorithms for functions f1 and f2, and its robustness is excellent. As the function dimension increases, the convergence precision of all the algorithms decreases slightly, but the optimization results of the HGGWA algorithm remain better than those of WOA, SSA, and ALO. In general, the HGGWA algorithm exhibits better optimization performance than the WOA, SSA, and ALO algorithms on the 100-, 500-, and 1000-dimensional test functions, which demonstrates its effectiveness for solving high-dimensional complex functions.

Generally speaking, the HGGWA algorithm performs better than the other algorithms on most test functions. In order to compare the convergence performance of the four algorithms more clearly, their convergence curves on the Sphere, Schwefel 2.22, Rastrigin, Griewank, Quartic, and Ackley functions with D = 100 are shown in Figure 2. Sphere and Schwefel 2.22 are the two most basic unimodal functions; Rastrigin and Griewank are the two most basic multimodal functions. It can be seen that HGGWA attains higher precision and a faster convergence speed than the other algorithms.

5.2.3. Comparative Analysis of Running Time. To evaluate the actual runtime of the compared algorithms, including SSA, WOA, GWO, and HGGWA, their average running times (in seconds) over 30 runs on functions f1, f2, f6, f7, f8, f9, and f10 are plotted in Figure 3. The convergence accuracy for each test function is given in Table 2.

Figure 3 shows the convergence behavior of the different algorithms; each point was calculated by averaging 30 independent runs. As can be seen from Figure 3, the algorithms behave differently under the same accuracy requirement. For functions f1, f2, f7, f8, and f9, the GWO algorithm converges fastest, as a result of the advantages of the basic GWO itself. However, for the test functions f6 and f10, the running time of the HGGWA algorithm is the shortest. Further analysis shows that the HGGWA algorithm adds a little computing time relative to GWO because of its improved strategies (selection, crossover, and mutation), yet its convergence speed is still better than that of the SSA and WOA algorithms. Overall, the convergence speed of the HGGWA algorithm is among the best of the compared algorithms. Moreover, its running time varies little across the test functions, staying between 1 s and 3 s, which shows that the HGGWA algorithm has excellent stability.

5.3. Results on CEC'2008 Benchmark Functions. This section analyzes the effectiveness of HGGWA on the CEC'2008 large-scale benchmark functions and compares it with four powerful algorithms that have been proven effective in solving large-scale optimization problems. Experimental results are provided to analyze the performance of HGGWA on large-scale optimization problems.

We tested the proposed algorithm on the CEC'2008 functions for 100, 500, and 1000 dimensions, and the mean of the best fitness values over 50 runs was recorded. While F1 (Shifted Sphere), F4 (Shifted Rastrigin), and F6 (Shifted Ackley) are separable functions, F2 (Schwefel Problem), F3 (Shifted Rosenbrock), F5 (Shifted Griewank), and F7 (Fast Fractal) are nonseparable, presenting a greater challenge to any algorithm that is prone to variable interactions. The performance of HGGWA is compared with that of CCPSO2 [31], DEwSAcc [5], MLCC [38], and EPUS-PSO [49] for 100, 500, and 1000 dimensions, respectively. The maximum number of fitness evaluations (FEs) was calculated by the formula FEs = 5000 × D, where D is the number of dimensions. For fairness of comparison, the optimization results of the other four algorithms are taken directly from the original literature. The specific comparison results are shown in Table 5.
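As a quick sanity check of the budget this formula implies, the following snippet computes the evaluation budgets for the three tested dimensionalities.

```python
# Evaluation budget for the CEC'2008 experiments: FEs = 5000 * D
for D in (100, 500, 1000):
    print(f"D = {D:4d}: FEs = {5000 * D:,}")
# D =  100: FEs = 500,000
# D =  500: FEs = 2,500,000
# D = 1000: FEs = 5,000,000
```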

Table 5 shows the experimental results; the entries shown in bold in the original are significantly better than those of the other algorithms. A general trend is that the HGGWA algorithm has a promising optimization effect for most of the 7 functions.

Generally speaking, HGGWA outperforms CCPSO2 on 6 out of 7 functions with 100 and 500 dimensions, respectively.

Table 4: Comparison results of 10 high-dimensional benchmark functions by WOA, SSA, ALO, and HGGWA. Each cell gives Mean / Std / SR(%).

Function | Dim | WOA | SSA | ALO | HGGWA
f1 | 100 | 5.77E-96 / 4.69E-96 / 100 | 1.01E-02 / 1.24E-02 / 0 | 3.04E-01 / 2.31E-01 / 0 | 4.85E-53 / 2.41E-52 / 100
f1 | 500 | 7.24E-93 / 2.98E-94 / 100 | 3.24E+04 / 2.18E+04 / 0 | 7.34E+04 / 1.32E+04 / 0 | 8.35E-25 / 8.86E-25 / 100
f1 | 1000 | 2.05E-89 / 3.16E-90 / 100 | 1.19E+05 / 1.04E+05 / 0 | 2.48E+05 / 2.12E+05 / 0 | 1.75E-17 / 2.73E-17 / 100
f2 | 100 | 1.89E-99 / 2.33E-99 / 100 | 8.85E+00 / 2.37E+00 / 0 | 4.29E+02 / 3.25E+02 / 0 | 4.01E-33 / 4.34E-33 / 100
f2 | 500 | 1.38E-99 / 2.16E-99 / 100 | 3.37E+02 / 2.17E+02 / 0 | 2.31E+03 / 2.03E+03 / 0 | 9.58E-18 / 2.36E-18 / 100
f2 | 1000 | 1.80E-99 / 1.37E-98 / 100 | 8.45E+02 / 3.43E+02 / 0 | 3.25E+04 / 4.13E+04 / 0 | 2.50E-11 / 1.66E-11 / 100
f3 | 100 | 3.84E+01 / 2.17E+00 / 0 | 2.22E+01 / 1.05E+01 / 0 | 2.47E+01 / 1.34E+01 / 0 | 4.55E+01 / 3.69E+01 / 0
f3 | 500 | 9.32E+01 / 5.32E+01 / 0 | 3.27E+01 / 2.33E+01 / 0 | 3.92E+01 / 2.34E+01 / 0 | 7.22E+01 / 5.01E+00 / 0
f3 | 1000 | 9.86E+01 / 3.24E+01 / 0 | 3.31E+01 / 1.28E+01 / 0 | 4.93E+01 / 3.99E+01 / 0 | 8.31E+01 / 4.36E+00 / 0
f4 | 100 | 9.67E+01 / 3.98E+01 / 0 | 9.00E+02 / 5.16E+02 / 0 | 1.40E+03 / 1.03E+03 / 0 | 6.82E+01 / 1.25E-01 / 100
f4 | 500 | 4.95E+02 / 3.14E+01 / 0 | 7.18E+06 / 1.46E+06 / 0 | 2.09E+07 / 1.37E+07 / 0 | 3.67E+02 / 8.23E-01 / 0
f4 | 1000 | 9.91E+02 / 2.48E+02 / 0 | 2.98E+07 / 1.35E+07 / 0 | 1.52E+08 / 1.00E+07 / 0 | 9.64E+02 / 3.74E+01 / 0
f5 | 100 | 7.83E+05 / 2.36E+05 / 0 | 2.17E+04 / 1.03E+04 / 0 | 3.60E+04 / 2.16E+04 / 0 | 3.45E-05 / 2.77E-05 / 100
f5 | 500 | 1.98E+07 / 1.32E+07 / 0 | 4.29E+05 / 3.16E+05 / 0 | 1.18E+06 / 1.03E+06 / 0 | 2.67E-03 / 4.38E-03 / 80
f5 | 1000 | 6.68E+07 / 3.44E+07 / 0 | 2.48E+06 / 1.23E+06 / 0 | 3.72E+06 / 3.49E+06 / 0 | 8.33E+00 / 5.12E+00 / 0
f6 | 100 | 2.65E-04 / 1.43E-04 / 100 | 1.12E+00 / 1.03E+00 / 0 | 8.28E-01 / 2.68E-01 / 0 | 2.34E-09 / 3.17E-10 / 100
f6 | 500 | 3.94E-04 / 3.22E-04 / 60 | 4.97E+01 / 3.66E+01 / 0 | 1.13E+02 / 1.01E+02 / 0 | 5.73E-05 / 4.33E-06 / 100
f6 | 1000 | 2.16E-03 / 1.36E-03 / 0 | 3.93E+02 / 2.78E+02 / 0 | 1.77E+03 / 2.33E+03 / 0 | 1.08E-04 / 1.62E-03 / 80
f7 | 100 | 0.00E+00 / 0.00E+00 / 100 | 6.27E+01 / 4.36E+01 / 0 | 3.54E+02 / 3.32E+02 / 0 | 0.00E+00 / 0.00E+00 / 100
f7 | 500 | 0.00E+00 / 0.00E+00 / 100 | 2.04E+03 / 1.97E+03 / 0 | 3.34E+03 / 4.56E+03 / 0 | 0.00E+00 / 0.00E+00 / 100
f7 | 1000 | 0.00E+00 / 0.00E+00 / 100 | 5.69E+03 / 1.66E+03 / 0 | 7.19E+03 / 2.14E+03 / 0 | 5.57E-11 / 3.15E-12 / 100
f8 | 100 | 4.44E-15 / 3.12E-15 / 100 | 5.15E+00 / 4.13E+00 / 0 | 4.93E+00 / 3.49E+00 / 0 | 2.15E-15 / 3.48E-15 / 100
f8 | 500 | 4.44E-15 / 3.05E-15 / 100 | 1.26E+01 / 1.00E+01 / 0 | 1.28E+01 / 2.28E+01 / 0 | 8.74E-12 / 5.39E-12 / 100
f8 | 1000 | 8.88E-16 / 2.88E-15 / 100 | 1.28E+01 / 1.44E+01 / 0 | 1.46E+01 / 2.03E+01 / 0 | 1.59E-07 / 1.63E-07 / 100
f9 | 100 | 0.00E+00 / 0.00E+00 / 100 | 1.52E-01 / 2.44E-01 / 0 | 4.54E-01 / 3.16E-01 / 0 | 0.00E+00 / 0.00E+00 / 100
f9 | 500 | 0.00E+00 / 0.00E+00 / 100 | 2.82E+02 / 2.13E+02 / 0 | 3.92E+02 / 1.08E+02 / 0 | 1.84E-16 / 8.44E-17 / 100
f9 | 1000 | 0.00E+00 / 0.00E+00 / 100 | 1.10E+03 / 1.34E+03 / 0 | 3.61E+03 / 1.17E+03 / 0 | 2.31E-13 / 3.29E-14 / 100
f10 | 100 | 8.16E-03 / 4.32E-03 / 100 | 1.10E+01 / 2.19E+01 / 0 | 1.74E+01 / 1.33E+01 / 0 | 2.73E-07 / 3.11E-07 / 100
f10 | 500 | 2.52E-02 / 1.99E-02 / 80 | 5.16E+01 / 3.33E+01 / 0 | 3.97E+05 / 2.96E+05 / 0 | 8.12E-05 / 4.98E-06 / 100
f10 | 1000 | 1.43E-02 / 1.03E-02 / 80 | 1.00E+05 / 2.13E+04 / 0 | 6.96E+06 / 1.34E+06 / 0 | 6.33E-04 / 3.77E-03 / 100

[Figure 2: Convergence curves of the WOA, SSA, ALO, and HGGWA algorithms (objective function value versus number of iterations, D = 100): (a) Sphere function, (b) Schwefel 2.22 function, (c) Rastrigin function, (d) Griewank function, (e) Quartic function, (f) Ackley function.]

[Figure 3: The average running times (in seconds) and iteration counts of SSA, WOA, GWO, and HGGWA on the benchmark functions f1, f2, f6, f7, f8, f9, and f10 (D = 100): (a) comparative analysis of running time, (b) comparative analysis of iterations.]

Table 5: Comparison results of 7 CEC'2008 benchmark functions by five algorithms. Each cell gives Mean / Std.

Function | Dim | CCPSO2 | DEwSAcc | MLCC | EPUS-PSO | HGGWA
F1 | 100 | 7.73E-14 / 3.23E-14 | 5.68E-14 / 0.00E+00 | 6.82E-14 / 2.32E-14 | 7.47E-01 / 1.70E-01 | 3.06E-17 / 2.93E-17
F1 | 500 | 3.00E-13 / 7.96E-14 | 2.09E-09 / 4.61E-09 | 4.29E-13 / 3.31E-14 | 8.45E+01 / 6.40E+00 | 2.75E-15 / 2.13E-15
F1 | 1000 | 5.18E-13 / 9.61E-14 | 8.78E-03 / 5.27E-03 | 8.45E-13 / 5.00E-14 | 5.53E+02 / 2.86E+01 | 1.06E-10 / 1.83E-11
F2 | 100 | 6.08E+00 / 7.83E+00 | 8.25E+00 / 5.32E+00 | 2.52E+01 / 8.72E+00 | 1.86E+01 / 2.26E+00 | 2.71E+00 / 2.39E+00
F2 | 500 | 5.79E+01 / 4.21E+01 | 7.57E+01 / 3.04E+00 | 6.66E+01 / 5.69E+00 | 4.35E+01 / 5.51E-01 | 3.12E+01 / 4.91E+00
F2 | 1000 | 7.82E+01 / 4.25E+01 | 9.60E+01 / 1.81E+00 | 1.08E+02 / 4.75E+00 | 4.66E+01 / 4.00E-01 | 2.71E+01 / 5.66E+00
F3 | 100 | 4.23E+02 / 8.65E+02 | 1.45E+02 / 5.84E+01 | 1.49E+02 / 5.72E+01 | 4.99E+03 / 5.35E+03 | 9.81E+01 / 2.25E-01
F3 | 500 | 7.24E+02 / 1.54E+02 | 1.81E+03 / 2.74E+02 | 9.24E+02 / 1.72E+02 | 5.77E+04 / 8.04E+03 | 4.96E+02 / 6.23E-01
F3 | 1000 | 1.33E+03 / 2.63E+02 | 9.14E+03 / 1.26E+03 | 1.79E+03 / 1.58E+02 | 8.37E+05 / 1.52E+05 | 9.97E+02 / 3.58E+01
F4 | 100 | 3.98E-02 / 1.99E-01 | 4.37E+00 / 7.65E+00 | 4.38E-13 / 9.21E-14 | 4.71E+02 / 5.94E+01 | 0.00E+00 / 0.00E+00
F4 | 500 | 3.98E-02 / 1.99E-01 | 3.64E+02 / 5.23E+01 | 1.79E-11 / 6.31E-11 | 3.49E+03 / 1.12E+02 | 0.00E+00 / 0.00E+00
F4 | 1000 | 1.99E-01 / 4.06E-01 | 1.82E+03 / 1.37E+02 | 1.37E-10 / 3.37E-10 | 7.58E+03 / 1.51E+02 | 4.93E-11 / 2.87E-12
F5 | 100 | 3.45E-03 / 4.88E-03 | 3.07E-14 / 7.86E-15 | 3.41E-14 / 1.16E-14 | 3.72E-01 / 5.60E-02 | 0.00E+00 / 0.00E+00
F5 | 500 | 1.18E-03 / 4.61E-03 | 6.90E-04 / 2.41E-03 | 2.12E-13 / 2.47E-14 | 1.64E+00 / 4.69E-02 | 2.16E-16 / 7.56E-17
F5 | 1000 | 1.18E-03 / 3.27E-03 | 3.58E-03 / 5.73E-03 | 4.18E-13 / 2.78E-14 | 5.89E+00 / 3.91E-01 | 5.49E-13 / 2.35E-14
F6 | 100 | 1.44E-13 / 3.06E-14 | 1.13E-13 / 1.53E-14 | 1.11E-13 / 7.86E-15 | 2.06E+00 / 4.40E-01 | 3.21E-15 / 2.48E-15
F6 | 500 | 5.34E-13 / 8.61E-14 | 4.80E-01 / 5.73E-01 | 5.34E-13 / 7.01E-14 | 6.64E+00 / 4.49E-01 | 6.74E-12 / 4.39E-12
F6 | 1000 | 1.02E-12 / 1.68E-13 | 2.29E+00 / 2.98E-01 | 1.06E-12 / 7.68E-14 | 1.89E+01 / 2.49E+00 | 2.39E-07 / 2.13E-07
F7 | 100 | -1.50E+03 / 1.04E+01 | -1.37E+03 / 2.46E+01 | -1.54E+03 / 2.52E+00 | -8.55E+02 / 1.35E+01 | -7.00E+02 / 6.64E-01
F7 | 500 | -7.23E+03 / 4.16E+01 | -5.74E+03 / 1.83E+02 | -7.43E+03 / 8.03E+00 | -3.51E+03 / 2.10E+01 | -8.83E+03 / 3.54E+00
F7 | 1000 | -1.43E+04 / 8.27E+01 | -1.05E+04 / 4.18E+02 | -1.47E+04 / 1.51E+01 | -6.62E+03 / 3.18E+01 | -3.69E+04 / 2.16E+02


Among them, the result of HGGWA is slightly worse than that of CCPSO2 for function F7 in the 100-dimensional case, and the two algorithms obtain similar optimization results for function F6 in the 500-dimensional case. In addition, at first glance it might seem that HGGWA and MLCC have similar performance; however, HGGWA achieves better convergence accuracy than MLCC under the same number of iterations. In particular, the optimization results of the HGGWA algorithm are superior to those of DEwSAcc and EPUS-PSO in all dimensions. The results of HGGWA scaled very well from 100-D to 1000-D on F1, F4, F5, and F6, outperforming DEwSAcc and EPUS-PSO in all cases.

5.4. Sensitivity Analysis of the Nonlinear Adjustment Coefficient k. This section discusses the effect of the nonlinear adjustment coefficient on the convergence performance of the algorithm. In the HGGWA algorithm, the role of the parameter a is to balance global exploration and local exploitation. In (12), the nonlinear adjustment coefficient k is a key parameter that controls the range of variation of the convergence factor. We therefore selected four values, k = 0.5, 1, 1.5, and 2, for numerical experiments to analyze the effect of the nonlinear adjustment coefficient on the performance of the algorithm; the results of the comparison are shown in Table 6, where the bold entries in the original mark the best result among the four settings.
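Equation (12) is not reproduced in this section, so the sketch below assumes a common nonlinear decay schedule of the form a(t) = a_init * (1 - (t/Maxiter)^k); it is meant only to illustrate how k reshapes the decay of the convergence factor, not to restate the paper's exact formula.

```python
def convergence_factor(t, max_iter, k, a_init=2.0):
    """Assumed nonlinear schedule for the GWO convergence factor a.

    In the standard GWO, a decays linearly from 2 to 0; under this
    assumed schedule, an exponent k < 1 shrinks a quickly (earlier
    exploitation), while k > 1 keeps a large for longer (prolonged
    exploration).
    """
    return a_init * (1.0 - (t / max_iter) ** k)

# How the four tested settings of k shape the decay over 1000 iterations:
for k in (0.5, 1.0, 1.5, 2.0):
    samples = [round(convergence_factor(t, 1000, k), 3) for t in (0, 250, 500, 750)]
    print(f"k = {k}: a(t) at t = 0, 250, 500, 750 -> {samples}")
```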

In general, the value of the nonlinear adjustment coefficient k has no dramatic effect on the performance of the HGGWA algorithm. On closer inspection, one can see that, except for function f3, the optimization performance of the algorithm is best when the nonlinear adjustment coefficient k is 0.5; for function f3, k = 1.5 is the best choice. For functions f7 and f9, the HGGWA algorithm obtains the optimal solution 0 for all tested values of k. Therefore, for most test functions, the optimal value of the nonlinear adjustment coefficient k is 0.5.

5.5. Global Convergence Analysis of HGGWA. In the HGGWA algorithm, the following four improved strategies are used to ensure the global convergence of the algorithm: (1) an initial population strategy based on Opposition-Based Learning; (2) a selection operation based on optimal retention; (3) a crossover operation based on a population partitioning mechanism; and (4) a mutation operation applied to elite individuals.

The idea behind applying the HGGWA algorithm to high-dimensional function optimization problems is as follows. Firstly, regarding initialization, the basic GWO generates the initial population in a random manner, which can strongly affect the search efficiency on high-dimensional problems: the algorithm cannot effectively search the entire solution space if the initial grey wolf population is clustered in a small range. Therefore, the Opposition-Based Learning strategy is used to initialize the population, which makes the initial population evenly distributed in the solution space and lays a good foundation for the global search of the algorithm. Secondly, the optimal retention strategy preserves the best individual of the parent population during population evolution, so the next generation can evolve in the optimal direction; in addition, individuals with higher fitness are selected by the roulette-wheel operation, which maintains the dominance relationship between individuals in the population and gives the algorithm good global convergence. Thirdly, the crossover operation is completed under population partitioning, which achieves dimensionality reduction while maintaining the diversity of the population. Finally, the search direction of the grey wolf population is guided by the mutation of elite individuals, which effectively prevents the algorithm from falling into a local optimum. Altogether, through these four improved strategies, the HGGWA algorithm shows good global convergence in solving high-dimensional function optimization problems.
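As a minimal sketch of the first of these strategies, the following assumes the standard Opposition-Based Learning construction of [65]: draw a random population, form the opposite population x' = lb + ub - x, and keep the fittest half of the union. Function names and the test objective are illustrative, not the paper's code.

```python
import numpy as np

def obl_initialize(pop_size, dim, lb, ub, fitness):
    """Opposition-Based Learning initialization (sketch, minimization)."""
    X = lb + (ub - lb) * np.random.rand(pop_size, dim)  # random population
    X_opp = lb + ub - X                                 # opposite population
    union = np.vstack([X, X_opp])                       # 2 * pop_size candidates
    scores = np.apply_along_axis(fitness, 1, union)     # evaluate all candidates
    return union[np.argsort(scores)[:pop_size]]         # keep the fittest half

# Example on the Sphere function in 100 dimensions:
sphere = lambda x: float(np.sum(x ** 2))
population = obl_initialize(pop_size=50, dim=100, lb=-100.0, ub=100.0, fitness=sphere)
print(population.shape)  # (50, 100)
```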

6. Conclusion

In order to overcome the tendency of the GWO algorithm to fall into local optima when solving large-scale global optimization problems, this paper proposes a Hybrid Genetic Grey Wolf Algorithm (HGGWA) that integrates three genetic operators into the algorithm. The improved algorithm initializes the population with the Opposition-Based Learning strategy to improve search efficiency; the strategy of population division based on cooperative coevolution reduces the scale of the problem; and the mutation operation on elite individuals effectively prevents the algorithm from falling into local optima. The performance of HGGWA has been evaluated using 10 classical benchmark functions and 7 CEC'2008 high-dimensional functions. From our experimental results, several conclusions can be drawn.

The results have shown that the algorithm is capable of grouping interacting variables with great accuracy for the majority of the benchmark functions. A comparative study between HGGWA and other state-of-the-art algorithms was conducted, and the experimental results showed that the HGGWA algorithm exhibits better global convergence whether it is solving separable or nonseparable functions.
(1) In tests on 10 classical benchmark functions, compared with the results of WOA, SSA, and ALO, the HGGWA algorithm greatly improves convergence accuracy. Except for the Schwefel problem and the Rosenbrock function, the HGGWA algorithm achieves an 80%–100% success ratio on all benchmark functions with 100 dimensions, which demonstrates its effectiveness for solving high-dimensional functions.
(2) Comparison on the seven CEC'2008 large-scale global optimization problems shows that the HGGWA algorithm reaches the global optimal value for the separable functions F1, F4, and F6, while its effect on the nonseparable problems is less satisfactory. Even so, the HGGWA algorithm still exhibits better global convergence than the other algorithms, CCPSO2, DEwSAcc, MLCC, and EPUS-PSO, on most of the functions.

Table 6: Comparison of the optimization performance of different k values for HGGWA. Each cell gives Mean / Std.

Function | k = 0.5 | k = 1 | k = 1.5 | k = 2
f1 | 8.43E-60 / 5.36E-60 | 2.25E-54 / 2.03E-54 | 5.58E-53 / 4.31E-53 | 2.31E-51 / 3.16E-51
f2 | 2.89E-34 / 2.31E-34 | 2.06E-33 / 1.89E-33 | 4.70E-32 / 3.65E-32 | 5.76E-30 / 4.22E-30
f3 | 1.11E+00 / 1.02E+00 | 3.14E+01 / 2.19E+01 | 4.03E-01 / 2.38E-01 | 3.60E+01 / 2.16E+01
f4 | 9.78E+01 / 8.36E+01 | 9.82E+01 / 7.43E+01 | 9.81E+01 / 7.34E+01 | 9.83E+01 / 4.61E+01
f5 | 2.18E-03 / 1.34E-03 | 8.87E-02 / 6.67E-02 | 7.81E+00 / 6.13E+00 | 1.74E-02 / 2.34E-02
f6 | 1.31E-03 / 1.13E-03 | 1.62E-03 / 2.43E-04 | 2.99E-03 / 1.04E-03 | 1.84E-03 / 1.34E-03
f7 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00
f8 | 1.58E-14 / 1.23E-15 | 2.93E-14 / 1.37E-14 | 2.22E-14 / 1.49E-14 | 2.51E-14 / 1.08E-14
f9 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00 | 0.00E+00 / 0.00E+00
f10 | 2.99E-02 / 2.49E-02 | 3.38E-02 / 1.24E-03 | 8.42E-02 / 6.14E-02 | 7.29E-02 / 8.13E-03

(3) Analysis of the running times of the compared algorithms shows that the convergence time of the HGGWA algorithm has clear advantages over recently proposed algorithms such as SSA and WOA. For the functions f1, f2, f6, f7, f8, f9, and f10, the running time of the HGGWA algorithm stays between 1 s and 3 s under the specified convergence accuracy, which shows the excellent stability of HGGWA.
(4) In the future, we plan to investigate more efficient population partitioning mechanisms to adapt to different nonseparable problems in the crossover operation. We are also interested in applying HGGWA to real-world problems to ascertain its true potential as a valuable optimization technique for large-scale optimization, such as the setup of large-scale multilayer sensor networks [73] in the Internet of Things.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Authors' Contributions

Qinghua Gu and Xuexian Li contributed equally to this work.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Grant Nos. 51774228 and 51404182), the Natural Science Foundation of Shaanxi Province (2017JM5043), and the Foundation of the Shaanxi Educational Committee (17JK0425).

References

[1] R. Zhang and C. Wu, "A hybrid approach to large-scale job shop scheduling," Applied Intelligence, vol. 32, no. 1, pp. 47–59, 2010.

[2] M. Gendreau and C. D. Tarantilis, Solving Large-Scale Vehicle Routing Problems with Time Windows: The State-of-the-Art, Cirrelt, Montreal, 2010.

[3] M. B. Liu, S. K. Tso, and Y. Cheng, "An extended nonlinear primal-dual interior-point algorithm for reactive-power optimization of large-scale power systems with discrete control variables," IEEE Power Engineering Review, vol. 22, no. 9, p. 56, 2002.

[4] I. Fister, I. Fister Jr., J. Brest, and V. Zumer, "Memetic artificial bee colony algorithm for large-scale global optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '12), pp. 1–8, IEEE, Brisbane, Australia, June 2012.

[5] A. Zamuda, J. Brest, B. Bosovic, and V. Zumer, "Large scale global optimization using differential evolution with self-adaptation and cooperative co-evolution," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), IEEE, June 2008.

[6] T. Weise, R. Chiong, and K. Tang, "Evolutionary optimization: pitfalls and booby traps," Journal of Computer Science and Technology, vol. 27, no. 5, pp. 907–936, 2012.

[7] L. Lafleur, Descartes: Discourse on Method, Pearson Schweiz AG, Zug, Switzerland, 1956.

[8] Y.-S. Ong, M. H. Lim, and X. Chen, "Memetic computation: past, present & future [research frontier]," IEEE Computational Intelligence Magazine, vol. 5, no. 2, pp. 24–31, 2010.

[9] N. Krasnogor and J. Smith, "A tutorial for competent memetic algorithms: model, taxonomy, and design issues," IEEE Transactions on Evolutionary Computation, vol. 9, no. 5, pp. 474–488, 2005.

[10] S. Jiang, M. Lian, C. Lu, Q. Gu, S. Ruan, and X. Xie, "Ensemble prediction algorithm of anomaly monitoring based on big data analysis platform of open-pit mine slope," Complexity, vol. 2018, Article ID 1048756, 13 pages, 2018.

[11] R. S. Rahnamayan, H. R. Tizhoosh, and M. M. A. Salama, "Opposition-based differential evolution," IEEE Transactions on Evolutionary Computation, vol. 12, no. 1, pp. 64–79, 2008.

[12] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.

[13] E. Daniel, J. Anitha, K. K. Kamaleshwaran, and I. Rani, "Optimum spectrum mask based medical image fusion using gray wolf optimization," Biomedical Signal Processing and Control, vol. 34, pp. 36–43, 2017.

[14] A. A. M. El-Gaafary, Y. S. Mohamed, A. M. Hemeida, and A.-A. A. Mohamed, "Grey wolf optimization for multi input multi output system," Universal Journal of Communications and Network, vol. 3, no. 1, pp. 1–6, 2015.

[15] G. M. Komaki and V. Kayvanfar, "Grey wolf optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109–120, 2015.

[16] S. Jiang, M. Lian, C. Lu et al., "SVM-DS fusion based soft fault detection and diagnosis in solar water heaters," Energy Exploration & Exploitation, 2018.

[17] Q. Gu, X. Li, C. Lu et al., "Hybrid genetic grey wolf algorithm for high dimensional complex function optimization," Control and Decision, pp. 1–8, 2019.

[18] M. A. Potter and K. A. De Jong, "A cooperative coevolutionary approach to function optimization," in Parallel Problem Solving from Nature – PPSN III, vol. 866 of Lecture Notes in Computer Science, pp. 249–257, Springer, Berlin, Germany, 1994.

[19] M. A. Potter, The Design and Analysis of a Computational Model of Cooperative Coevolution, George Mason University, Fairfax, VA, USA, 1997.

[20] F. van den Bergh and A. P. Engelbrecht, "A cooperative approach to particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225–239, 2004.

[21] M. El-Abd, "A cooperative approach to the artificial bee colony algorithm," in Proceedings of the 2010 IEEE World Congress on Computational Intelligence (WCCI 2010) – 2010 IEEE Congress on Evolutionary Computation (CEC 2010), IEEE, Spain, July 2010.

[22] Y. Shi, H. Teng, and Z. Li, "Cooperative co-evolutionary differential evolution for function optimization," in Proceedings of the 1st International Conference on Natural Computation (ICNC '05), Lecture Notes in Computer Science, pp. 1080–1088, August 2005.

[23] Z. Yang, K. Tang, and X. Yao, "Large scale evolutionary optimization using cooperative coevolution," Information Sciences, vol. 178, no. 15, pp. 2985–2999, 2008.

[24] X. Li and X. Yao, "Tackling high dimensional nonseparable optimization problems by cooperatively coevolving particle swarms," in Proceedings of the Congress on Evolutionary Computation (CEC '09), pp. 1546–1553, IEEE, Trondheim, Norway, May 2009.

[25] M. N. Omidvar, X. Li, Z. Yang, and X. Yao, "Cooperative co-evolution for large scale optimization through more frequent random grouping," in Proceedings of the 2010 IEEE Congress on Evolutionary Computation (CEC 2010), Spain, July 2010.

[26] T. Ray and X. Yao, "A cooperative coevolutionary algorithm with correlation based adaptive variable partitioning," in Proceedings of the 2009 IEEE Congress on Evolutionary Computation (CEC), pp. 983–989, IEEE, Trondheim, Norway, May 2009.

[27] W. Chen, T. Weise, Z. Yang, and K. Tang, "Large-scale global optimization using cooperative coevolution with variable interaction learning," in Lecture Notes in Computer Science, vol. 6239, pp. 300–309, 2010.

[28] M. N. Omidvar, X. Li, Y. Mei, and X. Yao, "Cooperative co-evolution with differential grouping for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 18, no. 3, pp. 378–393, 2014.

[29] M. N. Omidvar, X. Li, and X. Yao, "Smart use of computational resources based on contribution for cooperative co-evolutionary algorithms," in Proceedings of the 13th Annual Genetic and Evolutionary Computation Conference (GECCO '11), pp. 1115–1122, IEEE, Ireland, July 2011.

[30] R. Storn, "On the usage of differential evolution for function optimization," in Proceedings of the Biennial Conference of the North American Fuzzy Information Processing Society (NAFIPS '96), pp. 519–523, June 1996.

[31] X. Li and X. Yao, "Cooperatively coevolving particle swarms for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 16, no. 2, pp. 210–224, 2012.

[32] K. Weicker and N. Weicker, "On the improvement of coevolutionary optimizers by learning variable interdependencies," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC 1999), pp. 1627–1632, IEEE, July 1999.

[33] E. Sayed, D. Essam, and R. Sarker, "Using hybrid dependency identification with a memetic algorithm for large scale optimization problems," in Lecture Notes in Computer Science, vol. 7673, pp. 168–177, 2012.

[34] Y. Liu, X. Yao, Q. Zhao, and T. Higuchi, "Scaling up fast evolutionary programming with cooperative coevolution," in Proceedings of the 2001 Congress on Evolutionary Computation, pp. 1101–1108, IEEE, Seoul, South Korea, 2001.

[35] E. Sayed, D. Essam, and R. Sarker, "Dependency identification technique for large scale optimization problems," in Proceedings of the 2012 IEEE Congress on Evolutionary Computation (CEC 2012), Australia, June 2012.

[36] Y. Ren and Y. Wu, "An efficient algorithm for high-dimensional function optimization," Soft Computing, vol. 17, no. 6, pp. 995–1004, 2013.

[37] J. Liu and K. Tang, "Scaling up covariance matrix adaptation evolution strategy using cooperative coevolution," in Intelligent Data Engineering and Automated Learning – IDEAL 2013, vol. 8206 of Lecture Notes in Computer Science, pp. 350–357, Springer, Berlin, Germany, 2013.

[38] Z. Yang, K. Tang, and X. Yao, "Multilevel cooperative coevolution for large scale optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), pp. 1663–1670, June 2008.

[39] M. N. Omidvar, Y. Mei, and X. Li, "Effective decomposition of large-scale separable continuous functions for cooperative co-evolutionary algorithms," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC 2014), pp. 1305–1312, China, July 2014.

[40] S. Mahdavi, E. M. Shiri, and S. Rahnamayan, "Cooperative co-evolution with a new decomposition method for large-scale optimization," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC), 2014.

[41] B. Kazimipour, M. N. Omidvar, X. Li, and A. K. Qin, "A sensitivity analysis of contribution-based cooperative co-evolutionary algorithms," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2015), pp. 417–424, Japan, May 2015.

[42] Z. Cao, L. Wang, Y. Shi et al., "An effective cooperative coevolution framework integrating global and local search for large scale optimization problems," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2015), pp. 1986–1993, Japan, May 2015.

[43] X. Peng, K. Liu, and Y. Jin, "A dynamic optimization approach to the design of cooperative co-evolutionary algorithms," Knowledge-Based Systems, vol. 109, pp. 174–186, 2016.

[44] H. K. Singh and T. Ray, "Divide and conquer in coevolution: a difficult balancing act," in Agent-Based Evolutionary Search, vol. 5 of Adaptation, Learning, and Optimization, pp. 117–138, Springer, Berlin, Germany, 2010.

[45] X. Peng and Y. Wu, "Large-scale cooperative co-evolution using niching-based multi-modal optimization and adaptive fast clustering," Swarm & Evolutionary Computation, vol. 35, 2017.

[46] M. Yang, M. N. Omidvar, C. Li et al., "Efficient resource allocation in cooperative co-evolution for large-scale global optimization," IEEE Transactions on Evolutionary Computation, vol. 21, no. 4, pp. 493–505, 2017.

[47] M. N. Omidvar, X. Li, and X. Yao, "Cooperative co-evolution with delta grouping for large scale non-separable function optimization," in Proceedings of the 2010 IEEE Congress on Evolutionary Computation (CEC 2010), pp. 1–8, Barcelona, Spain, July 2010.

[48] X. Peng and Y. Wu, "Enhancing cooperative coevolution with selective multiple populations for large-scale global optimization," Complexity, vol. 2018, Article ID 9267054, 15 pages, 2018.

[49] S.-T. Hsieh, T.-Y. Sun, C.-C. Liu, and S.-J. Tsai, "Solving large scale global optimization using improved particle swarm optimizer," in Proceedings of the 2008 IEEE Congress on Evolutionary Computation (CEC 2008), pp. 1777–1784, China, June 2008.

[50] T. Hatanaka, T. Korenaga, N. Kondo et al., Search Performance Improvement for PSO in High Dimensional Space, InTech, 2009.

[51] J. Fan, J. Wang, and M. Han, "Cooperative coevolution for large-scale optimization based on kernel fuzzy clustering and variable trust region methods," IEEE Transactions on Fuzzy Systems, vol. 22, no. 4, pp. 829–839, 2014.

[52] A.-R. Hedar and A. F. Ali, "Genetic algorithm with population partitioning and space reduction for high dimensional problems," in Proceedings of the 2009 International Conference on Computer Engineering and Systems (ICCES '09), pp. 151–156, Egypt, December 2009.

[53] J. Q. Zhang and A. C. Sanderson, "JADE: adaptive differential evolution with optional external archive," IEEE Transactions on Evolutionary Computation, vol. 13, no. 5, pp. 945–958, 2009.

[54] L. M. Hvattum and F. Glover, "Finding local optima of high-dimensional functions using direct search methods," European Journal of Operational Research, vol. 195, no. 1, pp. 31–45, 2009.

[55] C. Liu and B. Li, "Memetic algorithm with adaptive local search depth for large scale global optimization," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC 2014), pp. 82–88, China, July 2014.

[56] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257–1263, 2015.

[57] A. Zhu, C. Xu, Z. Li, J. Wu, and Z. Liu, "Hybridizing grey wolf optimization with differential evolution for global optimization and test scheduling for 3D stacked SoC," Journal of Systems Engineering and Electronics, vol. 26, no. 2, pp. 317–328, 2015.

[58] V. Chahar and D. Kumar, "An astrophysics-inspired grey wolf algorithm for numerical optimization and its application to engineering design problems," Advances in Engineering Software, vol. 112, pp. 231–254, 2017.

[59] D. Guha, P. K. Roy, and S. Banerjee, "Load frequency control of large scale power system using quasi-oppositional grey wolf optimization algorithm," Engineering Science and Technology, an International Journal, vol. 19, no. 4, pp. 1693–1713, 2016.

[60] C. Lu, S. Xiao, X. Li, and L. Gao, "An effective multi-objective discrete grey wolf optimizer for a real-world scheduling problem in welding production," Advances in Engineering Software, vol. 99, pp. 161–176, 2016.

[61] E. Emary, H. M. Zawbaa, and A. E. Hassanien, "Binary grey wolf optimization approaches for feature selection," Neurocomputing, vol. 172, pp. 371–381, 2016.

[62] W. Long, J. Jiao, X. Liang, and M. Tang, "Inspired grey wolf optimizer for solving large-scale function optimization problems," Applied Mathematical Modelling, vol. 60, pp. 112–126, 2018.

[63] S. Gupta and K. Deep, "Performance of grey wolf optimizer on large scale problems," in Proceedings of Mathematical Sciences and Its Applications, American Institute of Physics Conference Series, Uttar Pradesh, India, 2017.

[64] R. L. Haupt and S. E. Haupt, "Practical genetic algorithms," Journal of the American Statistical Association, vol. 100, no. 100, p. 253, 2005.

[65] H. R. Tizhoosh, "Opposition-based learning: a new scheme for machine intelligence," in Proceedings of the International Conference on Computational Intelligence for Modelling, Control and Automation (CIMCA) and International Conference on Intelligent Agents, Web Technologies and Internet Commerce (IAWTIC '05), pp. 695–701, Vienna, Austria, November 2005.

[66] H. Wang, Z. Wu, S. Rahnamayan, Y. Liu, and M. Ventresca, "Enhancing particle swarm optimization using generalized opposition-based learning," Information Sciences, vol. 181, no. 20, pp. 4699–4714, 2011.

[67] H. Wang, S. Rahnamayan, and Z. Wu, "Parallel differential evolution with self-adapting control parameters and generalized opposition-based learning for solving high-dimensional optimization problems," Journal of Parallel and Distributed Computing, vol. 73, no. 1, pp. 62–73, 2013.

[68] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.

[69] A. P. Engelbrecht, Fundamentals of Computational Swarm Intelligence, John Wiley & Sons, 2006.

[70] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.

[71] S. Mirjalili, A. H. Gandomi, S. Z. Mirjalili, S. Saremi, H. Faris, and S. M. Mirjalili, "Salp swarm algorithm: a bio-inspired optimizer for engineering design problems," Advances in Engineering Software, vol. 114, pp. 163–191, 2017.

[72] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80–98, 2015.

[73] Q. Gu, S. Jiang, M. Lian, and C. Lu, "Health and safety situation awareness model and emergency management based on multi-sensor signal fusion," IEEE Access, vol. 7, pp. 958–968, 2019.

Hindawiwwwhindawicom Volume 2018

MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Mathematical Problems in Engineering

Applied MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Probability and StatisticsHindawiwwwhindawicom Volume 2018

Journal of

Hindawiwwwhindawicom Volume 2018

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawiwwwhindawicom Volume 2018

OptimizationJournal of

Hindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom Volume 2018

Engineering Mathematics

International Journal of

Hindawiwwwhindawicom Volume 2018

Operations ResearchAdvances in

Journal of

Hindawiwwwhindawicom Volume 2018

Function SpacesAbstract and Applied AnalysisHindawiwwwhindawicom Volume 2018

International Journal of Mathematics and Mathematical Sciences

Hindawiwwwhindawicom Volume 2018

Hindawi Publishing Corporation httpwwwhindawicom Volume 2013Hindawiwwwhindawicom

The Scientific World Journal

Volume 2018

Hindawiwwwhindawicom Volume 2018Volume 2018

Numerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisAdvances inAdvances in Discrete Dynamics in

Nature and SocietyHindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom

Dierential EquationsInternational Journal of

Volume 2018

Hindawiwwwhindawicom Volume 2018

Decision SciencesAdvances in

Hindawiwwwhindawicom Volume 2018

AnalysisInternational Journal of

Hindawiwwwhindawicom Volume 2018

Stochastic AnalysisInternational Journal of

Submit your manuscripts atwwwhindawicom

Page 11: Hybrid Genetic Grey Wolf Algorithm for Large-Scale Global ...downloads.hindawi.com/journals/complexity/2019/2653512.pdf · metaheuristic algorithms such as genetic algorithm (GA),

Complexity 11

Table4Com

paris

onresults

of10

high

-dim

ensio

nalb

enchmarkfunctio

nsby

WOASSA

ALO

and

HGGWA

Functio

nDim

WOA

SSA

ALO

HGGWA

Mean

Std

SRMean

Std

SRMean

Std

SRMean

Std

SR

119891 1100

577

E-96

469

E-96

100

101E-02

124E

-02

0304

E-01

231E-01

0485E-53

241E-52

100

500

724E

-93

298

E-94

100

324E+

04218E+

040

734E

+04

132E

+04

0835E-25

886E-25

100

1000

205

E-89

316E-90

100

119E

+05

104E

+05

0248E+

05212E+

050

175E

-17

273E-17

100

119891 2100

189E

-99

233E-99

100

885E+

00237E+

000

429E+

02325E+

020

401E-33

434E-33

100

500

138E

-99

216E-99

100

337E+

02217E+

020

231E+

03203E+

030

958E

-18

236E-18

100

1000

180E

-99

137E

-98

100

845E+

02343E+

020

325E+

04413E+

040

250E-11

166E

-11

100

119891 3100

384E+

01217E+

000

222E+

0110

5E+0

10

247E+

0113

4E+0

10

455E+

01369E+

010

500

932E

+01

532E+

010

327E+

01233E+

010

392E+

01234E+

010

722E

+01

501E+

000

1000

986E

+01

324E+

010

331E+

0112

8E+0

10

493E+

01399E+

010

831E+

01436E+

000

119891 4100

967E

+01

398E+

010

900E

+02

516E+

020

140E

+03

103E

+03

0682

E+01

125E

-01

100

500

495E+

02314E+

010

718E

+06

146E

+06

0209E+

0713

7E+0

70

367

E+02

823E-01

01000

991E+0

2248E+

020

298E+

0713

5E+0

70

152E

+08

100E

+07

0964

E+02

374E+

010

119891 5100

783E

+05

236E+

050

217E+

0410

3E+0

40

360

E+04

216E+

040

345

E-05

277

E-05

100

500

198E

+07

132E

+07

0429E+

05316E+

050

118E

+06

103E

+06

0267

E-03

438

E-03

80

1000

668E+

07344

E+07

0248E+

0612

3E+0

60

372E+

06349E+

060

833E+

00512E+

000

119891 6100

265E-04

143E

-04

100

112E

+00

103E

+00

0828E-01

268E-01

0234

E-09

317E-10

100

500

394E-04

322E-04

60

497E+

01366

E+01

0113E

+02

101E+0

20

573E-05

433E-06

100

1000

216E-03

136E

-03

0393E+

02278E+

020

177E

+03

233E+

030

108E

-04

162E

-03

80

119891 7100

000

E+00

000

E+00

100

627E+

01436E+

010

354E+

02332E+

020

000

E+00

000

E+00

100

500

000

E+00

000

E+00

100

204

E+03

197E

+03

0334E+

03456E+

030

000

E+00

000

E+00

100

1000

000

E+00

000

E+00

100

569E+

0316

6E+0

30

719E

+03

214E+

030

557E-11

315E-12

100

119891 8100

444

E-15

312E-15

100

515E+

00413E+

000

493E+

00349E+

000

215E-15

348

E-15

100

500

444

E-15

305

E-15

100

126E

+01

100E

+01

012

8E+0

1228E+

010

874E-12

539E-12

100

1000

888

E-16

288

E-15

100

128E

+01

144E

+01

014

6E+0

1203E+

010

159E

-07

163E

-07

100

119891 9100

000

E+00

000

E+00

100

152E

-01

244

E-01

0454E-01

316E-01

0000

E+00

000

E+00

100

500

000

E+00

000

E+00

100

282E+

02213E+

020

392E+

0210

8E+0

20

184E

-16

844

E-17

100

1000

000

E+00

000

E+00

100

110E

+03

134E

+03

0361E+

03117E

+03

0231E-13

329E-14

100

119891 10100

816E-03

432E-03

100

110E

+01

219E+

010

174E

+01

133E

+01

0273E-07

311E-07

100

500

252E-02

199E

-02

80

516E+

01333E+

010

397E+

05296E+

050

812E-05

498

E-06

100

1000

143E

-02

103E

-02

80

100E

+05

213E+

040

696E+

0613

4E+0

60

633E-04

377

E-03

100

12 Complexity

WOASSAALOHGGWA

500 10000Number of iterations

10minus100

10minus50

100

1050

Obj

ectiv

e fun

ctio

n va

lue

(a) Sphere function

WOASSAALOHGGWA

500 10000Number of iterations

10minus50

100

1050

Obj

ectiv

e fun

ctio

n va

lue

(b) Schwefel 222 function

WOASSAALOHGGWA

500 10000Number of iterations

10minus20

10minus10

100

1010

Obj

ectiv

e fun

ctio

n va

lue

(c) Rastrigin function

WOASSAALOHGGWA

500 10000Number of iterations

10minus20

10minus10

100

1010

Obj

ectiv

e fun

ctio

n va

lue

(d) Griewank function

WOASSAALOHGGWA

200 400 600 800 10000Number of iterations

10minus5

100

105

Obj

ectiv

e fun

ctio

n va

lue

(e) Quartic function

WOASSAALOHGGWA

10minus20

10minus10

100

1010

Obj

ectiv

e fun

ctio

n va

lue

500 10000Number of iterations

(f) Ackley function

Figure 2 Convergence curves of WOA SSA ALO and HGGWA algorithms

Comparative analysis of running time

459

344

053

13

388

239

072

186

537

366

317

285

423

233

122

216

399

217

057

139

259

218

069

121

463

333352

195

0

1

2

3

4

5

6

Runn

ing

Tim

es (s

)

f2 f6 f7 f8 f9 f10f1Benchmark Functions (D=100)

SSAWOAGWOHGGWA(a) Comparative analysis of running time

Comparative analysis of iterations

302

233

198221

334290

263237

723

614

491

407 398

258

365312 263

234201

175

214185

166149

186

85

298

132

SSAWOAGWOHGGWA

f2 f6 f7 f8 f9 f10f1Benchmark Functions (D=100)

100

200

300

400

500

600

700

Num

ber o

f ite

ratio

ns

(b) Comparative analysis of iterations

Figure 3 The average running times and iterations of SSA WOA GWO and HGGWA on different test functions

Complexity 13

Table5Com

paris

onresults

of7CE

Crsquo200

8benchm

arkfunctio

nsby

fivea

lgorith

ms

Functio

nDim

CCP

SO2

DEw

SAcc

MLC

CEP

US-PS

OHGGWA

Mean

Std

Mean

Std

Mean

Std

Mean

Std

Mean

Std

F1100

773E

-14

323E-14

568E-14

000

E+00

682E-14

232E-14

747E

-01

170E

-01

306

E-17

293E-17

500

300

E-13

796E

-14

209E-09

461E-09

429E-13

331E-14

845E+

01640

E+00

275E-15

213E-15

1000

518E-13

961E-14

878E-03

527E-03

845E-13

500

E-14

553E+

02286E+

0110

6E-10

183E

-11

F2100

608E+

0078

3E+0

0825E+

00532E+

00252E+

01872E+

0018

6E+0

1226E+

00271E+

00239

E+00

500

579E+

01421E+

0175

7E+0

1304

E+00

666

E+01

569E+

00435E+

01551E-01

312E+

01491E+

001000

782E

+01

425E+

0196

0E+0

118

1E+0

010

8E+0

2475E+

00466

E+01

400

E-01

271E+

01566

E+00

F3100

423E+

02865E+

0214

5E+0

2584E+

0114

9E+0

2572E+

01499E+

03535E+

03981E+

01225E-01

500

724E

+02

154E

+02

181E+0

3274E+

0292

4E+0

217

2E+0

2577E+

04804

E+03

496

E+02

623E-01

1000

133E

+03

263E+

02914E

+03

126E

+03

179E

+03

158E

+02

837E+

0515

2E+0

5997

E+02

358

E+01

F4100

398E-02

199E

-01

437E+

0076

5E+0

0438E-13

921E-14

471E+

02594E+

01000

E+00

000

E+00

500

398E-02

199E

-01

364

E+02

523E+

0117

9E-11

631E-11

349E+

03112E

+02

000

E+00

000

E+00

1000

199E

-01

406

E-01

182E

+03

137E

+02

137E

-10

337E-10

758E

+03

151E+0

2493E-11

287

E-12

F5100

345E-03

488E-03

307E-14

786E

-15

341E-14

116E

-14

372E-01

560

E-02

000

E+00

000

E+00

500

118E

-03

461E-03

690E-04

241E-03

212E-13

247E-14

164E

+00

469E-02

216E-16

756E

-17

1000

118E

-03

327E-03

358E-03

573E-03

418E-13

278

E-14

589E+

00391E-01

549E-13

235E-14

F6100

144E

-13

306

E-14

113E

-13

153E

-14

111E-13

786E

-15

206

E+00

440

E-01

321E-15

248

E-15

500

534E-13

861E-14

480E-01

573E-01

534

E-13

701E-14

664

E+00

449E-01

674E-12

439E-12

1000

102E

-12

168E

-13

229E+

00298E-01

106E

-12

768E

-14

189E

+01

249E+

00239E-07

213E-07

F7100

-150E

+03

104E

+01

-137

E+03

246

E+01

-154E

+03

252E+

00-855E

+02

135E

+01

-700e+0

2664

E-01

500

-723E

+03

416E+

01-574

E+03

183E

+02

-743E

+03

803E+

00-351E+0

3210E+

01-883e+0

3354

E+00

1000

-143E

+04

827E+

01-105E

+04

418E+

02-147E

+04

151E+0

1-662E

+03

318E+

01-369E

+04

216E+

02

14 Complexity

CCPSO2 for function F7 in the case of 100-dimensionalthe two algorithms obtain similar optimization results forthe function F6 in the case of 500-dimensional In additionat first glance it might seem that HGGWA and MLCChave similar performance However the HGGWA achievesbetter convergence accuracy than MLCC under the samenumber of iterations In particular the HGGWA algorithmrsquosoptimization results are superior toDEwSAcc andEPUS-PSOin all dimensions The result of HGGWA scaled very wellfrom 100-D to 1000-D on F1 F4 F5 and F6 outperformingDEwSAcc and EPUS-PSO on all the cases

54 Sensitivity Analysis of Nonlinear Adjustment Coefficient 119896This section mainly discusses the effect of nonlinear adjust-ment coefficients on the convergence performance of thealgorithm In the HGGWA algorithm the role of parameter119886 is to balance global exploration capabilities and localexploitation capabilities In (12) the nonlinear adjustmentcoefficient 119896 is a key parameter and is mainly used to controlthe range of variation of the convergence factor Thereforewe selected four different values for numerical experimentsto analyze the effect of nonlinear adjustment coefficients onthe performance of the algorithm The four different valuesare 051152 and the results of the comparison are shown inTable 6 Among them black bold represents the best result ofsolving in several algorithms

In general the value of the nonlinear adjustment coef-ficient 119896 has no significant effect on the performance ofthe HGGWA algorithm On closer inspection one can seethat besides the function 1198913 the optimization performanceof the algorithm is optimal when the nonlinear adjustmentcoefficient 119896 is 05 But for function 1198913 119896 = 15 is the bestchoice For functions 1198917 and 1198919 the HGGWA algorithmobtains the optimal solution 0 in the case of several differentvalues of the coefficient 119896 Therefore for most test functionsthe optimal value of the nonlinear adjustment factor 119896 is 0555 Global Convergence Analysis of HGGWA In theHGGWA algorithm the following four improved strategiesare used to ensure the global convergence of the algorithm(1) initial population strategy based on Opposition-BasedLearning (2) selection operation based on optimal retention(3) crossover operation based on population partitioningmechanism and (4) mutation operation for elite individ-uals

The idea of solving the high-dimensional function optimization problem with the HGGWA algorithm is as follows. Firstly, consider the initial population strategy: the basic GWO generates the initial population in a random manner, which can greatly impair the search efficiency of the algorithm on high-dimensional function optimization problems, because the algorithm cannot effectively search the entire solution space if the initial grey wolf population is clustered in a small range. Therefore, the Opposition-Based Learning strategy is used to initialize the population, which makes the initial population evenly distributed in the solution space and thus lays a good foundation for the global search of the algorithm. Secondly, the optimal retention strategy is used to preserve the optimal individual of the parent population during population evolution, so that the next generation can evolve in the optimal direction. In addition, individuals with higher fitness are selected by the roulette-wheel operation, which maintains the dominance relationship between individuals in the population; this dominance relationship gives the algorithm good global convergence. Thirdly, the crossover operation is performed under population partitioning, in order to achieve dimension reduction and maintain the diversity of the population. Finally, the search direction of the grey wolf population is guided by the mutation operation on the elite individuals, which effectively prevents the algorithm from falling into a local optimum. All in all, through these four improved strategies, the HGGWA algorithm shows good global convergence in solving high-dimensional function optimization problems.
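As an illustration of the elitist selection step just described, the sketch below combines optimal retention with roulette-wheel selection for a minimization problem. It is a minimal sketch, not the authors' exact implementation; the function name and the cost-to-weight conversion are our own assumptions.

```python
import numpy as np

def select_next_generation(pop, costs, rng=None):
    """Elitist roulette-wheel selection (sketch).

    The best individual is copied into the next generation unchanged
    (optimal retention); the remaining slots are filled by roulette-wheel
    sampling, where lower cost yields a proportionally larger weight.
    """
    rng = rng or np.random.default_rng()
    n = len(pop)
    elite = pop[np.argmin(costs)]                      # keep the best as-is
    # Convert costs to positive selection weights (lower cost -> larger weight).
    weights = costs.max() - costs + 1e-12
    probs = weights / weights.sum()
    picks = rng.choice(n, size=n - 1, p=probs)         # roulette wheel
    return np.vstack([elite, pop[picks]])

# Example: five 3-D individuals; the cheapest one always survives.
pop = np.random.default_rng(0).random((5, 3))
costs = np.array([3.0, 1.0, 4.0, 1.5, 9.0])
print(select_next_generation(pop, costs).shape)  # (5, 3)
```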

6. Conclusion

In order to overcome the shortcoming of the GWO algorithm that it easily falls into local optima when solving large-scale global optimization problems, this paper proposes a Hybrid Genetic Grey Wolf Algorithm (HGGWA) by integrating three genetic operators into the algorithm. The improved algorithm initializes the population based on the Opposition-Based Learning strategy to improve the search efficiency of the algorithm. The strategy of population division based on cooperative coevolution reduces the scale of the problem, and the mutation operation on the elite individuals effectively prevents the algorithm from falling into local optima. The performance of HGGWA has been evaluated using 10 classical benchmark functions and 7 CEC'2008 high-dimensional functions. From our experimental results, several conclusions can be drawn.

The results have shown that it is capable of grouping interacting variables with great accuracy for the majority of the benchmark functions. A comparative study between HGGWA and other state-of-the-art algorithms was conducted, and the experimental results showed that the HGGWA algorithm exhibits better global convergence whether it is solving separable or nonseparable functions.

(1) In tests on 10 classical benchmark functions, compared with the results of WOA, SSA, and ALO, the HGGWA algorithm is greatly improved in convergence accuracy. Except for the Schwefel problem and the Rosenbrock function, the HGGWA algorithm achieves an 80-100% success ratio on all benchmark functions with 100 dimensions, which proves the effectiveness of HGGWA for solving high-dimensional functions.

(2) In comparisons on seven CEC'2008 large-scale global optimization problems, the results show that the HGGWA algorithm achieves the global optimal value for the separable functions F1, F4, and F6, while its effect on the other, nonseparable problems is less satisfactory. Nevertheless, the HGGWA algorithm still exhibits better global convergence than the other algorithms (CCPSO2, DEwSAcc, MLCC, and EPUS-PSO) on most of the functions.


Table 6: Comparison of optimization performance of different k values for HGGWA.

Function  k=0.5 (Mean / Std)       k=1 (Mean / Std)         k=1.5 (Mean / Std)       k=2 (Mean / Std)
f1        8.43E-60 / 5.36E-60      2.25E-54 / 2.03E-54      5.58E-53 / 4.31E-53      2.31E-51 / 3.16E-51
f2        2.89E-34 / 2.31E-34      2.06E-33 / 1.89E-33      4.70E-32 / 3.65E-32      5.76E-30 / 4.22E-30
f3        1.11E+00 / 1.02E+00      3.14E+01 / 2.19E+01      4.03E-01 / 2.38E-01      3.60E+01 / 2.16E+01
f4        9.78E+01 / 8.36E+01      9.82E+01 / 7.43E+01      9.81E+01 / 7.34E+01      9.83E+01 / 4.61E+01
f5        2.18E-03 / 1.34E-03      8.87E-02 / 6.67E-02      7.81E+00 / 6.13E+00      1.74E-02 / 2.34E-02
f6        1.31E-03 / 1.13E-03      1.62E-03 / 2.43E-04      2.99E-03 / 1.04E-03      1.84E-03 / 1.34E-03
f7        0.00E+00 / 0.00E+00      0.00E+00 / 0.00E+00      0.00E+00 / 0.00E+00      0.00E+00 / 0.00E+00
f8        1.58E-14 / 1.23E-15      2.93E-14 / 1.37E-14      2.22E-14 / 1.49E-14      2.51E-14 / 1.08E-14
f9        0.00E+00 / 0.00E+00      0.00E+00 / 0.00E+00      0.00E+00 / 0.00E+00      0.00E+00 / 0.00E+00
f10       2.99E-02 / 2.49E-02      3.38E-02 / 1.24E-03      8.42E-02 / 6.14E-02      7.29E-02 / 8.13E-03


(3) An analysis of the running times of several algorithms shows that the convergence time of the HGGWA algorithm has obvious advantages over recently proposed algorithms such as SSA and WOA. For the functions f1, f2, f6, f7, f8, f9, and f10, the running time of the HGGWA algorithm stays between 1 s and 3 s at the specified convergence accuracy, which demonstrates the excellent stability of HGGWA.

(4) In the future, we plan to investigate more efficient population partition mechanisms to adapt the crossover operation to different nonseparable problems. We are also interested in applying HGGWA to real-world problems to ascertain its true potential as a valuable optimization technique for large-scale optimization, such as the setup of large-scale multilayer sensor networks [73] in the Internet of Things.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Authors' Contributions

Qinghua Gu and Xuexian Li contributed equally to this work.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Grant Nos. 51774228 and 51404182), the Natural Science Foundation of Shaanxi Province (2017JM5043), and the Foundation of Shaanxi Educational Committee (17JK0425).

References

[1] R. Zhang and C. Wu, "A hybrid approach to large-scale job shop scheduling," Applied Intelligence, vol. 32, no. 1, pp. 47-59, 2010.

[2] M. Gendreau and C. D. Tarantilis, Solving Large-Scale Vehicle Routing Problems with Time Windows: The State-of-the-Art, Cirrelt, Montreal, 2010.

[3] M. B. Liu, S. K. Tso, and Y. Cheng, "An extended nonlinear primal-dual interior-point algorithm for reactive-power optimization of large-scale power systems with discrete control variables," IEEE Power Engineering Review, vol. 22, no. 9, p. 56, 2002.

[4] I. Fister, I. Fister Jr., J. Brest, and V. Zumer, "Memetic artificial bee colony algorithm for large-scale global optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '12), pp. 1-8, IEEE, Brisbane, Australia, June 2012.

[5] A. Zamuda, J. Brest, B. Bosovic, and V. Zumer, "Large scale global optimization using differential evolution with self-adaptation and cooperative co-evolution," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), IEEE, June 2008.

[6] T. Weise, R. Chiong, and K. Tang, "Evolutionary optimization: pitfalls and booby traps," Journal of Computer Science and Technology, vol. 27, no. 5, pp. 907-936, 2012.

[7] L. Lafleur, Descartes: Discourse on Method, Pearson Schweiz Ag, Zug, Switzerland, 1956.

[8] Y.-S. Ong, M. H. Lim, and X. Chen, "Memetic computation—past, present & future [Research Frontier]," IEEE Computational Intelligence Magazine, vol. 5, no. 2, pp. 24-31, 2010.

[9] N. Krasnogor and J. Smith, "A tutorial for competent memetic algorithms: model, taxonomy and design issues," IEEE Transactions on Evolutionary Computation, vol. 9, no. 5, pp. 474-488, 2005.

[10] S. Jiang, M. Lian, C. Lu, Q. Gu, S. Ruan, and X. Xie, "Ensemble prediction algorithm of anomaly monitoring based on big data analysis platform of open-pit mine slope," Complexity, vol. 2018, Article ID 1048756, 13 pages, 2018.

[11] R. S. Rahnamayan, H. R. Tizhoosh, and M. M. A. Salama, "Opposition-based differential evolution," IEEE Transactions on Evolutionary Computation, vol. 12, no. 1, pp. 64-79, 2008.

[12] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46-61, 2014.

[13] E. Daniel, J. Anitha, K. K. Kamaleshwaran, and I. Rani, "Optimum spectrum mask based medical image fusion using gray wolf optimization," Biomedical Signal Processing and Control, vol. 34, pp. 36-43, 2017.

[14] A. A. M. El-Gaafary, Y. S. Mohamed, A. M. Hemeida, and A.-A. A. Mohamed, "Grey wolf optimization for multi input multi output system," Universal Journal of Communications and Network, vol. 3, no. 1, pp. 1-6, 2015.

[15] G. M. Komaki and V. Kayvanfar, "Grey wolf optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109-120, 2015.

[16] S. Jiang, M. Lian, C. Lu et al., "SVM-DS fusion based soft fault detection and diagnosis in solar water heaters," Energy Exploration & Exploitation, 2018.

[17] Q. Gu, X. Li, C. Lu et al., "Hybrid genetic grey wolf algorithm for high dimensional complex function optimization," Control and Decision, pp. 1-8, 2019.

[18] M. A. Potter and K. A. De Jong, "A cooperative coevolutionary approach to function optimization," in Parallel Problem Solving from Nature—PPSN III, vol. 866 of Lecture Notes in Computer Science, pp. 249-257, Springer, Berlin, Germany, 1994.

[19] M. A. Potter, The Design and Analysis of a Computational Model of Cooperative Coevolution, George Mason University, Fairfax, Va, USA, 1997.

[20] F. van den Bergh and A. P. Engelbrecht, "A cooperative approach to particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225-239, 2004.

[21] M. El-Abd, "A cooperative approach to the artificial bee colony algorithm," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2010), IEEE, Spain, July 2010.

[22] Y. Shi, H. Teng, and Z. Li, "Cooperative co-evolutionary differential evolution for function optimization," in Proceedings of the 1st International Conference on Natural Computation (ICNC '05), Lecture Notes in Computer Science, pp. 1080-1088, August 2005.

[23] Z. Yang, K. Tang, and X. Yao, "Large scale evolutionary optimization using cooperative coevolution," Information Sciences, vol. 178, no. 15, pp. 2985-2999, 2008.

[24] X. Li and X. Yao, "Tackling high dimensional nonseparable optimization problems by cooperatively coevolving particle swarms," in Proceedings of the Congress on Evolutionary Computation (CEC '09), pp. 1546-1553, IEEE, Trondheim, Norway, May 2009.

[25] M. N. Omidvar, X. Li, Z. Yang, and X. Yao, "Cooperative co-evolution for large scale optimization through more frequent random grouping," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2010), Spain, July 2010.

[26] T. Ray and X. Yao, "A cooperative coevolutionary algorithm with correlation based adaptive variable partitioning," in Proceedings of the 2009 IEEE Congress on Evolutionary Computation (CEC), pp. 983-989, IEEE, Trondheim, Norway, May 2009.

[27] W. Chen, T. Weise, Z. Yang, and K. Tang, "Large-scale global optimization using cooperative coevolution with variable interaction learning," in Lecture Notes in Computer Science, vol. 6239, pp. 300-309, 2010.

[28] M. N. Omidvar, X. Li, Y. Mei, and X. Yao, "Cooperative co-evolution with differential grouping for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 18, no. 3, pp. 378-393, 2014.

[29] M. N. Omidvar, X. Li, and X. Yao, "Smart use of computational resources based on contribution for cooperative co-evolutionary algorithms," in Proceedings of the 13th Annual Genetic and Evolutionary Computation Conference (GECCO '11), pp. 1115-1122, IEEE, Ireland, July 2011.

[30] R. Storn, "On the usage of differential evolution for function optimization," in Proceedings of the Biennial Conference of the North American Fuzzy Information Processing Society (NAFIPS '96), pp. 519-523, June 1996.

[31] X. Li and X. Yao, "Cooperatively coevolving particle swarms for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 16, no. 2, pp. 210-224, 2012.

[32] K. Weicker and N. Weicker, "On the improvement of coevolutionary optimizers by learning variable interdependencies," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC 1999), pp. 1627-1632, IEEE, July 1999.

[33] E. Sayed, D. Essam, and R. Sarker, "Using hybrid dependency identification with a memetic algorithm for large scale optimization problems," in Lecture Notes in Computer Science, vol. 7673, pp. 168-177, 2012.

[34] Y. Liu, X. Yao, Q. Zhao, and T. Higuchi, "Scaling up fast evolutionary programming with cooperative coevolution," in Proceedings of the 2001 Congress on Evolutionary Computation, pp. 1101-1108, IEEE, Seoul, South Korea, 2001.

[35] E. Sayed, D. Essam, and R. Sarker, "Dependency identification technique for large scale optimization problems," in Proceedings of the 2012 IEEE Congress on Evolutionary Computation (CEC 2012), Australia, June 2012.

[36] Y. Ren and Y. Wu, "An efficient algorithm for high-dimensional function optimization," Soft Computing, vol. 17, no. 6, pp. 995-1004, 2013.

[37] J. Liu and K. Tang, "Scaling up covariance matrix adaptation evolution strategy using cooperative coevolution," in Proceedings of the International Conference on Intelligent Data Engineering and Automated Learning (IDEAL 2013), vol. 8206 of Lecture Notes in Computer Science, pp. 350-357, Springer, Berlin, Germany, 2013.

[38] Z. Yang, K. Tang, and X. Yao, "Multilevel cooperative coevolution for large scale optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), pp. 1663-1670, June 2008.

[39] M. N. Omidvar, Y. Mei, and X. Li, "Effective decomposition of large-scale separable continuous functions for cooperative co-evolutionary algorithms," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC 2014), pp. 1305-1312, China, July 2014.

[40] S. Mahdavi, E. M. Shiri, and S. Rahnamayan, "Cooperative co-evolution with a new decomposition method for large-scale optimization," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC), 2014.

[41] B. Kazimipour, M. N. Omidvar, X. Li, and A. K. Qin, "A sensitivity analysis of contribution-based cooperative co-evolutionary algorithms," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2015), pp. 417-424, Japan, May 2015.

[42] Z. Cao, L. Wang, Y. Shi et al., "An effective cooperative coevolution framework integrating global and local search for large scale optimization problems," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2015), pp. 1986-1993, Japan, May 2015.

[43] X. Peng, K. Liu, and Y. Jin, "A dynamic optimization approach to the design of cooperative co-evolutionary algorithms," Knowledge-Based Systems, vol. 109, pp. 174-186, 2016.

[44] H. K. Singh and T. Ray, "Divide and conquer in coevolution: a difficult balancing act," in Agent-Based Evolutionary Search, vol. 5 of Adaptation, Learning, and Optimization, pp. 117-138, Springer, Berlin, Germany, 2010.

[45] X. Peng and Y. Wu, "Large-scale cooperative co-evolution using niching-based multi-modal optimization and adaptive fast clustering," Swarm & Evolutionary Computation, vol. 35, 2017.

[46] M. Yang, M. N. Omidvar, C. Li et al., "Efficient resource allocation in cooperative co-evolution for large-scale global optimization," IEEE Transactions on Evolutionary Computation, vol. 21, no. 4, pp. 493-505, 2017.

[47] M. N. Omidvar, X. Li, and X. Yao, "Cooperative co-evolution with delta grouping for large scale non-separable function optimization," in Proceedings of the 2010 IEEE Congress on Evolutionary Computation (CEC 2010), pp. 1-8, Barcelona, Spain, July 2010.

[48] X. Peng and Y. Wu, "Enhancing cooperative coevolution with selective multiple populations for large-scale global optimization," Complexity, vol. 2018, Article ID 9267054, 15 pages, 2018.

[49] S.-T. Hsieh, T.-Y. Sun, C.-C. Liu, and S.-J. Tsai, "Solving large scale global optimization using improved particle swarm optimizer," in Proceedings of the 2008 IEEE Congress on Evolutionary Computation (CEC 2008), pp. 1777-1784, China, June 2008.

[50] T. Hatanaka, T. Korenaga, N. Kondo et al., Search Performance Improvement for PSO in High Dimensional Space, InTech, 2009.

[51] J. Fan, J. Wang, and M. Han, "Cooperative coevolution for large-scale optimization based on kernel fuzzy clustering and variable trust region methods," IEEE Transactions on Fuzzy Systems, vol. 22, no. 4, pp. 829-839, 2014.

[52] A.-R. Hedar and A. F. Ali, "Genetic algorithm with population partitioning and space reduction for high dimensional problems," in Proceedings of the 2009 International Conference on Computer Engineering and Systems (ICCES '09), pp. 151-156, Egypt, December 2009.

[53] J. Q. Zhang and A. C. Sanderson, "JADE: adaptive differential evolution with optional external archive," IEEE Transactions on Evolutionary Computation, vol. 13, no. 5, pp. 945-958, 2009.

[54] L. M. Hvattum and F. Glover, "Finding local optima of high-dimensional functions using direct search methods," European Journal of Operational Research, vol. 195, no. 1, pp. 31-45, 2009.

[55] C. Liu and B. Li, "Memetic algorithm with adaptive local search depth for large scale global optimization," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC 2014), pp. 82-88, China, July 2014.

[56] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257-1263, 2015.

[57] A. Zhu, C. Xu, Z. Li, J. Wu, and Z. Liu, "Hybridizing grey wolf optimization with differential evolution for global optimization and test scheduling for 3D stacked SoC," Journal of Systems Engineering and Electronics, vol. 26, no. 2, pp. 317-328, 2015.

[58] V. Chahar and D. Kumar, "An astrophysics-inspired grey wolf algorithm for numerical optimization and its application to engineering design problems," Advances in Engineering Software, vol. 112, pp. 231-254, 2017.

[59] D. Guha, P. K. Roy, and S. Banerjee, "Load frequency control of large scale power system using quasi-oppositional grey wolf optimization algorithm," Engineering Science and Technology, an International Journal, vol. 19, no. 4, pp. 1693-1713, 2016.

[60] C. Lu, S. Xiao, X. Li, and L. Gao, "An effective multi-objective discrete grey wolf optimizer for a real-world scheduling problem in welding production," Advances in Engineering Software, vol. 99, pp. 161-176, 2016.

[61] E. Emary, H. M. Zawbaa, and A. E. Hassanien, "Binary grey wolf optimization approaches for feature selection," Neurocomputing, vol. 172, pp. 371-381, 2016.

[62] W. Long, J. Jiao, X. Liang, and M. Tang, "Inspired grey wolf optimizer for solving large-scale function optimization problems," Applied Mathematical Modelling, vol. 60, pp. 112-126, 2018.

[63] S. Gupta and K. Deep, "Performance of grey wolf optimizer on large scale problems," in Proceedings of Mathematical Sciences and Its Applications, American Institute of Physics Conference Series, Uttar Pradesh, India, 2017.

[64] R. L. Haupt and S. E. Haupt, "Practical genetic algorithms," Journal of the American Statistical Association, vol. 100, no. 100, p. 253, 2005.

[65] H. R. Tizhoosh, "Opposition-based learning: a new scheme for machine intelligence," in Proceedings of the International Conference on Computational Intelligence for Modelling, Control and Automation (CIMCA) and International Conference on Intelligent Agents, Web Technologies and Internet Commerce (IAWTIC '05), pp. 695-701, Vienna, Austria, November 2005.

[66] H. Wang, Z. Wu, S. Rahnamayan, Y. Liu, and M. Ventresca, "Enhancing particle swarm optimization using generalized opposition-based learning," Information Sciences, vol. 181, no. 20, pp. 4699-4714, 2011.

[67] H. Wang, S. Rahnamayan, and Z. Wu, "Parallel differential evolution with self-adapting control parameters and generalized opposition-based learning for solving high-dimensional optimization problems," Journal of Parallel and Distributed Computing, vol. 73, no. 1, pp. 62-73, 2013.

[68] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82-102, 1999.

[69] A. P. Engelbrecht, Fundamentals of Computational Swarm Intelligence, John Wiley & Sons, 2006.

[70] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51-67, 2016.

[71] S. Mirjalili, A. H. Gandomi, S. Z. Mirjalili, S. Saremi, H. Faris, and S. M. Mirjalili, "Salp Swarm Algorithm: a bio-inspired optimizer for engineering design problems," Advances in Engineering Software, vol. 114, pp. 163-191, 2017.

[72] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80-98, 2015.

[73] Q. Gu, S. Jiang, M. Lian, and C. Lu, "Health and safety situation awareness model and emergency management based on multi-sensor signal fusion," IEEE Access, vol. 7, pp. 958-968, 2019.



Complexity 13

Table5Com

paris

onresults

of7CE

Crsquo200

8benchm

arkfunctio

nsby

fivea

lgorith

ms

Functio

nDim

CCP

SO2

DEw

SAcc

MLC

CEP

US-PS

OHGGWA

Mean

Std

Mean

Std

Mean

Std

Mean

Std

Mean

Std

F1100

773E

-14

323E-14

568E-14

000

E+00

682E-14

232E-14

747E

-01

170E

-01

306

E-17

293E-17

500

300

E-13

796E

-14

209E-09

461E-09

429E-13

331E-14

845E+

01640

E+00

275E-15

213E-15

1000

518E-13

961E-14

878E-03

527E-03

845E-13

500

E-14

553E+

02286E+

0110

6E-10

183E

-11

F2100

608E+

0078

3E+0

0825E+

00532E+

00252E+

01872E+

0018

6E+0

1226E+

00271E+

00239

E+00

500

579E+

01421E+

0175

7E+0

1304

E+00

666

E+01

569E+

00435E+

01551E-01

312E+

01491E+

001000

782E

+01

425E+

0196

0E+0

118

1E+0

010

8E+0

2475E+

00466

E+01

400

E-01

271E+

01566

E+00

F3100

423E+

02865E+

0214

5E+0

2584E+

0114

9E+0

2572E+

01499E+

03535E+

03981E+

01225E-01

500

724E

+02

154E

+02

181E+0

3274E+

0292

4E+0

217

2E+0

2577E+

04804

E+03

496

E+02

623E-01

1000

133E

+03

263E+

02914E

+03

126E

+03

179E

+03

158E

+02

837E+

0515

2E+0

5997

E+02

358

E+01

F4100

398E-02

199E

-01

437E+

0076

5E+0

0438E-13

921E-14

471E+

02594E+

01000

E+00

000

E+00

500

398E-02

199E

-01

364

E+02

523E+

0117

9E-11

631E-11

349E+

03112E

+02

000

E+00

000

E+00

1000

199E

-01

406

E-01

182E

+03

137E

+02

137E

-10

337E-10

758E

+03

151E+0

2493E-11

287

E-12

F5100

345E-03

488E-03

307E-14

786E

-15

341E-14

116E

-14

372E-01

560

E-02

000

E+00

000

E+00

500

118E

-03

461E-03

690E-04

241E-03

212E-13

247E-14

164E

+00

469E-02

216E-16

756E

-17

1000

118E

-03

327E-03

358E-03

573E-03

418E-13

278

E-14

589E+

00391E-01

549E-13

235E-14

F6100

144E

-13

306

E-14

113E

-13

153E

-14

111E-13

786E

-15

206

E+00

440

E-01

321E-15

248

E-15

500

534E-13

861E-14

480E-01

573E-01

534

E-13

701E-14

664

E+00

449E-01

674E-12

439E-12

1000

102E

-12

168E

-13

229E+

00298E-01

106E

-12

768E

-14

189E

+01

249E+

00239E-07

213E-07

F7100

-150E

+03

104E

+01

-137

E+03

246

E+01

-154E

+03

252E+

00-855E

+02

135E

+01

-700e+0

2664

E-01

500

-723E

+03

416E+

01-574

E+03

183E

+02

-743E

+03

803E+

00-351E+0

3210E+

01-883e+0

3354

E+00

1000

-143E

+04

827E+

01-105E

+04

418E+

02-147E

+04

151E+0

1-662E

+03

318E+

01-369E

+04

216E+

02

14 Complexity

CCPSO2 for function F7 in the case of 100-dimensionalthe two algorithms obtain similar optimization results forthe function F6 in the case of 500-dimensional In additionat first glance it might seem that HGGWA and MLCChave similar performance However the HGGWA achievesbetter convergence accuracy than MLCC under the samenumber of iterations In particular the HGGWA algorithmrsquosoptimization results are superior toDEwSAcc andEPUS-PSOin all dimensions The result of HGGWA scaled very wellfrom 100-D to 1000-D on F1 F4 F5 and F6 outperformingDEwSAcc and EPUS-PSO on all the cases

54 Sensitivity Analysis of Nonlinear Adjustment Coefficient 119896This section mainly discusses the effect of nonlinear adjust-ment coefficients on the convergence performance of thealgorithm In the HGGWA algorithm the role of parameter119886 is to balance global exploration capabilities and localexploitation capabilities In (12) the nonlinear adjustmentcoefficient 119896 is a key parameter and is mainly used to controlthe range of variation of the convergence factor Thereforewe selected four different values for numerical experimentsto analyze the effect of nonlinear adjustment coefficients onthe performance of the algorithm The four different valuesare 051152 and the results of the comparison are shown inTable 6 Among them black bold represents the best result ofsolving in several algorithms

In general the value of the nonlinear adjustment coef-ficient 119896 has no significant effect on the performance ofthe HGGWA algorithm On closer inspection one can seethat besides the function 1198913 the optimization performanceof the algorithm is optimal when the nonlinear adjustmentcoefficient 119896 is 05 But for function 1198913 119896 = 15 is the bestchoice For functions 1198917 and 1198919 the HGGWA algorithmobtains the optimal solution 0 in the case of several differentvalues of the coefficient 119896 Therefore for most test functionsthe optimal value of the nonlinear adjustment factor 119896 is 0555 Global Convergence Analysis of HGGWA In theHGGWA algorithm the following four improved strategiesare used to ensure the global convergence of the algorithm(1) initial population strategy based on Opposition-BasedLearning (2) selection operation based on optimal retention(3) crossover operation based on population partitioningmechanism and (4) mutation operation for elite individ-uals

The idea of solving the high-dimensional function opti-mization problem byHGGWAalgorithm is as follows Firstlyfor the initial population strategy the basic GWO generatesthe initial population in a random manner However it willhave a great impact on the search efficiency of the algo-rithm for high-dimensional function optimization problemsThe algorithm cannot effectively search the entire problemsolution space if the initial grey wolf population is clusteredin a small range Therefore the Opposition-Based Learningstrategy is used to initialize the population which can makethe initial population evenly distributed in the solution spacethus laying a good foundation for the global search of thealgorithm Secondly the optimal retention strategy is usedto preserve the optimal individual of the parent population

in the process of population evolution As a result the nextgeneration of population can evolve in the optimal directionIn addition the individuals with higher fitness are selectedby the gambling roulette operation which maintains thedominance relationship between individuals in the popula-tion This dominance relationship makes the algorithm withgood global convergence Thirdly the crossover operation iscompleted under the condition of population partitioningin order to achieve the purpose of dimension reductionand maintain the diversity of the population Finally thesearch direction of the grey wolf population is guided by themutation operation of the elite individuals which effectivelyprevents the algorithm from falling into the local optimalsolution All in all through the improvement of these fourdifferent strategies the HGGWA algorithm shows goodglobal convergence in solving high-dimensional functionoptimization problems

6 Conclusion

In order to overcome the shortcomings of GWO algorithmfor solving large-scale global optimization problems whichare easy to fall into local optimum this paper proposes aHybrid Genetic GreyWolf Algorithm (HGGWA) by integrat-ing three genetic operators into the algorithm The improvedalgorithm initializes the population based on theOpposition-Based Learning strategy for improving the search efficiencyof the algorithm The strategy of population division basedon cooperative coevolution reduces the scale of the problemand the mutation operation of the elite individuals effectivelyprevents the algorithm from falling into the local optimaTheperformance of HGGWA has been evaluated using 10 classi-cal benchmark functions and 7 CECrsquo2008 high-dimensionalfunctions Fromour experimental results several conclusionscan be drawn

The results have shown that it is capable of groupinginteracting variables with great accuracy for the major-ity of the benchmark functions A comparative studyamong the HGGWA and other state-of-the-art algorithmswas conducted and the experimental results showed thatthe HGGWA algorithm exhibits better global convergencewhether it is solving separable functions or nonseparablefunctions(1)By testing 10 classical benchmark functions comparedwith the results of WOA SSA and ALO the HGGWA algo-rithm has been greatly improved in convergence accuracy Inaddition to the Schwefel problem and Rosenbrock functionthe HGGWA algorithm achieves an 80-100 successfulratio for all benchmark functions with 100 dimensionswhich proves the effectiveness of HGGWA for solving high-dimensional functions(2) By comparing with the optimization results on sevenCECrsquo2008 large-scale global optimization problems theresults show that HGGWA algorithm achieves the globaloptimal value for the separable functions F1 F4 and F6but the effect on other nonseparable problems is not verysatisfactory However HGGWAalgorithm still exhibits betterglobal convergence than the other algorithms CCPSO2DewSAcc MLCC and EPUS-PSO on most of the functions

Complexity 15

Table6Com

paris

onof

optim

izationperfo

rmance

ofdifferent119896val

uesfor

HGGWA

Functio

n119896=0

5119896=1

119896=15

119896=2Mean

Std

Mean

Std

Mean

Std

Mean

Std

119891 1843

E-60

536E-60

225E-54

203E-54

558E-53

431E-53

231E-51

316E-51

119891 2289

E-34

231E-34

206

E-33

189E

-33

470E-32

365E-32

576E-30

422E-30

119891 3111E+0

010

2E+0

0314E+

01219E+

01403E-01

238E-01

360

E+01

216E+

01119891 4

978

E+01

836E+

0198

2E+0

174

3E+0

198

1E+0

173

4E+0

198

3E+0

1461E+

01119891 5

218E-03

134E

-03

887E-02

667E-02

781E+0

0613E+

00174E

-02

234E-02

119891 613

1E-03

113E

-03

162E

-03

243E-04

299E-03

104E

-03

184E

-03

134E

-03

119891 7000

E+00

000

E+00

000

E+00

000

E+00

000

E+00

000

E+00

000

E+00

000

E+00

119891 815

8E-14

123E

-15

293E-14

137E

-14

222E-14

149E

-14

251E-14

108E

-14

119891 9000

E+00

000

E+00

000

E+00

000

E+00

000

E+00

000

E+00

000

E+00

000

E+00

119891 10299

E-02

249E-02

338E-02

124E

-03

842E-02

614E-02

729E

-02

813E-03

16 Complexity

(3) By analyzing the running time of several algorithmsthe results show that the convergence time of HGGWAalgorithmhas obvious advantages comparedwith SSAWOAetc algorithmswhich are proposed recently For the functions11989111198912 1198916119891711989181198919 and 11989110 the running time of the HGGWAalgorithm is kept between 1s and 3s under the specifiedconvergence accuracy which shows the excellent stability ofthe HGGWA(4) In the future we are planning to investigate moreefficient population partition mechanism to adapt to differentnonseparable problems in the process of crossover operationWe are also interested in applying HGGWA to real-worldproblems to ascertain its true potential as a valuable optimiza-tion technique for large-scale optimization such as the setupof large-scale multilayer sensor networks [73] in the Internetof Things

Data Availability

The data used to support the findings of this study areavailable from the corresponding author upon request

Conflicts of Interest

The authors declare that there are no conflicts of interestregarding the publication of this paper

Authorsrsquo Contributions

Qinghua Gu and Xuexian Li contributed equally to this work

Acknowledgments

This work was supported by National Natural Science Foun-dation of China (Grant Nos 51774228 and 51404182) NaturalScience Foundation of Shaanxi Province (2017JM5043) andFoundation of Shaanxi Educational Committee (17JK0425)

References

[1] R Zhang and CWu ldquoA hybrid approach to large-scale job shopschedulingrdquo Applied Intelligence vol 32 no 1 pp 47ndash59 2010

[2] M Gendreau and C D Tarantilis Solving large-scale vehiclerouting problems with time windows The state-of-the-art[M]Montreal Cirrelt 2010

[3] M B Liu S K Tso and Y Cheng ldquoAn extended nonlinearprimal-dual interior-point algorithm for reactive-power opti-mization of large-scale power systems with discrete controlvariablesrdquo IEEE Power Engineering Review vol 22 no 9 p 562002

[4] I Fister I Fister Jr J Brest and V Zumer ldquoMemetic artificialbee colony algorithm for large-scale global optimizationrdquo inProceedings of the IEEE Congress on Evolutionary Computation(CEC rsquo12) pp 1ndash8 IEEE Brisbane Australia June 2012

[5] A Zamuda J Brest B Bosovic and V Zumer ldquoLarge scaleglobal optimization using differential evolution with self-adap-tation and cooperative co-evolutionrdquo in Proceedings of the IEEECongress on Evolutionary Computation (CEC rsquo08) IEEE June2008

[6] T Weise R Chiong and K Tang ldquoEvolutionary optimizationpitfalls and booby trapsrdquo Journal of Computer Science and Tech-nology vol 27 no 5 pp 907ndash936 2012

[7] L LafleurDescartes DiscourseOnMethod Pearson SchweizAgZug Switzerland 1956

[8] Y-S Ong M H Lim and X Chen ldquoMemetic Computa-tionmdashPast Present amp Future [Research Frontier]rdquo IEEE Com-putational Intelligence Magazine vol 5 no 2 pp 24ndash31 2010

[9] N Krasnogor and J Smith ldquoA tutorial for competent memeticalgorithms model taxonomy and design issuesrdquo IEEE Trans-actions on Evolutionary Computation vol 9 no 5 pp 474ndash4882005

[10] S Jiang M Lian C Lu Q Gu S Ruan and X Xie ldquoEnsembleprediction algorithm of anomaly monitoring based on big dataanalysis platform of open-pitmine sloperdquoComplexity vol 2018Article ID 1048756 13 pages 2018

[11] R S Rahnamayan H R Tizhoosh and M M A SalamaldquoOpposition-based differential evolutionrdquo IEEETransactions onEvolutionary Computation vol 12 no 1 pp 64ndash79 2008

[12] S Mirjalili S M Mirjalili and A Lewis ldquoGrey wolf optimizerrdquoAdvances in Engineering Software vol 69 pp 46ndash61 2014

[13] E Daniel J Anitha K K Kamaleshwaran and I Rani ldquoOpti-mum spectrum mask based medical image fusion using graywolf optimizationrdquo Biomedical Signal Processing and Controlvol 34 pp 36ndash43 2017

[14] A A M El-Gaafary Y S Mohamed A M Hemeida and A-A A Mohamed ldquoGrey wolf optimization for multi input multioutput systemrdquo Universal Journal of Communications and Net-work vol 3 no 1 pp 1ndash6 2015

[15] G M Komaki and V Kayvanfar ldquoGrey wolf optimizer algo-rithm for the two-stage assembly flow shop scheduling problemwith release timerdquo Journal of Computational Science vol 8 pp109ndash120 2015

[16] S Jiang M Lian C Lu et al ldquoSVM-DS fusion based softfault detection and diagnosis in solar water heatersrdquo EnergyExploration amp Exploitation 2018

[17] Q Gu X Li C Lu et al ldquoHybrid genetic grey wolf algorithmfor high dimensional complex function optimizationrdquo Controland Decision pp 1ndash8 2019

[18] M A Potter and K A D Jong ldquoA cooperative coevolutionaryapproach to function optimizationrdquo in Parallel Problem Solvingfrom NaturemdashPPSN III vol 866 of Lecture Notes in ComputerScience pp 249ndash257 Springer Berlin Germany 1994

[19] M A PotterTheDesign and Analysis of a Computational Modelof Cooperative Coevolution George Mason University FairfaxVa USA 1997

[20] F van denBergh andA P Engelbrecht ldquoA cooperative approachto participle swam optimizationrdquo IEEE Transactions on Evolu-tionary Computation vol 8 no 3 pp 225ndash239 2004

[21] M El-Abd ldquoA cooperative approach to the artificial bee colonyalgorithmrdquo in Proceedings of the 2010 6th IEEE World Congresson Computational Intelligence WCCI 2010 - 2010 IEEE Congresson Evolutionary Computation CEC 2010 IEEE Spain July 2010

[22] Y Shi H Teng and Z Li ldquoCooperative co-evolutionary dif-ferential evolution for function optimizationrdquo in Proceedings ofthe 1st International Conference on Natural Computation (ICNCrsquo05) LectureNotes in Computer Science pp 1080ndash1088 August2005

[23] Z Yang K Tang and X Yao ldquoLarge scale evolutionary opti-mization using cooperative coevolutionrdquo Information Sciencesvol 178 no 15 pp 2985ndash2999 2008

Complexity 17

[24] X Li and X Yao ldquoTackling high dimensional nonseparableoptimization problems by cooperatively coevolving particleswarmsrdquo in Proceedings of the Congress on Evolutionary Compu-tation CEC rsquo9 pp 1546ndash1553 IEEE Computational IntelligenceMagazine Trondheim Norway May 2009

[25] M N Omidvar X Li Z Yang and X Yao ldquoCooperative co-evolution for large scale optimization through more frequentrandom groupingrdquo in Proceedings of the 2010 6th IEEE WorldCongress on Computational Intelligence WCCI 2010 - 2010 IEEECongress on Evolutionary Computation CEC 2010 Spain July2010

[26] TRay andXYao ldquoA cooperative coevolutionary algorithmwithcorrelation based adaptive variable partitioningrdquo in Proceedingsof the 2009 IEEE Congress on Evolutionary Computation (CEC)pp 983ndash989 IEEE Trondheim Norway May 2009

[27] W Chen T Weise Z Yang and K Tang ldquoLarge-scale globaloptimization using cooperative coevolution with variable inter-action learningrdquo in LectureNotes in Computer Science (IncludingSubseries Lecture Notes in Artificial Intelligence and LectureNotes in Bioinformatics) Preface vol 6239 pp 300ndash309 2010

[28] M N Omidvar X Li Y Mei and X Yao ldquoCooperative co-evolution with differential grouping for large scale optimiza-tionrdquo IEEE Transactions on Evolutionary Computation vol 18no 3 pp 378ndash393 2014

[29] M N Omidvar X Li and X Yao ldquoSmart use of computationalresources based on contribution for cooperative co-evolu-tionary algorithmsrdquo in Proceedings of the 13th Annual Geneticand Evolutionary Computation Conference GECCOrsquo11 pp 1115ndash1122 IEEE Ireland July 2011

[30] R Storn ldquoOn the usage of differential evolution for functionoptimizationrdquo in Proceedings of the Biennial Conference of theNorth American Fuzzy Information Processing Society (NAFIPSrsquo96) pp 519ndash523 June 1996

[31] X Li and X Yao ldquoCooperatively coevolving particle swarmsfor large scale optimizationrdquo IEEE Transactions on EvolutionaryComputation vol 16 no 2 pp 210ndash224 2012

[32] K Weicker and N Weicker ldquoOn the improvement of coevolu-tionary optimizers by learning variable interdependenciesrdquo inProceedings of the 1999 Congress on Evolutionary ComputationCEC 1999 pp 1627ndash1632 IEEE July 1999

[33] E Sayed D Essam and R Sarker ldquoUsing hybrid dependencyidentification with a memetic algorithm for large scale opti-mization problemsrdquo in Lecture Notes in Computer Science(Including Subseries Lecture Notes in Artificial Intelligence andLecture Notes in Bioinformatics) Preface vol 7673 pp 168ndash1772012

[34] Y Liu X Yao Q Zhao and T Higuchi ldquoScaling up fast evolu-tionary programming with cooperative coevolutionrdquo in Pro-ceedings of the 2001 Congress on Evolutionary Computation pp1101ndash1108 IEEE Seoul South Korea 2001

[35] E Sayed D Essam and R Sarker ldquoDependency identificationtechnique for large scale optimization problemsrdquo in Proceedingsof the 2012 IEEE Congress on Evolutionary Computation CEC2012 Australia June 2012

[36] Y Ren and Y Wu ldquoAn efficient algorithm for high-dimensionalfunction optimizationrdquo Soft Computing vol 17 no 6 pp 995ndash1004 2013

[37] J Liu and K Tang ldquoScaling up covariance matrix adaptationevolution strategy using cooperative coevolutionrdquo in Proceed-ings of the International Conference on Intelligent Data Engi-neering and Automated Learning ndash IDEAL 2013 vol 8206 of

LectureNotes in Computer Science pp 350ndash357 Springer BerlinGermany 2013

[38] Z Yang K Tang and X Yao ldquoMultilevel cooperative coevo-lution for large scale optimizationrdquo in Proceedings of the IEEECongress on Evolutionary Computation (CEC rsquo08) pp 1663ndash1670 June 2008

[39] M N Omidvar Y Mei and X Li ldquoEffective decompositionof large-scale separable continuous functions for cooperativeco-evolutionary algorithmsrdquo in Proceedings of the 2014 IEEECongress on Evolutionary Computation CEC 2014 pp 1305ndash1312 China July 2014

[40] S Mahdavi E M Shiri and S Rahnamayan ldquoCooperativeCo-evolution with a new decomposition method for large-scale optimizationrdquo in Proceedings of the 2014 IEEE Congress onEvolutionary Computation (CEC) 2014

[41] B Kazimipour MN Omidvar X Li andA K Qin ldquoA sensitiv-ity analysis of contribution-based cooperative co-evolutionaryalgorithmsrdquo in Proceedings of the IEEECongress on EvolutionaryComputation CEC 2015 pp 417ndash424 Japan May 2015

[42] Z Cao L Wang Y Shi et al ldquoAn effective cooperativecoevolution framework integrating global and local search forlarge scale optimization problemsrdquo in Proceedings of the IEEECongress on Evolutionary Computation CEC 2015 pp 1986ndash1993 Japan May 2015

[43] X Peng K Liu and Y Jin ldquoA dynamic optimization approachto the design of cooperative co-evolutionary algorithmsrdquoKnow-ledge-Based Systems vol 109 pp 174ndash186 2016

[44] H K Singh and T Ray ldquoDivide and conquer in coevolutiona difficult balancing actrdquo in Agent-Based Evolutionary Searchvol 5 of Adaptation Learning and Optimization pp 117ndash138Springer Berlin Germany 2010

[45] X Peng and Y Wu ldquoLarge-scale cooperative co-evolutionusing niching-based multi-modal optimization and adaptivefast clusteringrdquo Swarm amp Evolutionary Computation vol 352017

[46] M Yang M N Omidvar C Li et al ldquoEfficient resource allo-cation in cooperative co-evolution for large-scale global opti-mizationrdquo IEEE Transactions on Evolutionary Computation vol21 no 4 pp 493ndash505 2017

[47] M N Omidvar X Li and X Yao ldquoCooperative co-evolutionwith delta grouping for large scale non-separable functionoptimizationrdquo in Proceedings of the 2010 IEEE Congress onEvolutionary Computation (CEC 2010) pp 1ndash8 BarcelonaSpain July 2010

[48] X Peng and Y Wu ldquoEnhancing cooperative coevolution withselective multiple populations for large-scale global optimiza-tionrdquo Complexity vol 2018 Article ID 9267054 15 pages 2018

[49] S-T Hsieh T-Y Sun C-C Liu and S-J Tsai ldquoSolving largescale global optimization using improved particle swarm opti-mizerrdquo in Proceedings of the 2008 IEEECongress on EvolutionaryComputation CEC 2008 pp 1777ndash1784 China June 2008

[50] T Hatanaka T Korenaga N Kondo et al Search PerformanceImprovement for PSO in High Dimensional Space InTech 2009

[51] J Fan JWang andMHan ldquoCooperative coevolution for large-scale optimization based on kernel fuzzy clustering and variabletrust region methodsrdquo IEEE Transactions on Fuzzy Systems vol22 no 4 pp 829ndash839 2014

[52] A-R Hedar and A F Ali ldquoGenetic algorithm with popula-tion partitioning and space reduction for high dimensionalproblemsrdquo in Proceedings of the 2009 International Conferenceon Computer Engineering and Systems ICCESrsquo09 pp 151ndash156Egypt December 2009

18 Complexity

[53] J Q Zhang and A C Sanderson ldquoJADE adaptive differentialevolution with optional external archiverdquo IEEE Transactions onEvolutionary Computation vol 13 no 5 pp 945ndash958 2009

[54] L M Hvattum and F Glover ldquoFinding local optima of high-dimensional functions using direct search methodsrdquo EuropeanJournal of Operational Research vol 195 no 1 pp 31ndash45 2009

[55] C Liu and B Li ldquoMemetic algorithm with adaptive local searchdepth for large scale global optimizationrdquo in Proceedings of the2014 IEEECongress on Evolutionary Computation CEC2014 pp82ndash88 China July 2014

[56] S Saremi S Z Mirjalili and S MMirjalili ldquoEvolutionary pop-ulation dynamics and grey wolf optimizerrdquo Neural Computingand Applications vol 26 no 5 pp 1257ndash1263 2015

[57] A Zhu C Xu Z Li J Wu and Z Liu ldquoHybridizing grey wolfoptimization with differential evolution for global optimizationand test scheduling for 3D stacked SoCrdquo Journal of SystemsEngineering and Electronics vol 26 no 2 pp 317ndash328 2015

[58] V Chahar and D Kumar ldquoAn astrophysics-inspired grey wolfalgorithm for numerical optimization and its application toengineering design problemsrdquo Advances in Engineering Soft-ware vol 112 pp 231ndash254 2017

[59] D Guha P K Roy and S Banerjee ldquoLoad frequency controlof large scale power system using quasi-oppositional grey wolfoptimization algorithmrdquoEngineering Science andTechnology anInternational Journal vol 19 no 4 pp 1693ndash1713 2016

[60] C Lu S Xiao X Li and L Gao ldquoAn effective multi-objectivediscrete grey wolf optimizer for a real-world scheduling prob-lem in welding productionrdquo Advances in Engineering Softwarevol 99 pp 161ndash176 2016

[61] E Emary H M Zawbaa and A E Hassanien ldquoBinary greywolf optimization approaches for feature selectionrdquoNeurocom-puting vol 172 pp 371ndash381 2016

[62] W Long J Jiao X Liang andM Tang ldquoInspired grey wolf opti-mizer for solving large-scale function optimization problemsrdquoApplied Mathematical Modelling Simulation and Computationfor Engineering and Environmental Systems vol 60 pp 112ndash1262018

[63] S Gupta and K Deep ldquoPerformance of grey wolf optimizer onlarge scale problemsrdquo in Proceedings of the Mathematical Sci-ences and Its Applications American Institute of Physics Con-ference Series Uttar Pradesh India 2017

[64] R L Haupt and S E Haupt ldquoPractical genetic algorithmsrdquoJournal of the American Statistical Association vol 100 no 100p 253 2005

[65] H R Tizhoosh ldquoOpposition-based learning a new scheme formachine intelligencerdquo in Proceedings of the International Con-ference on Computational Intelligence for Modelling Control andAutomation CIMCA and International Conference on IntelligentAgents Web Technologies and Internet Commerce (IAWTIC rsquo05)pp 695ndash701 Vienna Austria November 2005

[66] H Wang Z Wu S Rahnamayan Y Liu and M VentrescaldquoEnhancing particle swarm optimization using generalizedopposition-based learningrdquo Information Sciences vol 181 no20 pp 4699ndash4714 2011

[67] H Wang S Rahnamayan and ZWu ldquoParallel differential evo-lution with self-adapting control parameters and generalizedopposition-based learning for solving high-dimensional opti-mization problemsrdquo Journal of Parallel andDistributed Comput-ing vol 73 no 1 pp 62ndash73 2013

[68] X Yao Y Liu and G Lin ldquoEvolutionary programming madefasterrdquo IEEE Transactions on Evolutionary Computation vol 3no 2 pp 82ndash102 1999

[69] A P Engelbrecht Fundamentals of Computational SwarmIntelligence John Wiley amp Sons 2006

[70] S Mirjalili and A Lewis ldquoThe whale optimization algorithmrdquoAdvances in Engineering Software vol 95 pp 51ndash67 2016

[71] S Mirjalili A H Gandomi S Z Mirjalili S Saremi H Farisand S M Mirjalili ldquoSalp Swarm Algorithm A bio-inspiredoptimizer for engineering design problemsrdquo Advances in Engi-neering Software vol 114 pp 163ndash191 2017

[72] S Mirjalili ldquoThe ant lion optimizerrdquo Advances in EngineeringSoftware vol 83 pp 80ndash98 2015

[73] Q Gu S Jiang M Lian and C Lu ldquoHealth and safety situationawareness model and emergency management based on multi-sensor signal fusionrdquo IEEE Access vol 7 pp 958ndash968 2019

Hindawiwwwhindawicom Volume 2018

MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Mathematical Problems in Engineering

Applied MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Probability and StatisticsHindawiwwwhindawicom Volume 2018

Journal of

Hindawiwwwhindawicom Volume 2018

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawiwwwhindawicom Volume 2018

OptimizationJournal of

Hindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom Volume 2018

Engineering Mathematics

International Journal of

Hindawiwwwhindawicom Volume 2018

Operations ResearchAdvances in

Journal of

Hindawiwwwhindawicom Volume 2018

Function SpacesAbstract and Applied AnalysisHindawiwwwhindawicom Volume 2018

International Journal of Mathematics and Mathematical Sciences

Hindawiwwwhindawicom Volume 2018

Hindawi Publishing Corporation httpwwwhindawicom Volume 2013Hindawiwwwhindawicom

The Scientific World Journal

Volume 2018

Hindawiwwwhindawicom Volume 2018Volume 2018

Numerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisAdvances inAdvances in Discrete Dynamics in

Nature and SocietyHindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom

Dierential EquationsInternational Journal of

Volume 2018

Hindawiwwwhindawicom Volume 2018

Decision SciencesAdvances in

Hindawiwwwhindawicom Volume 2018

AnalysisInternational Journal of

Hindawiwwwhindawicom Volume 2018

Stochastic AnalysisInternational Journal of

Submit your manuscripts atwwwhindawicom

Page 13: Hybrid Genetic Grey Wolf Algorithm for Large-Scale Global ...downloads.hindawi.com/journals/complexity/2019/2653512.pdf · metaheuristic algorithms such as genetic algorithm (GA),

Complexity 13

Table5Com

paris

onresults

of7CE

Crsquo200

8benchm

arkfunctio

nsby

fivea

lgorith

ms

Functio

nDim

CCP

SO2

DEw

SAcc

MLC

CEP

US-PS

OHGGWA

Mean

Std

Mean

Std

Mean

Std

Mean

Std

Mean

Std

F1100

773E

-14

323E-14

568E-14

000

E+00

682E-14

232E-14

747E

-01

170E

-01

306

E-17

293E-17

500

300

E-13

796E

-14

209E-09

461E-09

429E-13

331E-14

845E+

01640

E+00

275E-15

213E-15

1000

518E-13

961E-14

878E-03

527E-03

845E-13

500

E-14

553E+

02286E+

0110

6E-10

183E

-11

F2100

608E+

0078

3E+0

0825E+

00532E+

00252E+

01872E+

0018

6E+0

1226E+

00271E+

00239

E+00

500

579E+

01421E+

0175

7E+0

1304

E+00

666

E+01

569E+

00435E+

01551E-01

312E+

01491E+

001000

782E

+01

425E+

0196

0E+0

118

1E+0

010

8E+0

2475E+

00466

E+01

400

E-01

271E+

01566

E+00

F3100

423E+

02865E+

0214

5E+0

2584E+

0114

9E+0

2572E+

01499E+

03535E+

03981E+

01225E-01

500

724E

+02

154E

+02

181E+0

3274E+

0292

4E+0

217

2E+0

2577E+

04804

E+03

496

E+02

623E-01

1000

133E

+03

263E+

02914E

+03

126E

+03

179E

+03

158E

+02

837E+

0515

2E+0

5997

E+02

358

E+01

F4100

398E-02

199E

-01

437E+

0076

5E+0

0438E-13

921E-14

471E+

02594E+

01000

E+00

000

E+00

500

398E-02

199E

-01

364

E+02

523E+

0117

9E-11

631E-11

349E+

03112E

+02

000

E+00

000

E+00

1000

199E

-01

406

E-01

182E

+03

137E

+02

137E

-10

337E-10

758E

+03

151E+0

2493E-11

287

E-12

F5100

345E-03

488E-03

307E-14

786E

-15

341E-14

116E

-14

372E-01

560

E-02

000

E+00

000

E+00

500

118E

-03

461E-03

690E-04

241E-03

212E-13

247E-14

164E

+00

469E-02

216E-16

756E

-17

1000

118E

-03

327E-03

358E-03

573E-03

418E-13

278

E-14

589E+

00391E-01

549E-13

235E-14

F6100

144E

-13

306

E-14

113E

-13

153E

-14

111E-13

786E

-15

206

E+00

440

E-01

321E-15

248

E-15

500

534E-13

861E-14

480E-01

573E-01

534

E-13

701E-14

664

E+00

449E-01

674E-12

439E-12

1000

102E

-12

168E

-13

229E+

00298E-01

106E

-12

768E

-14

189E

+01

249E+

00239E-07

213E-07

F7100

-150E

+03

104E

+01

-137

E+03

246

E+01

-154E

+03

252E+

00-855E

+02

135E

+01

-700e+0

2664

E-01

500

-723E

+03

416E+

01-574

E+03

183E

+02

-743E

+03

803E+

00-351E+0

3210E+

01-883e+0

3354

E+00

1000

-143E

+04

827E+

01-105E

+04

418E+

02-147E

+04

151E+0

1-662E

+03

318E+

01-369E

+04

216E+

02

14 Complexity

CCPSO2 for function F7 in the case of 100-dimensionalthe two algorithms obtain similar optimization results forthe function F6 in the case of 500-dimensional In additionat first glance it might seem that HGGWA and MLCChave similar performance However the HGGWA achievesbetter convergence accuracy than MLCC under the samenumber of iterations In particular the HGGWA algorithmrsquosoptimization results are superior toDEwSAcc andEPUS-PSOin all dimensions The result of HGGWA scaled very wellfrom 100-D to 1000-D on F1 F4 F5 and F6 outperformingDEwSAcc and EPUS-PSO on all the cases

54 Sensitivity Analysis of Nonlinear Adjustment Coefficient 119896This section mainly discusses the effect of nonlinear adjust-ment coefficients on the convergence performance of thealgorithm In the HGGWA algorithm the role of parameter119886 is to balance global exploration capabilities and localexploitation capabilities In (12) the nonlinear adjustmentcoefficient 119896 is a key parameter and is mainly used to controlthe range of variation of the convergence factor Thereforewe selected four different values for numerical experimentsto analyze the effect of nonlinear adjustment coefficients onthe performance of the algorithm The four different valuesare 051152 and the results of the comparison are shown inTable 6 Among them black bold represents the best result ofsolving in several algorithms

In general, the value of the nonlinear adjustment coefficient k has no significant effect on the performance of the HGGWA algorithm. On closer inspection, one can see that, except for function f3, the optimization performance of the algorithm is best when the nonlinear adjustment coefficient k is 0.5; for function f3, k = 1.5 is the best choice. For functions f7 and f9, the HGGWA algorithm obtains the optimal solution 0 for all of the tested values of the coefficient k. Therefore, for most test functions, the optimal value of the nonlinear adjustment factor k is 0.5.

5.5. Global Convergence Analysis of HGGWA. In the HGGWA algorithm, the following four improved strategies are used to ensure the global convergence of the algorithm: (1) an initial population strategy based on Opposition-Based Learning; (2) a selection operation based on optimal retention; (3) a crossover operation based on a population partitioning mechanism; and (4) a mutation operation for elite individuals.

The idea of solving the high-dimensional function optimization problem with the HGGWA algorithm is as follows. Firstly, consider the initial population: the basic GWO generates it in a random manner, which can greatly harm the search efficiency of the algorithm on high-dimensional function optimization problems, since the algorithm cannot effectively search the entire solution space if the initial grey wolf population is clustered in a small range. Therefore, the Opposition-Based Learning strategy is used to initialize the population, which makes the initial population evenly distributed in the solution space, thus laying a good foundation for the global search of the algorithm.
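As a minimal sketch of this initialization (assuming box bounds [lb, ub] and a minimization objective; the names are illustrative, not the authors' code), each random wolf x is paired with its opposite point x' = lb + ub - x, and the fitter half of the combined set forms the initial population:

```python
import numpy as np

def obl_initialize(pop_size, dim, lb, ub, fitness, rng=None):
    """Opposition-Based Learning initialization (illustrative sketch).

    Generates pop_size random points, forms their opposites
    x' = lb + ub - x, and keeps the best pop_size of the 2*pop_size
    candidates, spreading the initial population over the search space.
    """
    rng = rng or np.random.default_rng()
    X = lb + rng.random((pop_size, dim)) * (ub - lb)  # random individuals
    X_opp = lb + ub - X                               # opposite points
    candidates = np.vstack((X, X_opp))
    scores = np.apply_along_axis(fitness, 1, candidates)
    keep = np.argsort(scores)[:pop_size]              # minimization assumed
    return candidates[keep]

# Example: 30 wolves on a 100-dimensional sphere function.
pop = obl_initialize(30, 100, -100.0, 100.0, lambda x: float(np.sum(x ** 2)))
```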

Secondly, the optimal retention strategy is used to preserve the optimal individual of the parent population during evolution; as a result, the next generation of the population can evolve in the optimal direction. In addition, individuals with higher fitness are selected by the roulette-wheel operation, which maintains the dominance relationships between individuals in the population, and these dominance relationships give the algorithm good global convergence. Thirdly, the crossover operation is performed under population partitioning, which achieves dimension reduction and maintains the diversity of the population. Finally, the search direction of the grey wolf population is guided by the mutation operation on the elite individuals, which effectively prevents the algorithm from falling into a local optimum. All in all, through these four improved strategies, the HGGWA algorithm shows good global convergence in solving high-dimensional function optimization problems.
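A compact sketch of the three genetic operators described above, under a minimization convention (the function names, the blend-style crossover, and the Gaussian mutation width are assumptions for illustration, not the authors' exact implementation):

```python
import numpy as np

rng = np.random.default_rng()

def elitist_roulette_select(pop, scores):
    """Roulette-wheel selection with elite retention (minimization):
    the best wolf is copied through unchanged, and the remaining slots
    are sampled with probability proportional to inverted cost."""
    elite = np.argmin(scores)
    weights = scores.max() - scores + 1e-12      # lower cost -> larger weight
    picks = rng.choice(len(pop), size=len(pop) - 1, p=weights / weights.sum())
    return np.vstack((pop[elite], pop[picks]))

def partition_crossover(pop, n_sub):
    """Crossover under population partitioning: dimensions and individuals
    are split into n_sub random groups, and an arithmetic blend is applied
    within each subpopulation on its own block of dimensions, so every
    crossover works on a reduced subspace."""
    n, dim = pop.shape
    out = pop.copy()
    dim_groups = np.array_split(rng.permutation(dim), n_sub)
    sub_pops = np.array_split(rng.permutation(n), n_sub)
    for sub, grp in zip(sub_pops, dim_groups):
        partners = rng.permutation(sub)          # crossover partners within the subpopulation
        alpha = rng.random((len(sub), len(grp)))
        out[np.ix_(sub, grp)] = (alpha * pop[np.ix_(sub, grp)]
                                 + (1 - alpha) * pop[np.ix_(partners, grp)])
    return out

def mutate_elites(pop, n_elite, lb, ub, sigma=0.1):
    """Gaussian mutation of the first n_elite rows, clipped to the bounds."""
    out = pop.copy()
    out[:n_elite] += rng.normal(0.0, sigma * (ub - lb), size=(n_elite, pop.shape[1]))
    return np.clip(out, lb, ub)
```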

6. Conclusion

In order to overcome the shortcoming of the GWO algorithm that it easily falls into local optima when solving large-scale global optimization problems, this paper proposes a Hybrid Genetic Grey Wolf Algorithm (HGGWA) that integrates three genetic operators into the algorithm. The improved algorithm initializes the population based on the Opposition-Based Learning strategy to improve the search efficiency of the algorithm; the strategy of population division based on cooperative coevolution reduces the scale of the problem; and the mutation operation on the elite individuals effectively prevents the algorithm from falling into local optima. The performance of HGGWA has been evaluated using 10 classical benchmark functions and 7 CEC'2008 high-dimensional functions. From our experimental results, several conclusions can be drawn.

The results have shown that HGGWA is capable of grouping interacting variables with great accuracy for the majority of the benchmark functions. A comparative study between HGGWA and other state-of-the-art algorithms was conducted, and the experimental results showed that the HGGWA algorithm exhibits better global convergence whether it is solving separable functions or nonseparable functions.

(1) In tests on 10 classical benchmark functions, compared with the results of WOA, SSA, and ALO, the HGGWA algorithm is greatly improved in convergence accuracy. Except for the Schwefel problem and the Rosenbrock function, the HGGWA algorithm achieves an 80%-100% success ratio on all benchmark functions with 100 dimensions, which proves the effectiveness of HGGWA for solving high-dimensional functions.

(2) In the comparison on seven CEC'2008 large-scale global optimization problems, the results show that the HGGWA algorithm achieves the global optimal value for the separable functions F1, F4, and F6, while its effect on the other, nonseparable problems is not very satisfactory. However, the HGGWA algorithm still exhibits better global convergence than the other algorithms (CCPSO2, DEwSAcc, MLCC, and EPUS-PSO) on most of the functions.


Table 6: Comparison of optimization performance of different k values for HGGWA (mean and standard deviation).

Function | k=0.5: Mean (Std)     | k=1: Mean (Std)       | k=1.5: Mean (Std)     | k=2: Mean (Std)
f1       | 8.43E-60 (5.36E-60)   | 2.25E-54 (2.03E-54)   | 5.58E-53 (4.31E-53)   | 2.31E-51 (3.16E-51)
f2       | 2.89E-34 (2.31E-34)   | 2.06E-33 (1.89E-33)   | 4.70E-32 (3.65E-32)   | 5.76E-30 (4.22E-30)
f3       | 1.11E+01 (1.02E+00)   | 3.14E+01 (2.19E+01)   | 4.03E-01 (2.38E-01)   | 3.60E+01 (2.16E+01)
f4       | 9.78E+01 (8.36E+01)   | 9.82E+01 (7.43E+01)   | 9.81E+01 (7.34E+01)   | 9.83E+01 (4.61E+01)
f5       | 2.18E-03 (1.34E-03)   | 8.87E-02 (6.67E-02)   | 7.81E+00 (6.13E+00)   | 1.74E-02 (2.34E-02)
f6       | 1.31E-03 (1.13E-03)   | 1.62E-03 (2.43E-04)   | 2.99E-03 (1.04E-03)   | 1.84E-03 (1.34E-03)
f7       | 0.00E+00 (0.00E+00)   | 0.00E+00 (0.00E+00)   | 0.00E+00 (0.00E+00)   | 0.00E+00 (0.00E+00)
f8       | 1.58E-14 (1.23E-15)   | 2.93E-14 (1.37E-14)   | 2.22E-14 (1.49E-14)   | 2.51E-14 (1.08E-14)
f9       | 0.00E+00 (0.00E+00)   | 0.00E+00 (0.00E+00)   | 0.00E+00 (0.00E+00)   | 0.00E+00 (0.00E+00)
f10      | 2.99E-02 (2.49E-02)   | 3.38E-02 (1.24E-03)   | 8.42E-02 (6.14E-02)   | 7.29E-02 (8.13E-03)

(3) An analysis of the running time of several algorithms shows that the convergence time of the HGGWA algorithm has obvious advantages over recently proposed algorithms such as SSA and WOA. For the functions f1, f2, f6, f7, f8, f9, and f10, the running time of the HGGWA algorithm stays between 1 s and 3 s under the specified convergence accuracy, which shows the excellent stability of HGGWA.

(4) In the future, we plan to investigate more efficient population partition mechanisms that adapt to different nonseparable problems in the crossover operation. We are also interested in applying HGGWA to real-world problems to ascertain its true potential as a valuable optimization technique for large-scale optimization, such as the setup of large-scale multilayer sensor networks [73] in the Internet of Things.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Authors' Contributions

Qinghua Gu and Xuexian Li contributed equally to this work.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Grant Nos. 51774228 and 51404182), the Natural Science Foundation of Shaanxi Province (2017JM5043), and the Foundation of Shaanxi Educational Committee (17JK0425).

References

[1] R. Zhang and C. Wu, "A hybrid approach to large-scale job shop scheduling," Applied Intelligence, vol. 32, no. 1, pp. 47-59, 2010.
[2] M. Gendreau and C. D. Tarantilis, Solving Large-Scale Vehicle Routing Problems with Time Windows: The State-of-the-Art, Cirrelt, Montreal, 2010.
[3] M. B. Liu, S. K. Tso, and Y. Cheng, "An extended nonlinear primal-dual interior-point algorithm for reactive-power optimization of large-scale power systems with discrete control variables," IEEE Power Engineering Review, vol. 22, no. 9, p. 56, 2002.
[4] I. Fister, I. Fister Jr., J. Brest, and V. Zumer, "Memetic artificial bee colony algorithm for large-scale global optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '12), pp. 1-8, IEEE, Brisbane, Australia, June 2012.
[5] A. Zamuda, J. Brest, B. Bosovic, and V. Zumer, "Large scale global optimization using differential evolution with self-adaptation and cooperative co-evolution," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), IEEE, June 2008.
[6] T. Weise, R. Chiong, and K. Tang, "Evolutionary optimization: pitfalls and booby traps," Journal of Computer Science and Technology, vol. 27, no. 5, pp. 907-936, 2012.
[7] L. Lafleur, Descartes: Discourse on Method, Pearson Schweiz Ag, Zug, Switzerland, 1956.
[8] Y.-S. Ong, M. H. Lim, and X. Chen, "Memetic computation—past, present & future [Research Frontier]," IEEE Computational Intelligence Magazine, vol. 5, no. 2, pp. 24-31, 2010.
[9] N. Krasnogor and J. Smith, "A tutorial for competent memetic algorithms: model, taxonomy and design issues," IEEE Transactions on Evolutionary Computation, vol. 9, no. 5, pp. 474-488, 2005.
[10] S. Jiang, M. Lian, C. Lu, Q. Gu, S. Ruan, and X. Xie, "Ensemble prediction algorithm of anomaly monitoring based on big data analysis platform of open-pit mine slope," Complexity, vol. 2018, Article ID 1048756, 13 pages, 2018.
[11] R. S. Rahnamayan, H. R. Tizhoosh, and M. M. A. Salama, "Opposition-based differential evolution," IEEE Transactions on Evolutionary Computation, vol. 12, no. 1, pp. 64-79, 2008.
[12] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46-61, 2014.
[13] E. Daniel, J. Anitha, K. K. Kamaleshwaran, and I. Rani, "Optimum spectrum mask based medical image fusion using gray wolf optimization," Biomedical Signal Processing and Control, vol. 34, pp. 36-43, 2017.
[14] A. A. M. El-Gaafary, Y. S. Mohamed, A. M. Hemeida, and A.-A. A. Mohamed, "Grey wolf optimization for multi input multi output system," Universal Journal of Communications and Network, vol. 3, no. 1, pp. 1-6, 2015.
[15] G. M. Komaki and V. Kayvanfar, "Grey wolf optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109-120, 2015.
[16] S. Jiang, M. Lian, C. Lu et al., "SVM-DS fusion based soft fault detection and diagnosis in solar water heaters," Energy Exploration & Exploitation, 2018.
[17] Q. Gu, X. Li, C. Lu et al., "Hybrid genetic grey wolf algorithm for high dimensional complex function optimization," Control and Decision, pp. 1-8, 2019.
[18] M. A. Potter and K. A. De Jong, "A cooperative coevolutionary approach to function optimization," in Parallel Problem Solving from Nature—PPSN III, vol. 866 of Lecture Notes in Computer Science, pp. 249-257, Springer, Berlin, Germany, 1994.
[19] M. A. Potter, The Design and Analysis of a Computational Model of Cooperative Coevolution, George Mason University, Fairfax, Va, USA, 1997.
[20] F. van den Bergh and A. P. Engelbrecht, "A cooperative approach to particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225-239, 2004.
[21] M. El-Abd, "A cooperative approach to the artificial bee colony algorithm," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2010), IEEE, Spain, July 2010.
[22] Y. Shi, H. Teng, and Z. Li, "Cooperative co-evolutionary differential evolution for function optimization," in Proceedings of the 1st International Conference on Natural Computation (ICNC '05), Lecture Notes in Computer Science, pp. 1080-1088, August 2005.
[23] Z. Yang, K. Tang, and X. Yao, "Large scale evolutionary optimization using cooperative coevolution," Information Sciences, vol. 178, no. 15, pp. 2985-2999, 2008.
[24] X. Li and X. Yao, "Tackling high dimensional nonseparable optimization problems by cooperatively coevolving particle swarms," in Proceedings of the Congress on Evolutionary Computation (CEC '09), pp. 1546-1553, IEEE, Trondheim, Norway, May 2009.
[25] M. N. Omidvar, X. Li, Z. Yang, and X. Yao, "Cooperative co-evolution for large scale optimization through more frequent random grouping," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2010), Spain, July 2010.
[26] T. Ray and X. Yao, "A cooperative coevolutionary algorithm with correlation based adaptive variable partitioning," in Proceedings of the 2009 IEEE Congress on Evolutionary Computation (CEC), pp. 983-989, IEEE, Trondheim, Norway, May 2009.
[27] W. Chen, T. Weise, Z. Yang, and K. Tang, "Large-scale global optimization using cooperative coevolution with variable interaction learning," in Lecture Notes in Computer Science, vol. 6239, pp. 300-309, 2010.
[28] M. N. Omidvar, X. Li, Y. Mei, and X. Yao, "Cooperative co-evolution with differential grouping for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 18, no. 3, pp. 378-393, 2014.
[29] M. N. Omidvar, X. Li, and X. Yao, "Smart use of computational resources based on contribution for cooperative co-evolutionary algorithms," in Proceedings of the 13th Annual Genetic and Evolutionary Computation Conference (GECCO '11), pp. 1115-1122, IEEE, Ireland, July 2011.
[30] R. Storn, "On the usage of differential evolution for function optimization," in Proceedings of the Biennial Conference of the North American Fuzzy Information Processing Society (NAFIPS '96), pp. 519-523, June 1996.
[31] X. Li and X. Yao, "Cooperatively coevolving particle swarms for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 16, no. 2, pp. 210-224, 2012.
[32] K. Weicker and N. Weicker, "On the improvement of coevolutionary optimizers by learning variable interdependencies," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC 1999), pp. 1627-1632, IEEE, July 1999.
[33] E. Sayed, D. Essam, and R. Sarker, "Using hybrid dependency identification with a memetic algorithm for large scale optimization problems," in Lecture Notes in Computer Science, vol. 7673, pp. 168-177, 2012.
[34] Y. Liu, X. Yao, Q. Zhao, and T. Higuchi, "Scaling up fast evolutionary programming with cooperative coevolution," in Proceedings of the 2001 Congress on Evolutionary Computation, pp. 1101-1108, IEEE, Seoul, South Korea, 2001.
[35] E. Sayed, D. Essam, and R. Sarker, "Dependency identification technique for large scale optimization problems," in Proceedings of the 2012 IEEE Congress on Evolutionary Computation (CEC 2012), Australia, June 2012.
[36] Y. Ren and Y. Wu, "An efficient algorithm for high-dimensional function optimization," Soft Computing, vol. 17, no. 6, pp. 995-1004, 2013.
[37] J. Liu and K. Tang, "Scaling up covariance matrix adaptation evolution strategy using cooperative coevolution," in Proceedings of the International Conference on Intelligent Data Engineering and Automated Learning (IDEAL 2013), vol. 8206 of Lecture Notes in Computer Science, pp. 350-357, Springer, Berlin, Germany, 2013.
[38] Z. Yang, K. Tang, and X. Yao, "Multilevel cooperative coevolution for large scale optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), pp. 1663-1670, June 2008.
[39] M. N. Omidvar, Y. Mei, and X. Li, "Effective decomposition of large-scale separable continuous functions for cooperative co-evolutionary algorithms," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC 2014), pp. 1305-1312, China, July 2014.
[40] S. Mahdavi, E. M. Shiri, and S. Rahnamayan, "Cooperative co-evolution with a new decomposition method for large-scale optimization," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC), 2014.
[41] B. Kazimipour, M. N. Omidvar, X. Li, and A. K. Qin, "A sensitivity analysis of contribution-based cooperative co-evolutionary algorithms," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2015), pp. 417-424, Japan, May 2015.
[42] Z. Cao, L. Wang, Y. Shi et al., "An effective cooperative coevolution framework integrating global and local search for large scale optimization problems," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2015), pp. 1986-1993, Japan, May 2015.
[43] X. Peng, K. Liu, and Y. Jin, "A dynamic optimization approach to the design of cooperative co-evolutionary algorithms," Knowledge-Based Systems, vol. 109, pp. 174-186, 2016.
[44] H. K. Singh and T. Ray, "Divide and conquer in coevolution: a difficult balancing act," in Agent-Based Evolutionary Search, vol. 5 of Adaptation, Learning, and Optimization, pp. 117-138, Springer, Berlin, Germany, 2010.
[45] X. Peng and Y. Wu, "Large-scale cooperative co-evolution using niching-based multi-modal optimization and adaptive fast clustering," Swarm & Evolutionary Computation, vol. 35, 2017.
[46] M. Yang, M. N. Omidvar, C. Li et al., "Efficient resource allocation in cooperative co-evolution for large-scale global optimization," IEEE Transactions on Evolutionary Computation, vol. 21, no. 4, pp. 493-505, 2017.
[47] M. N. Omidvar, X. Li, and X. Yao, "Cooperative co-evolution with delta grouping for large scale non-separable function optimization," in Proceedings of the 2010 IEEE Congress on Evolutionary Computation (CEC 2010), pp. 1-8, Barcelona, Spain, July 2010.
[48] X. Peng and Y. Wu, "Enhancing cooperative coevolution with selective multiple populations for large-scale global optimization," Complexity, vol. 2018, Article ID 9267054, 15 pages, 2018.
[49] S.-T. Hsieh, T.-Y. Sun, C.-C. Liu, and S.-J. Tsai, "Solving large scale global optimization using improved particle swarm optimizer," in Proceedings of the 2008 IEEE Congress on Evolutionary Computation (CEC 2008), pp. 1777-1784, China, June 2008.
[50] T. Hatanaka, T. Korenaga, N. Kondo et al., Search Performance Improvement for PSO in High Dimensional Space, InTech, 2009.
[51] J. Fan, J. Wang, and M. Han, "Cooperative coevolution for large-scale optimization based on kernel fuzzy clustering and variable trust region methods," IEEE Transactions on Fuzzy Systems, vol. 22, no. 4, pp. 829-839, 2014.
[52] A.-R. Hedar and A. F. Ali, "Genetic algorithm with population partitioning and space reduction for high dimensional problems," in Proceedings of the 2009 International Conference on Computer Engineering and Systems (ICCES '09), pp. 151-156, Egypt, December 2009.
[53] J. Q. Zhang and A. C. Sanderson, "JADE: adaptive differential evolution with optional external archive," IEEE Transactions on Evolutionary Computation, vol. 13, no. 5, pp. 945-958, 2009.
[54] L. M. Hvattum and F. Glover, "Finding local optima of high-dimensional functions using direct search methods," European Journal of Operational Research, vol. 195, no. 1, pp. 31-45, 2009.
[55] C. Liu and B. Li, "Memetic algorithm with adaptive local search depth for large scale global optimization," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC 2014), pp. 82-88, China, July 2014.
[56] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257-1263, 2015.
[57] A. Zhu, C. Xu, Z. Li, J. Wu, and Z. Liu, "Hybridizing grey wolf optimization with differential evolution for global optimization and test scheduling for 3D stacked SoC," Journal of Systems Engineering and Electronics, vol. 26, no. 2, pp. 317-328, 2015.
[58] V. Chahar and D. Kumar, "An astrophysics-inspired grey wolf algorithm for numerical optimization and its application to engineering design problems," Advances in Engineering Software, vol. 112, pp. 231-254, 2017.
[59] D. Guha, P. K. Roy, and S. Banerjee, "Load frequency control of large scale power system using quasi-oppositional grey wolf optimization algorithm," Engineering Science and Technology, an International Journal, vol. 19, no. 4, pp. 1693-1713, 2016.
[60] C. Lu, S. Xiao, X. Li, and L. Gao, "An effective multi-objective discrete grey wolf optimizer for a real-world scheduling problem in welding production," Advances in Engineering Software, vol. 99, pp. 161-176, 2016.
[61] E. Emary, H. M. Zawbaa, and A. E. Hassanien, "Binary grey wolf optimization approaches for feature selection," Neurocomputing, vol. 172, pp. 371-381, 2016.
[62] W. Long, J. Jiao, X. Liang, and M. Tang, "Inspired grey wolf optimizer for solving large-scale function optimization problems," Applied Mathematical Modelling, vol. 60, pp. 112-126, 2018.
[63] S. Gupta and K. Deep, "Performance of grey wolf optimizer on large scale problems," in Proceedings of Mathematical Sciences and Its Applications, American Institute of Physics Conference Series, Uttar Pradesh, India, 2017.
[64] R. L. Haupt and S. E. Haupt, "Practical genetic algorithms," Journal of the American Statistical Association, vol. 100, no. 100, p. 253, 2005.
[65] H. R. Tizhoosh, "Opposition-based learning: a new scheme for machine intelligence," in Proceedings of the International Conference on Computational Intelligence for Modelling, Control and Automation (CIMCA) and International Conference on Intelligent Agents, Web Technologies and Internet Commerce (IAWTIC '05), pp. 695-701, Vienna, Austria, November 2005.
[66] H. Wang, Z. Wu, S. Rahnamayan, Y. Liu, and M. Ventresca, "Enhancing particle swarm optimization using generalized opposition-based learning," Information Sciences, vol. 181, no. 20, pp. 4699-4714, 2011.
[67] H. Wang, S. Rahnamayan, and Z. Wu, "Parallel differential evolution with self-adapting control parameters and generalized opposition-based learning for solving high-dimensional optimization problems," Journal of Parallel and Distributed Computing, vol. 73, no. 1, pp. 62-73, 2013.
[68] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82-102, 1999.
[69] A. P. Engelbrecht, Fundamentals of Computational Swarm Intelligence, John Wiley & Sons, 2006.
[70] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51-67, 2016.
[71] S. Mirjalili, A. H. Gandomi, S. Z. Mirjalili, S. Saremi, H. Faris, and S. M. Mirjalili, "Salp Swarm Algorithm: a bio-inspired optimizer for engineering design problems," Advances in Engineering Software, vol. 114, pp. 163-191, 2017.
[72] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80-98, 2015.
[73] Q. Gu, S. Jiang, M. Lian, and C. Lu, "Health and safety situation awareness model and emergency management based on multi-sensor signal fusion," IEEE Access, vol. 7, pp. 958-968, 2019.


[37] J Liu and K Tang ldquoScaling up covariance matrix adaptationevolution strategy using cooperative coevolutionrdquo in Proceed-ings of the International Conference on Intelligent Data Engi-neering and Automated Learning ndash IDEAL 2013 vol 8206 of

LectureNotes in Computer Science pp 350ndash357 Springer BerlinGermany 2013

[38] Z Yang K Tang and X Yao ldquoMultilevel cooperative coevo-lution for large scale optimizationrdquo in Proceedings of the IEEECongress on Evolutionary Computation (CEC rsquo08) pp 1663ndash1670 June 2008

[39] M N Omidvar Y Mei and X Li ldquoEffective decompositionof large-scale separable continuous functions for cooperativeco-evolutionary algorithmsrdquo in Proceedings of the 2014 IEEECongress on Evolutionary Computation CEC 2014 pp 1305ndash1312 China July 2014

[40] S Mahdavi E M Shiri and S Rahnamayan ldquoCooperativeCo-evolution with a new decomposition method for large-scale optimizationrdquo in Proceedings of the 2014 IEEE Congress onEvolutionary Computation (CEC) 2014

[41] B Kazimipour MN Omidvar X Li andA K Qin ldquoA sensitiv-ity analysis of contribution-based cooperative co-evolutionaryalgorithmsrdquo in Proceedings of the IEEECongress on EvolutionaryComputation CEC 2015 pp 417ndash424 Japan May 2015

[42] Z Cao L Wang Y Shi et al ldquoAn effective cooperativecoevolution framework integrating global and local search forlarge scale optimization problemsrdquo in Proceedings of the IEEECongress on Evolutionary Computation CEC 2015 pp 1986ndash1993 Japan May 2015

[43] X Peng K Liu and Y Jin ldquoA dynamic optimization approachto the design of cooperative co-evolutionary algorithmsrdquoKnow-ledge-Based Systems vol 109 pp 174ndash186 2016

[44] H K Singh and T Ray ldquoDivide and conquer in coevolutiona difficult balancing actrdquo in Agent-Based Evolutionary Searchvol 5 of Adaptation Learning and Optimization pp 117ndash138Springer Berlin Germany 2010

[45] X Peng and Y Wu ldquoLarge-scale cooperative co-evolutionusing niching-based multi-modal optimization and adaptivefast clusteringrdquo Swarm amp Evolutionary Computation vol 352017

[46] M Yang M N Omidvar C Li et al ldquoEfficient resource allo-cation in cooperative co-evolution for large-scale global opti-mizationrdquo IEEE Transactions on Evolutionary Computation vol21 no 4 pp 493ndash505 2017

[47] M N Omidvar X Li and X Yao ldquoCooperative co-evolutionwith delta grouping for large scale non-separable functionoptimizationrdquo in Proceedings of the 2010 IEEE Congress onEvolutionary Computation (CEC 2010) pp 1ndash8 BarcelonaSpain July 2010

[48] X Peng and Y Wu ldquoEnhancing cooperative coevolution withselective multiple populations for large-scale global optimiza-tionrdquo Complexity vol 2018 Article ID 9267054 15 pages 2018

[49] S-T Hsieh T-Y Sun C-C Liu and S-J Tsai ldquoSolving largescale global optimization using improved particle swarm opti-mizerrdquo in Proceedings of the 2008 IEEECongress on EvolutionaryComputation CEC 2008 pp 1777ndash1784 China June 2008

[50] T Hatanaka T Korenaga N Kondo et al Search PerformanceImprovement for PSO in High Dimensional Space InTech 2009

[51] J Fan JWang andMHan ldquoCooperative coevolution for large-scale optimization based on kernel fuzzy clustering and variabletrust region methodsrdquo IEEE Transactions on Fuzzy Systems vol22 no 4 pp 829ndash839 2014

[52] A-R Hedar and A F Ali ldquoGenetic algorithm with popula-tion partitioning and space reduction for high dimensionalproblemsrdquo in Proceedings of the 2009 International Conferenceon Computer Engineering and Systems ICCESrsquo09 pp 151ndash156Egypt December 2009

18 Complexity

[53] J Q Zhang and A C Sanderson ldquoJADE adaptive differentialevolution with optional external archiverdquo IEEE Transactions onEvolutionary Computation vol 13 no 5 pp 945ndash958 2009

[54] L M Hvattum and F Glover ldquoFinding local optima of high-dimensional functions using direct search methodsrdquo EuropeanJournal of Operational Research vol 195 no 1 pp 31ndash45 2009

[55] C Liu and B Li ldquoMemetic algorithm with adaptive local searchdepth for large scale global optimizationrdquo in Proceedings of the2014 IEEECongress on Evolutionary Computation CEC2014 pp82ndash88 China July 2014

[56] S Saremi S Z Mirjalili and S MMirjalili ldquoEvolutionary pop-ulation dynamics and grey wolf optimizerrdquo Neural Computingand Applications vol 26 no 5 pp 1257ndash1263 2015

[57] A Zhu C Xu Z Li J Wu and Z Liu ldquoHybridizing grey wolfoptimization with differential evolution for global optimizationand test scheduling for 3D stacked SoCrdquo Journal of SystemsEngineering and Electronics vol 26 no 2 pp 317ndash328 2015

[58] V Chahar and D Kumar ldquoAn astrophysics-inspired grey wolfalgorithm for numerical optimization and its application toengineering design problemsrdquo Advances in Engineering Soft-ware vol 112 pp 231ndash254 2017

[59] D Guha P K Roy and S Banerjee ldquoLoad frequency controlof large scale power system using quasi-oppositional grey wolfoptimization algorithmrdquoEngineering Science andTechnology anInternational Journal vol 19 no 4 pp 1693ndash1713 2016

[60] C Lu S Xiao X Li and L Gao ldquoAn effective multi-objectivediscrete grey wolf optimizer for a real-world scheduling prob-lem in welding productionrdquo Advances in Engineering Softwarevol 99 pp 161ndash176 2016

[61] E Emary H M Zawbaa and A E Hassanien ldquoBinary greywolf optimization approaches for feature selectionrdquoNeurocom-puting vol 172 pp 371ndash381 2016

[62] W Long J Jiao X Liang andM Tang ldquoInspired grey wolf opti-mizer for solving large-scale function optimization problemsrdquoApplied Mathematical Modelling Simulation and Computationfor Engineering and Environmental Systems vol 60 pp 112ndash1262018

[63] S Gupta and K Deep ldquoPerformance of grey wolf optimizer onlarge scale problemsrdquo in Proceedings of the Mathematical Sci-ences and Its Applications American Institute of Physics Con-ference Series Uttar Pradesh India 2017

[64] R L Haupt and S E Haupt ldquoPractical genetic algorithmsrdquoJournal of the American Statistical Association vol 100 no 100p 253 2005

[65] H R Tizhoosh ldquoOpposition-based learning a new scheme formachine intelligencerdquo in Proceedings of the International Con-ference on Computational Intelligence for Modelling Control andAutomation CIMCA and International Conference on IntelligentAgents Web Technologies and Internet Commerce (IAWTIC rsquo05)pp 695ndash701 Vienna Austria November 2005

[66] H Wang Z Wu S Rahnamayan Y Liu and M VentrescaldquoEnhancing particle swarm optimization using generalizedopposition-based learningrdquo Information Sciences vol 181 no20 pp 4699ndash4714 2011

[67] H Wang S Rahnamayan and ZWu ldquoParallel differential evo-lution with self-adapting control parameters and generalizedopposition-based learning for solving high-dimensional opti-mization problemsrdquo Journal of Parallel andDistributed Comput-ing vol 73 no 1 pp 62ndash73 2013

[68] X Yao Y Liu and G Lin ldquoEvolutionary programming madefasterrdquo IEEE Transactions on Evolutionary Computation vol 3no 2 pp 82ndash102 1999

[69] A P Engelbrecht Fundamentals of Computational SwarmIntelligence John Wiley amp Sons 2006

[70] S Mirjalili and A Lewis ldquoThe whale optimization algorithmrdquoAdvances in Engineering Software vol 95 pp 51ndash67 2016

[71] S Mirjalili A H Gandomi S Z Mirjalili S Saremi H Farisand S M Mirjalili ldquoSalp Swarm Algorithm A bio-inspiredoptimizer for engineering design problemsrdquo Advances in Engi-neering Software vol 114 pp 163ndash191 2017

[72] S Mirjalili ldquoThe ant lion optimizerrdquo Advances in EngineeringSoftware vol 83 pp 80ndash98 2015

[73] Q Gu S Jiang M Lian and C Lu ldquoHealth and safety situationawareness model and emergency management based on multi-sensor signal fusionrdquo IEEE Access vol 7 pp 958ndash968 2019

Hindawiwwwhindawicom Volume 2018

MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Mathematical Problems in Engineering

Applied MathematicsJournal of

Hindawiwwwhindawicom Volume 2018

Probability and StatisticsHindawiwwwhindawicom Volume 2018

Journal of

Hindawiwwwhindawicom Volume 2018

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawiwwwhindawicom Volume 2018

OptimizationJournal of

Hindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom Volume 2018

Engineering Mathematics

International Journal of

Hindawiwwwhindawicom Volume 2018

Operations ResearchAdvances in

Journal of

Hindawiwwwhindawicom Volume 2018

Function SpacesAbstract and Applied AnalysisHindawiwwwhindawicom Volume 2018

International Journal of Mathematics and Mathematical Sciences

Hindawiwwwhindawicom Volume 2018

Hindawi Publishing Corporation httpwwwhindawicom Volume 2013Hindawiwwwhindawicom

The Scientific World Journal

Volume 2018

Hindawiwwwhindawicom Volume 2018Volume 2018

Numerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisNumerical AnalysisAdvances inAdvances in Discrete Dynamics in

Nature and SocietyHindawiwwwhindawicom Volume 2018

Hindawiwwwhindawicom

Dierential EquationsInternational Journal of

Volume 2018

Hindawiwwwhindawicom Volume 2018

Decision SciencesAdvances in

Hindawiwwwhindawicom Volume 2018

AnalysisInternational Journal of

Hindawiwwwhindawicom Volume 2018

Stochastic AnalysisInternational Journal of

Submit your manuscripts atwwwhindawicom

Page 16: Hybrid Genetic Grey Wolf Algorithm for Large-Scale Global ...downloads.hindawi.com/journals/complexity/2019/2653512.pdf · metaheuristic algorithms such as genetic algorithm (GA),

16 Complexity

(3) By analyzing the running time of the algorithms, the results show that the convergence time of the HGGWA algorithm has a clear advantage over recently proposed algorithms such as SSA and WOA. For the functions f1, f2, f6, f7, f8, f9, and f10, the running time of the HGGWA algorithm stays between 1 s and 3 s at the specified convergence accuracy, which demonstrates the excellent stability of HGGWA.

(4) In the future, we plan to investigate more efficient population partition mechanisms to adapt to different nonseparable problems in the crossover operation. We are also interested in applying HGGWA to real-world problems to ascertain its true potential as a valuable optimization technique for large-scale optimization, such as the setup of large-scale multilayer sensor networks [73] in the Internet of Things.
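To give a concrete picture of the partition-and-crossover step referred to in item (4), the following minimal Python sketch partitions a population into subpopulations and recombines individuals within each group. It is an illustrative assumption only: the function names, the random grouping, and the arithmetic crossover are placeholders, not the authors' exact operators.

    import numpy as np

    def partition_population(population, num_groups, rng):
        # Randomly split the population indices into num_groups
        # subpopulations (hypothetical sketch of the partition step).
        indices = rng.permutation(len(population))
        return np.array_split(indices, num_groups)

    def arithmetic_crossover(parent_a, parent_b, rng):
        # Dimension-wise arithmetic crossover between two parents,
        # used here as a stand-in for the paper's cross-operation.
        alpha = rng.random(parent_a.shape)  # per-dimension mixing weights
        return alpha * parent_a + (1.0 - alpha) * parent_b

    rng = np.random.default_rng(42)
    pop = rng.uniform(-100, 100, size=(20, 1000))  # 20 wolves, 1000 variables

    for group in partition_population(pop, num_groups=4, rng=rng):
        # Pair up individuals inside each subpopulation and recombine.
        for i in range(0, len(group) - 1, 2):
            a, b = group[i], group[i + 1]
            pop[a] = arithmetic_crossover(pop[a], pop[b], rng)

Grouping crossover within subpopulations keeps recombination local, which is one plausible way to preserve diversity on nonseparable problems; the open question raised in item (4) is how to choose the grouping itself.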

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Authors' Contributions

Qinghua Gu and Xuexian Li contributed equally to this work.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Grant Nos. 51774228 and 51404182), the Natural Science Foundation of Shaanxi Province (2017JM5043), and the Foundation of Shaanxi Educational Committee (17JK0425).

References

[1] R. Zhang and C. Wu, "A hybrid approach to large-scale job shop scheduling," Applied Intelligence, vol. 32, no. 1, pp. 47–59, 2010.

[2] M. Gendreau and C. D. Tarantilis, Solving Large-Scale Vehicle Routing Problems with Time Windows: The State-of-the-Art, CIRRELT, Montreal, Canada, 2010.

[3] M. B. Liu, S. K. Tso, and Y. Cheng, "An extended nonlinear primal-dual interior-point algorithm for reactive-power optimization of large-scale power systems with discrete control variables," IEEE Power Engineering Review, vol. 22, no. 9, p. 56, 2002.

[4] I. Fister, I. Fister Jr., J. Brest, and V. Zumer, "Memetic artificial bee colony algorithm for large-scale global optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '12), pp. 1–8, IEEE, Brisbane, Australia, June 2012.

[5] A. Zamuda, J. Brest, B. Bosovic, and V. Zumer, "Large scale global optimization using differential evolution with self-adaptation and cooperative co-evolution," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), IEEE, June 2008.

[6] T. Weise, R. Chiong, and K. Tang, "Evolutionary optimization: pitfalls and booby traps," Journal of Computer Science and Technology, vol. 27, no. 5, pp. 907–936, 2012.

[7] L. Lafleur, Descartes: Discourse On Method, Pearson Schweiz Ag, Zug, Switzerland, 1956.

[8] Y.-S. Ong, M. H. Lim, and X. Chen, "Memetic computation—past, present & future [Research Frontier]," IEEE Computational Intelligence Magazine, vol. 5, no. 2, pp. 24–31, 2010.

[9] N. Krasnogor and J. Smith, "A tutorial for competent memetic algorithms: model, taxonomy, and design issues," IEEE Transactions on Evolutionary Computation, vol. 9, no. 5, pp. 474–488, 2005.

[10] S. Jiang, M. Lian, C. Lu, Q. Gu, S. Ruan, and X. Xie, "Ensemble prediction algorithm of anomaly monitoring based on big data analysis platform of open-pit mine slope," Complexity, vol. 2018, Article ID 1048756, 13 pages, 2018.

[11] R. S. Rahnamayan, H. R. Tizhoosh, and M. M. A. Salama, "Opposition-based differential evolution," IEEE Transactions on Evolutionary Computation, vol. 12, no. 1, pp. 64–79, 2008.

[12] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.

[13] E. Daniel, J. Anitha, K. K. Kamaleshwaran, and I. Rani, "Optimum spectrum mask based medical image fusion using gray wolf optimization," Biomedical Signal Processing and Control, vol. 34, pp. 36–43, 2017.

[14] A. A. M. El-Gaafary, Y. S. Mohamed, A. M. Hemeida, and A.-A. A. Mohamed, "Grey wolf optimization for multi input multi output system," Universal Journal of Communications and Network, vol. 3, no. 1, pp. 1–6, 2015.

[15] G. M. Komaki and V. Kayvanfar, "Grey wolf optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time," Journal of Computational Science, vol. 8, pp. 109–120, 2015.

[16] S. Jiang, M. Lian, C. Lu et al., "SVM-DS fusion based soft fault detection and diagnosis in solar water heaters," Energy Exploration & Exploitation, 2018.

[17] Q. Gu, X. Li, C. Lu et al., "Hybrid genetic grey wolf algorithm for high dimensional complex function optimization," Control and Decision, pp. 1–8, 2019.

[18] M. A. Potter and K. A. D. Jong, "A cooperative coevolutionary approach to function optimization," in Parallel Problem Solving from Nature—PPSN III, vol. 866 of Lecture Notes in Computer Science, pp. 249–257, Springer, Berlin, Germany, 1994.

[19] M. A. Potter, The Design and Analysis of a Computational Model of Cooperative Coevolution, George Mason University, Fairfax, Va, USA, 1997.

[20] F. van den Bergh and A. P. Engelbrecht, "A cooperative approach to particle swarm optimization," IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225–239, 2004.

[21] M. El-Abd, "A cooperative approach to the artificial bee colony algorithm," in Proceedings of the 2010 IEEE World Congress on Computational Intelligence (WCCI 2010) – 2010 IEEE Congress on Evolutionary Computation (CEC 2010), IEEE, Spain, July 2010.

[22] Y. Shi, H. Teng, and Z. Li, "Cooperative co-evolutionary differential evolution for function optimization," in Proceedings of the 1st International Conference on Natural Computation (ICNC '05), Lecture Notes in Computer Science, pp. 1080–1088, August 2005.

[23] Z. Yang, K. Tang, and X. Yao, "Large scale evolutionary optimization using cooperative coevolution," Information Sciences, vol. 178, no. 15, pp. 2985–2999, 2008.

[24] X. Li and X. Yao, "Tackling high dimensional nonseparable optimization problems by cooperatively coevolving particle swarms," in Proceedings of the Congress on Evolutionary Computation (CEC '09), pp. 1546–1553, IEEE, Trondheim, Norway, May 2009.

[25] M. N. Omidvar, X. Li, Z. Yang, and X. Yao, "Cooperative co-evolution for large scale optimization through more frequent random grouping," in Proceedings of the 2010 IEEE World Congress on Computational Intelligence (WCCI 2010) – 2010 IEEE Congress on Evolutionary Computation (CEC 2010), Spain, July 2010.

[26] T. Ray and X. Yao, "A cooperative coevolutionary algorithm with correlation based adaptive variable partitioning," in Proceedings of the 2009 IEEE Congress on Evolutionary Computation (CEC), pp. 983–989, IEEE, Trondheim, Norway, May 2009.

[27] W. Chen, T. Weise, Z. Yang, and K. Tang, "Large-scale global optimization using cooperative coevolution with variable interaction learning," in Lecture Notes in Computer Science, vol. 6239, pp. 300–309, 2010.

[28] M. N. Omidvar, X. Li, Y. Mei, and X. Yao, "Cooperative co-evolution with differential grouping for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 18, no. 3, pp. 378–393, 2014.

[29] M. N. Omidvar, X. Li, and X. Yao, "Smart use of computational resources based on contribution for cooperative co-evolutionary algorithms," in Proceedings of the 13th Annual Genetic and Evolutionary Computation Conference (GECCO '11), pp. 1115–1122, IEEE, Ireland, July 2011.

[30] R. Storn, "On the usage of differential evolution for function optimization," in Proceedings of the Biennial Conference of the North American Fuzzy Information Processing Society (NAFIPS '96), pp. 519–523, June 1996.

[31] X. Li and X. Yao, "Cooperatively coevolving particle swarms for large scale optimization," IEEE Transactions on Evolutionary Computation, vol. 16, no. 2, pp. 210–224, 2012.

[32] K. Weicker and N. Weicker, "On the improvement of coevolutionary optimizers by learning variable interdependencies," in Proceedings of the 1999 Congress on Evolutionary Computation (CEC 1999), pp. 1627–1632, IEEE, July 1999.

[33] E. Sayed, D. Essam, and R. Sarker, "Using hybrid dependency identification with a memetic algorithm for large scale optimization problems," in Lecture Notes in Computer Science, vol. 7673, pp. 168–177, 2012.

[34] Y. Liu, X. Yao, Q. Zhao, and T. Higuchi, "Scaling up fast evolutionary programming with cooperative coevolution," in Proceedings of the 2001 Congress on Evolutionary Computation, pp. 1101–1108, IEEE, Seoul, South Korea, 2001.

[35] E. Sayed, D. Essam, and R. Sarker, "Dependency identification technique for large scale optimization problems," in Proceedings of the 2012 IEEE Congress on Evolutionary Computation (CEC 2012), Australia, June 2012.

[36] Y. Ren and Y. Wu, "An efficient algorithm for high-dimensional function optimization," Soft Computing, vol. 17, no. 6, pp. 995–1004, 2013.

[37] J. Liu and K. Tang, "Scaling up covariance matrix adaptation evolution strategy using cooperative coevolution," in Intelligent Data Engineering and Automated Learning – IDEAL 2013, vol. 8206 of Lecture Notes in Computer Science, pp. 350–357, Springer, Berlin, Germany, 2013.

[38] Z. Yang, K. Tang, and X. Yao, "Multilevel cooperative coevolution for large scale optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), pp. 1663–1670, June 2008.

[39] M. N. Omidvar, Y. Mei, and X. Li, "Effective decomposition of large-scale separable continuous functions for cooperative co-evolutionary algorithms," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC 2014), pp. 1305–1312, China, July 2014.

[40] S. Mahdavi, E. M. Shiri, and S. Rahnamayan, "Cooperative co-evolution with a new decomposition method for large-scale optimization," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC), 2014.

[41] B. Kazimipour, M. N. Omidvar, X. Li, and A. K. Qin, "A sensitivity analysis of contribution-based cooperative co-evolutionary algorithms," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2015), pp. 417–424, Japan, May 2015.

[42] Z. Cao, L. Wang, Y. Shi et al., "An effective cooperative coevolution framework integrating global and local search for large scale optimization problems," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2015), pp. 1986–1993, Japan, May 2015.

[43] X. Peng, K. Liu, and Y. Jin, "A dynamic optimization approach to the design of cooperative co-evolutionary algorithms," Knowledge-Based Systems, vol. 109, pp. 174–186, 2016.

[44] H. K. Singh and T. Ray, "Divide and conquer in coevolution: a difficult balancing act," in Agent-Based Evolutionary Search, vol. 5 of Adaptation, Learning, and Optimization, pp. 117–138, Springer, Berlin, Germany, 2010.

[45] X. Peng and Y. Wu, "Large-scale cooperative co-evolution using niching-based multi-modal optimization and adaptive fast clustering," Swarm & Evolutionary Computation, vol. 35, 2017.

[46] M. Yang, M. N. Omidvar, C. Li et al., "Efficient resource allocation in cooperative co-evolution for large-scale global optimization," IEEE Transactions on Evolutionary Computation, vol. 21, no. 4, pp. 493–505, 2017.

[47] M. N. Omidvar, X. Li, and X. Yao, "Cooperative co-evolution with delta grouping for large scale non-separable function optimization," in Proceedings of the 2010 IEEE Congress on Evolutionary Computation (CEC 2010), pp. 1–8, Barcelona, Spain, July 2010.

[48] X. Peng and Y. Wu, "Enhancing cooperative coevolution with selective multiple populations for large-scale global optimization," Complexity, vol. 2018, Article ID 9267054, 15 pages, 2018.

[49] S.-T. Hsieh, T.-Y. Sun, C.-C. Liu, and S.-J. Tsai, "Solving large scale global optimization using improved particle swarm optimizer," in Proceedings of the 2008 IEEE Congress on Evolutionary Computation (CEC 2008), pp. 1777–1784, China, June 2008.

[50] T. Hatanaka, T. Korenaga, N. Kondo et al., Search Performance Improvement for PSO in High Dimensional Space, InTech, 2009.

[51] J. Fan, J. Wang, and M. Han, "Cooperative coevolution for large-scale optimization based on kernel fuzzy clustering and variable trust region methods," IEEE Transactions on Fuzzy Systems, vol. 22, no. 4, pp. 829–839, 2014.

[52] A.-R. Hedar and A. F. Ali, "Genetic algorithm with population partitioning and space reduction for high dimensional problems," in Proceedings of the 2009 International Conference on Computer Engineering and Systems (ICCES '09), pp. 151–156, Egypt, December 2009.

[53] J. Q. Zhang and A. C. Sanderson, "JADE: adaptive differential evolution with optional external archive," IEEE Transactions on Evolutionary Computation, vol. 13, no. 5, pp. 945–958, 2009.

[54] L. M. Hvattum and F. Glover, "Finding local optima of high-dimensional functions using direct search methods," European Journal of Operational Research, vol. 195, no. 1, pp. 31–45, 2009.

[55] C. Liu and B. Li, "Memetic algorithm with adaptive local search depth for large scale global optimization," in Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC 2014), pp. 82–88, China, July 2014.

[56] S. Saremi, S. Z. Mirjalili, and S. M. Mirjalili, "Evolutionary population dynamics and grey wolf optimizer," Neural Computing and Applications, vol. 26, no. 5, pp. 1257–1263, 2015.

[57] A. Zhu, C. Xu, Z. Li, J. Wu, and Z. Liu, "Hybridizing grey wolf optimization with differential evolution for global optimization and test scheduling for 3D stacked SoC," Journal of Systems Engineering and Electronics, vol. 26, no. 2, pp. 317–328, 2015.

[58] V. Chahar and D. Kumar, "An astrophysics-inspired grey wolf algorithm for numerical optimization and its application to engineering design problems," Advances in Engineering Software, vol. 112, pp. 231–254, 2017.

[59] D. Guha, P. K. Roy, and S. Banerjee, "Load frequency control of large scale power system using quasi-oppositional grey wolf optimization algorithm," Engineering Science and Technology, an International Journal, vol. 19, no. 4, pp. 1693–1713, 2016.

[60] C. Lu, S. Xiao, X. Li, and L. Gao, "An effective multi-objective discrete grey wolf optimizer for a real-world scheduling problem in welding production," Advances in Engineering Software, vol. 99, pp. 161–176, 2016.

[61] E. Emary, H. M. Zawbaa, and A. E. Hassanien, "Binary grey wolf optimization approaches for feature selection," Neurocomputing, vol. 172, pp. 371–381, 2016.

[62] W. Long, J. Jiao, X. Liang, and M. Tang, "Inspired grey wolf optimizer for solving large-scale function optimization problems," Applied Mathematical Modelling, vol. 60, pp. 112–126, 2018.

[63] S. Gupta and K. Deep, "Performance of grey wolf optimizer on large scale problems," in Proceedings of Mathematical Sciences and Its Applications, American Institute of Physics Conference Series, Uttar Pradesh, India, 2017.

[64] R. L. Haupt and S. E. Haupt, "Practical genetic algorithms," Journal of the American Statistical Association, vol. 100, no. 100, p. 253, 2005.

[65] H. R. Tizhoosh, "Opposition-based learning: a new scheme for machine intelligence," in Proceedings of the International Conference on Computational Intelligence for Modelling, Control and Automation (CIMCA) and International Conference on Intelligent Agents, Web Technologies and Internet Commerce (IAWTIC '05), pp. 695–701, Vienna, Austria, November 2005.

[66] H. Wang, Z. Wu, S. Rahnamayan, Y. Liu, and M. Ventresca, "Enhancing particle swarm optimization using generalized opposition-based learning," Information Sciences, vol. 181, no. 20, pp. 4699–4714, 2011.

[67] H. Wang, S. Rahnamayan, and Z. Wu, "Parallel differential evolution with self-adapting control parameters and generalized opposition-based learning for solving high-dimensional optimization problems," Journal of Parallel and Distributed Computing, vol. 73, no. 1, pp. 62–73, 2013.

[68] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.

[69] A. P. Engelbrecht, Fundamentals of Computational Swarm Intelligence, John Wiley & Sons, 2006.

[70] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.

[71] S. Mirjalili, A. H. Gandomi, S. Z. Mirjalili, S. Saremi, H. Faris, and S. M. Mirjalili, "Salp Swarm Algorithm: a bio-inspired optimizer for engineering design problems," Advances in Engineering Software, vol. 114, pp. 163–191, 2017.

[72] S. Mirjalili, "The ant lion optimizer," Advances in Engineering Software, vol. 83, pp. 80–98, 2015.

[73] Q. Gu, S. Jiang, M. Lian, and C. Lu, "Health and safety situation awareness model and emergency management based on multi-sensor signal fusion," IEEE Access, vol. 7, pp. 958–968, 2019.
