Takashima, E., Murata, Y., Shibata, N., Ito, M.: Self Adaptive Island GA, Proceedings of the 2003 Congress on Evolutionary Computation (CEC 2003), Vol. 2, pp. 1072-1079, DOI: 10.1109/CEC.2003.1299787. http://ito-lab.naist.jp/themes/pdffiles/031209.eiichi-t.cec2003.pdf


Self Adaptive Island GA

Eiichi Takashima, Yoshihiro Murata, Naoki Shibata and Minoru Ito

Graduate School of Information Science, Nara Institute of Science and Technology
8916-5, Takayama, Ikoma, Nara 630-0192, Japan
e-mail: {eiichi-t, yosihi-m, n-sibata, ito}@is.aist-nara.ac.jp

Abstract- Exploration efficiency of GAs largely depends on parameter values, but it is hard to manually adjust these values. To cope with this problem, several adaptive GAs which automatically adjust parameters have been proposed. However, most of the existing adaptive GAs can adapt only a few parameters at the same time. Although several adaptive GAs can adapt multiple parameters simultaneously, these algorithms require extremely large computation costs. In this paper, we propose the Self Adaptive Island GA (SAIGA), which adapts four parameter values simultaneously while finding a solution to a problem. SAIGA is a kind of island GA, and it adapts parameter values using a mechanism similar to a meta-GA. Through our evaluation experiments, we confirmed that our algorithm outperforms a simple GA using De Jong's rational parameters, and has performance close to a simple GA using manually tuned parameter values.

1 Introduction

Genetic algorithm (GA) is an approximation algorithm for combinatorial optimization problems inspired by evolution mechanisms in nature. The exploration efficiency of GAs largely depends on parameter values such as the crossover rate and the mutation rate. Adjusting these parameter values takes a great deal of time, since the optimal parameter values depend on the problem to solve; moreover, the optimal value of each parameter depends on the crossover method and the mutation method. To cope with this problem, several adaptive GAs which automatically adjust parameters have been proposed [1, 2, 3, 4]. However, most of the existing adaptive GAs can adapt only a few parameters at the same time. Although several adaptive GAs can adapt multiple parameters simultaneously, these algorithms require extremely large computation costs. We have already proposed an algorithm called A-SAGA [5], which is a combination of meta-GA [6] and a GA with the distributed environment scheme [7], where the GA with the distributed environment scheme is a kind of island GA. A-SAGA can adapt any combination of the parameters assigned to each island, and it only requires definitions of a fitness evaluation function and GA operators. A-SAGA requires training before solving a problem. In the training phase, many problems are solved; by using the result of training, A-SAGA solves similar problems efficiently. But if there is only one problem to solve, A-SAGA is not always efficient since it requires the extra cost of training.

In this paper, we propose the Self Adaptive Island GA (SAIGA), which is based on A-SAGA and works efficiently even if there is only one problem to solve. SAIGA adapts parameter values using a mechanism similar to a meta-GA, but it requires no training. Through our evaluation experiments, we confirmed that our algorithm outperforms a simple GA using De Jong's rational parameters [8], and has performance close to a simple GA using manually tuned parameter values. We describe related works in section 2, the proposed algorithm in section 3, experimental results and discussion in section 4, and conclusions in section 5.

2 Related works

F. G. Lobo et al. have proposed an adaptive GA which works effectively when the optimal number of individuals is not known [9]. In this method, parallel searches are performed using different numbers of individuals such as 16, 32, 64, and so on, expecting that one or more of them with an appropriate number of individuals will yield a good result. Since it is not realistic to perform a large number of searches fully in parallel, each GA is given a different number of evaluations per unit time, so that a GA with a small number of individuals uses a larger number of evaluations per unit time. In the case where a GA with a small number of individuals can find a near-optimal solution, not many evaluations are wasted, since GAs with a large number of individuals are only assigned a relatively small number of evaluations. In the case where only a GA with a large number of individuals can find a near-optimal solution, GAs with a small number of individuals are discontinued when a GA with a larger number of individuals finds a better solution.

Hinterding et al. have proposed an adaptive GA which runs three GAs with different numbers of individuals in parallel [10]. The process of search is divided into epochs. At each epoch, the fitness values of the elite individuals are compared, and the numbers of individuals are changed according to the result. For example, if the GA with the largest number of individuals yielded the best result, all GAs will use a larger number of individuals in the next epoch.

Back has proposed an adaptive GA [1] whose individuals have their own mutation rate encoded in their genes. Individuals with good mutation rates are expected to survive. However, since individuals with high mutation rates die with high probability, only individuals with low mutation rates tend to survive in the last phase of the search. Indeed, the literature [11] points out that this algorithm shows only low performance in the last phase of the search.

Espinoza et al. have proposed another adaptive GA [2] whose individuals can independently search for solutions using local search. In this algorithm, the search efficiency per unit number of evaluations is improved by adapting the ratio between the computation costs of crossover/mutation and local search.

An adaptive GA proposed by Krink et al. [3] determines the crossover rate and mutation rate of each individual by its location in a 2-dimensional lattice space. Individuals are expected to move towards locations with better parameters. The algorithm keeps the diversity of these parameter values by limiting the number of individuals in each cell of the lattice.

Goldberg et al. have proposed a theoretical way to compute effective value ranges of parameters when applying a GA to the onemax problem [12, 13]. The derived ranges are depicted in a graph called a control map. In the literature [14], the properties of several selection techniques have been analyzed. However, in order to derive detailed parameter values using such an analytic approach, the properties of the target problem must be formulated from scratch. This formulation may be difficult depending on the problem to solve.

Meta-GA is a general method which uses a GA to derive a good set of parameter values (called a parameter vector hereafter) used in other GAs for searching solutions. In a meta-GA, since every evaluation of a parameter vector requires running the underlying GA, the whole computation cost becomes very high. For example, an ordinary GA with 100 individuals and 100 generations requires 100 × 100 = 10,000 evaluations. To find a good parameter vector for this GA using a meta-GA with 10 individuals and 20 generations, the number of evaluations will be 10 × 20 × 100 × 100 = 2,000,000.
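The example above follows the general product rule below; the symbols are introduced here only to restate the arithmetic and are not the paper's notation.

```latex
% Evaluations needed when a meta-GA with P_meta individuals and G_meta generations
% tunes an inner GA that runs P individuals for G generations:
\[
  E_{\mathrm{total}} = P_{\mathrm{meta}} \times G_{\mathrm{meta}} \times P \times G
                     = 10 \times 20 \times 100 \times 100 = 2{,}000{,}000 .
\]
```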

Kee et al. have improved the meta-GA method [4]. In this method, a preliminary search is carried out for training before applying the algorithm to the actual problem. In the training phase, the states of the individuals are classified into several groups depending on given indices. Then, for each group, the search efficiency is investigated for several tens of parameter vectors, and the parameter vector with the highest search efficiency is determined. When searching for a solution to the actual problem, the method observes the state of the individuals and uses the parameter vector with the highest efficiency for that state.

Island GA (IGA) [15] is a kind of parallel GA. In an IGA, each GA is regarded as an island, and all islands are executed in parallel. Some individuals immigrate to other islands as the search progresses; in this way the islands share some information and search for the solution cooperatively.

Tongchim et al. have proposed an adaptive GA which adapts the mutation rate and the crossover rate [16]. This GA is based on an IGA. As the search progresses, increases in the average fitness of each island are compared to those of the neighbor islands. If the increase of a neighbor island is larger, the parameter values are changed according to those of the neighbor island.

Miki et al. have proposed a GA with the distributed environment scheme [7]. In this GA, different parameter vectors are given to the islands of an IGA, expecting good solutions to be found in islands with good parameter vectors. This is not an adaptive GA, since the parameters have to be given manually.

3 Proposed algorithm

3.1 Overview

SAIGA uses two layers of GAs. The lower layer is an island GA called the low level GA, which consists of two or more islands. The set of islands in the low level GA is denoted by I. Each island in I searches for the solution to a given problem. We place these islands in a ring topology. The upper layer is a simple GA called the high level GA, which searches for a suitable parameter vector for the low level GA.

The parameter vector corresponding to an island i in I is (n_i, s_i, c_i, m_i), where n_i, s_i, c_i and m_i denote population size, tournament size, crossover rate and mutation rate, respectively.

Our algorithm executes one generation of the high level GA after every predefined number of evaluations of the low level GA. We call this predefined number of evaluations an era. Islands independently search for a better solution using the predefined number of evaluations in each era. For example, if 100 evaluations are assigned to each island, an island with population size n_i performs about ⌊100 / n_i⌋ generations of search during that era.

After each era, fitness values are determined for all parameter vectors. In a meta-GA, parameter vectors are evaluated by the individual fitness of the elite individual. But this evaluation method does not work well with our algorithm, since when one of the immigrants has a better fitness value than the original elite individual, the fitness value of the elite individual increases without any contribution from the island's own parameters. Accordingly, the parameter fitness in our algorithm is given by the cumulative increase of the individual fitness of the elite individual, excluding increases caused by immigration.
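As an illustration of this accounting rule only (the function and its arguments are ours, not the paper's), the sketch below credits a parameter vector with the elite-fitness improvements earned during an era, while an immigrant that beats the elite merely moves the baseline and earns no credit:

```python
# Illustrative sketch of the parameter-fitness rule described above (maximization).
# Credit = cumulative elite-fitness increase during the era; an accepted immigrant
# raises the baseline for the next era but is not credited to the parameter vector.

def era_parameter_fitness(elite_trace, immigrant_fitness=None):
    """elite_trace: elite fitness after each generation of one era,
    starting from the baseline carried over from the previous era.
    Returns (credited_increase, new_baseline)."""
    credited = 0.0
    last = elite_trace[0]
    for current in elite_trace[1:]:
        credited += current - last      # improvement earned by this island
        last = current
    if immigrant_fitness is not None and immigrant_fitness > last:
        last = immigrant_fitness        # baseline moves, but no credit is given
    return credited, last

# Example: the island improves from 3.0 to 4.0 (credit 1.0); an immigrant with
# fitness 6.0 then arrives, so the next era starts from 6.0 without extra credit.
print(era_parameter_fitness([3.0, 3.5, 4.0], immigrant_fitness=6.0))  # (1.0, 6.0)
```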

Strictly speaking, if the populations of two islands are different, the best parameter vectors for these islands might also be different. But since the islands share information by immigration, we regard the best parameter vectors for these islands as the same.

Using our algorithm, we need not adjust the parameter vector for the low level GA, but we still need to adjust the parameter values for the high level GA. However, there should be a parameter vector suitable for any kind of problem, since the high level GA always solves the same kind of problem, namely finding the best parameter vector for the low level GA, and there is not much difference between these problems unless the set of parameters is changed, even if the tasks given to the low level GA are different.

3.2 Detailed explanation

3.2.1 Encoding of the parameter vector for the low level GA

n_i', s_i', c_i' and m_i' denote the genotypes for population size, tournament size, crossover rate and mutation rate, respectively (figure 1). Each genotype is represented by a 16-bit fixed point number whose range is between 0 and 1. The genotypes are converted to the corresponding phenotypes n_i, s_i, c_i and m_i using the following formulas.

The population size n_i is obtained from n_i' by an exponential (logarithmic-scale) mapping onto its allowed range, rounded down to an integer (1). The tournament size is s_i = ⌊n_i · s_i'⌋ when n_i · s_i' ≥ 2, and s_i = 2 otherwise (2). The crossover rate is taken directly, c_i = c_i' (3). The mutation rate m_i is obtained from m_i' by an exponential (logarithmic-scale) mapping onto its allowed range (4).

[Figure 1: Encoding of the parameter vector for the low level GA. The chromosome consists of the four genotypes n_i', s_i', c_i' and m_i'.]
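As an illustration only, the sketch below decodes the four genotypes in [0, 1] into phenotypes along the lines of equations (1)-(4). The bounds MAX_POP and MIN_MUTATION_RATE are assumed placeholders (the paper's exact range constants are not legible in this copy); the clipping of the tournament size at 2 follows equation (2).

```python
import math

# Assumed bounds; the paper's exact constants are not legible in this copy.
MAX_POP = 100             # assumed upper bound for population size
MIN_MUTATION_RATE = 1e-4  # assumed lower bound for mutation rate

def decode(n_g, s_g, c_g, m_g):
    """Map genotypes in [0, 1] to phenotypes (n, s, c, m).

    n and m use log-scale (exponential) mappings in the spirit of equations
    (1) and (4); s follows equation (2); c is taken directly, equation (3)."""
    n = int(1 + math.exp(n_g * math.log(MAX_POP)))                # population size
    s = max(2, int(n * s_g))                                      # tournament size
    c = c_g                                                       # crossover rate
    m = MIN_MUTATION_RATE * math.exp(m_g * math.log(1.0 / MIN_MUTATION_RATE))
    return n, s, c, m

print(decode(0.5, 0.5, 0.6, 0.5))  # e.g. (11, 5, 0.6, 0.01) under these assumptions
```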

3.2.2 Crossover and mutation methods for the high level GA

We use uniform crossover as the crossover method for the high level GA, provided that both of the target individuals have a population size of more than 2. Otherwise, n_i' and s_i' are handled as one gene and are not separated. This is because when n_i is 2 or less, s_i is 2 regardless of s_i', and thus s_i' is not evaluated properly.

The mutation operator of the high level GA is as follows. The mutation operator is applied to every gene. For example, the mutation operator applied to the population size genotype is defined as n_i' ← n_i' + N(0, 0.1), where N(0, 0.1) is a Gaussian random number with mean 0 and variance 0.1.
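A minimal sketch of these two high level operators (illustrative, not the paper's code), assuming a parameter vector is stored as the 4-tuple of genotypes (n', s', c', m') in [0, 1]; the linkage of n' and s' when a population size of 2 or less is involved follows the description above, and random.gauss(0, sqrt(0.1)) supplies the perturbation with variance 0.1.

```python
import math
import random

def high_level_crossover(a, b, n_pheno_a, n_pheno_b):
    """Uniform crossover on genotype tuples a, b = (n_g, s_g, c_g, m_g).

    n_g and s_g are treated as one gene unless both phenotype population
    sizes exceed 2 (otherwise s_g cannot be evaluated properly)."""
    a, b = list(a), list(b)
    if random.random() < 0.5:                 # swap (n', s') together
        a[0], b[0] = b[0], a[0]
        a[1], b[1] = b[1], a[1]
    if n_pheno_a > 2 and n_pheno_b > 2:       # s' may also be swapped on its own
        if random.random() < 0.5:
            a[1], b[1] = b[1], a[1]
    if random.random() < 0.5:                 # crossover rate gene
        a[2], b[2] = b[2], a[2]
    if random.random() < 0.5:                 # mutation rate gene
        a[3], b[3] = b[3], a[3]
    return tuple(a), tuple(b)

def high_level_mutation(genotypes, rate=0.6):
    """Add N(0, variance 0.1) to each gene with probability `rate`, clamped to [0, 1]."""
    sigma = math.sqrt(0.1)
    out = []
    for g in genotypes:
        if random.random() < rate:
            g = min(1.0, max(0.0, g + random.gauss(0.0, sigma)))
        out.append(g)
    return tuple(out)

a, b = high_level_crossover((0.2, 0.5, 0.6, 0.3), (0.8, 0.4, 0.7, 0.1), 30, 50)
print(high_level_mutation(a))
```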

3.2.3 Pseudocode

We use the following constants in our algorithm.

max era count: 2500
evaluation count per era: 256
high level crossover rate: 0.8
high level mutation rate: 0.6
|I| (the number of islands): 10

We use the following notation in our algorithm. P_H is the set of individuals in the high level GA. p_H[i] (∈ P_H) is the high level individual for island i. Each island i in I has up to max_population_size individuals. p_L[i][1], ..., p_L[i][max_population_size] are global variables, where p_L[i][k] is the k-th individual in island i. P_L,i is the set of individuals in island i. B_i is a buffer for receiving immigrants to island i; B_i is a global variable. f_i retains the elite fitness value of island i. ec retains the era count. g_i retains the fitness value of the high level individual corresponding to island i. evc retains the number of evaluations performed in the island subroutine. f_inc retains the increase of the elite fitness. f_current retains the elite fitness of the newly generated generation. x_elite is the elite individual.

The pseudocode of the proposed algorithm is as follows.

Algorithm SAIGA
1  begin
2    for each i ∈ I do
3      randomly generate p_H[i];
4      randomly generate p_L[i][1], ..., p_L[i][max_population_size];
5      B_i := ∅;
6      f_i := 0;
7    next
8    for ec := 0 to max era count do
9      // Each individual of the high level GA corresponds to an island.
10     for each i ∈ I do
11       (P_L,i, g_i, f_i) := Execute_island(P_L,i, p_H[i], f_i, i);  /* Execute one era of island i. */
12     next
13     Perform roulette selection of the individuals of the high level GA using the fitness values g_i. The elite individual is conserved.
14     P_H := HighLevelCrossover(P_H);
15     P_H := HighLevelMutation(P_H);
16   next
17 end

Algorithm Execute_island(P_L,i, p_H[i], f_last, i)
1  begin
2    evc := 0;
3    f_inc := 0;
4    Let (n, s, c, m) be the phenotypes of p_H[i].
5    while evc < evaluation count per era do
6      Perform crossover on P_L,i with probability c per individual. The elite individual is preserved.
7      Perform mutation on P_L,i with probability m per individual. The elite individual is preserved.
8      Substitute a sufficiently small number for f_current.
9      for each newly generated individual x ∈ P_L,i do
10       Calculate the fitness value of x and substitute this value for f_x;
11       evc := evc + 1;
12       if f_current < f_x then
13         f_current := f_x;
14         x_elite := x;
15       endif
16     next
17     Perform tournament selection with size s on P_L,i based on the fitness values f_x, and let {p_L[i][1], ..., p_L[i][n]} be the result of the selection. The elite individual is preserved.
18     f_inc := f_inc + f_current - f_last;
19     f_last := f_current;
20   endwhile
     /* Process emigration */
21   j := emigration_destination(i);  /* Function emigration_destination(i) returns the emigration destination from island i. Islands are placed in a ring topology. */
22   B_j := B_j ∪ {x_elite};
     /* Receive immigrants */
23   if B_i ≠ ∅ then
24     Let y be the individual with the highest fitness in B_i.
25     if the fitness value of y > f_last then
26       f_last := the fitness value of y;
27       Choose an individual among p_L[i][1], ..., p_L[i][n], except the elite individual, and replace it with y.
28     endif
29   endif
30   return (P_L,i, f_inc, f_last);
31 end

Algorithm HighLevelCrossover(P_H)
1  begin
2    for i := 1 to 0.5 · |I| · high level crossover rate do
3      Generate a uniform random number r, where 0 ≤ r ≤ 1.
4      if r < 0.5 then
5        swap(n'_{2i}, n'_{2i+1});
6        swap(s'_{2i}, s'_{2i+1});
7      endif
8      if n_{2i} > 2 and n_{2i+1} > 2 then
9        Generate a uniform random number r, where 0 ≤ r ≤ 1.
10       if r < 0.5 then swap(s'_{2i}, s'_{2i+1}); endif
11     endif
12     Generate a uniform random number r, where 0 ≤ r ≤ 1.
13     if r < 0.5 then swap(c'_{2i}, c'_{2i+1}); endif
14     Generate a uniform random number r, where 0 ≤ r ≤ 1.
15     if r < 0.5 then swap(m'_{2i}, m'_{2i+1}); endif
       Let p_H[2i] and p_H[2i+1] be chromosomes with the genotypes (n'_{2i}, s'_{2i}, c'_{2i}, m'_{2i}) and (n'_{2i+1}, s'_{2i+1}, c'_{2i+1}, m'_{2i+1}).
16   next
17 end

Algorithm HighLevelMutation(P_H)
1  begin
2    for each p_H[i] ∈ P_H do
3      Let (n', s', c', m') be the genotypes of p_H[i].
4      Generate a uniform random number r, where 0 ≤ r ≤ 1.
5      if r < high level mutation rate then
6        n' := n' + N(0, 0.1);  /* Function N(0, 0.1) returns a Gaussian random number with mean 0 and variance 0.1 */
7        if n' < 0 then n' := 0; endif
8        if n' > 1 then n' := 1; endif
9      endif
10     Generate a uniform random number r, where 0 ≤ r ≤ 1.
11     if r < high level mutation rate then
12       s' := s' + N(0, 0.1);
13       if s' < 0 then s' := 0; endif
14       if s' > 1 then s' := 1; endif
15       s := ⌊n · s'⌋;
16       if s < 2 then s := 2; endif
17     endif
18     Generate a uniform random number r, where 0 ≤ r ≤ 1.
19     if r < high level mutation rate then
20       c' := c' + N(0, 0.1);
21       if c' < 0 then c' := 0; endif
22       if c' > 1 then c' := 1; endif
23     endif
24     Generate a uniform random number r, where 0 ≤ r ≤ 1.
25     if r < high level mutation rate then
26       m' := m' + N(0, 0.1);
27       if m' < 0 then m' := 0; endif
28       if m' > 1 then m' := 1; endif
29     endif
30     Let p_H[i] be a chromosome with the genotypes (n', s', c', m').
31   next
32 end

4 Evaluation

4.1 Overview

For evaluation, we apply our algorithm to the Traveling Salesman Problem (TSP), a deceptive problem, and the minimization problems of the Rastrigin function and the Griewank function. We compare our algorithm to a simple GA using De Jong's rational parameters [8], where the mutation rate is 0.001, the population size is 50, the crossover rate is 0.6 and the tournament size is 2. We also compare our algorithm to a simple GA using manually tuned parameter values.

4.2 Problems

4.2.1 Minimization problems of the Rastrigin and Griewank functions

We use the Rastrigin function (equation 5) and the Griewank function (equation 6) with 30 variables, where each variable is encoded in Gray code. We use one point crossover as the low level crossover operator and bit reversal as the low level mutation operator.


[Figure 2: Search efficiency on the minimization problem of the Rastrigin function. Fitness versus the number of evaluations (×10^6); curves: SAIGA, rational, manually tuned.]

f(\mathbf{x}) = 10 n + \sum_{i=1}^{n} \left( x_i^2 - 10 \cos(2 \pi x_i) \right), \qquad -5.12 \le x_i < 5.12   (5)

f(\mathbf{x}) = 1 + \sum_{i=1}^{n} \frac{x_i^2}{4000} - \prod_{i=1}^{n} \cos\!\left( \frac{x_i}{\sqrt{i}} \right), \qquad -512 \le x_i < 512   (6)
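As an illustration only (not code from the paper), the following sketch evaluates the two benchmark functions from a Gray-coded chromosome. The choice of 16 bits per variable (BITS) is an assumption, since the paper does not state the precision; the value ranges follow equations (5) and (6).

```python
import math

BITS = 16  # assumed bits per variable; the paper does not state the precision

def gray_to_int(bits):
    """Convert a Gray-coded bit list (MSB first) to an integer."""
    value = bits[0]
    out = value
    for b in bits[1:]:
        value ^= b                  # binary bit = previous binary bit XOR Gray bit
        out = (out << 1) | value
    return out

def decode_variables(chromosome, low, high, n_vars=30):
    """Split a Gray-coded chromosome into n_vars real values in [low, high)."""
    span = high - low
    return [low + span * gray_to_int(chromosome[i * BITS:(i + 1) * BITS]) / (2 ** BITS)
            for i in range(n_vars)]

def rastrigin(x):   # equation (5), -5.12 <= x_i < 5.12
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def griewank(x):    # equation (6), -512 <= x_i < 512
    s = sum(xi * xi for xi in x) / 4000.0
    p = math.prod(math.cos(xi / math.sqrt(i + 1)) for i, xi in enumerate(x))
    return 1 + s - p

zeros = [0] * (30 * BITS)
print(rastrigin(decode_variables(zeros, -5.12, 5.12)),
      griewank(decode_variables(zeros, -512.0, 512.0)))
```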

4.2.2 Deceptive problem

We use a problem which is a concatenation of 4 deceptive functions, each over an 8-bit variable, shown as expression 7, where u(x) is the number of 1's in x expressed in binary form. Each variable is encoded in binary.

f(u(x)) = \begin{cases} 8 & \text{if } u(x) = 8 \\ 7 - u(x) & \text{if } u(x) < 8 \end{cases}   (7)

We use one point crossover as the low level crossover operator and bit reversal as the low level mutation operator.

4.2.3 Traveling salesperson problem

We apply our algorithm to the lin105 problem [17], which has 105 cities. The optimal solution to this problem is 14379. We use the 2-opt method as the mutation operator; whether the mutation operator is applied or not is determined for each city. We use EXX [18] as the crossover operator.
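As an illustration of the per-city 2-opt mutation described above (the EXX crossover [18] is not sketched), the following applies, independently for each city position with probability m, a segment reversal between that position and a second, randomly chosen position. This is an assumed variant; the paper only states that application of the operator is decided per city.

```python
import random

def two_opt_mutation(tour, m):
    """Per-city 2-opt style mutation: for each position, with probability m,
    reverse the tour segment between it and a randomly chosen second position.
    Illustrative variant; the paper's exact 2-opt application may differ."""
    tour = tour[:]                      # work on a copy
    n = len(tour)
    for i in range(n):
        if random.random() < m:
            j = random.randrange(n)
            lo, hi = min(i, j), max(i, j)
            tour[lo:hi + 1] = reversed(tour[lo:hi + 1])
    return tour

print(two_opt_mutation(list(range(10)), m=0.02))
```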

4.3 Search efficiency

The results for search efficiency on the minimization problem of the Rastrigin function, the minimization problem of the Griewank function, the deceptive problem, and TSP are shown in figures 2, 3, 4 and 5, respectively.

[Figure 3: Search efficiency on the minimization problem of the Griewank function. Fitness versus the number of evaluations (×10^6); curves: SAIGA, rational, manually tuned.]

[Figure 4: Search efficiency on the deceptive problem. Fitness versus the number of evaluations (×10^6); curves: SAIGA, rational, manually tuned.]

The vertical axis represents the fitness value of the elite individual, and the horizontal axis represents the number of evaluations. Each of the results is the average of 300 trials.

At the beginning of the search, our algorithm tends to be outperformed by the simple GA using the rational parameters, but eventually our algorithm outperforms that simple GA. This is because our algorithm uses randomly initialized parameter values at first, so its search efficiency is low; the search efficiency becomes high as the parameter vector converges towards the optimal values. Our algorithm also has performance close to that of the simple GA using manually tuned parameter values; this holds on TSP and on the minimization problems of the Rastrigin and Griewank functions.

On the deceptive problem, both of the simple GAs get stuck after finding a local optimum whose fitness value is 4. We can see that the average of the fitness values after convergence is a little less than 4. This is because the problem is a concatenation of 4 independent deceptive problems, and optimal solutions are only rarely found for some of these problems. Our algorithm escapes the local optimum, since it has the characteristics of an IGA.


[Figure 5: Search efficiency on TSP. Fitness versus the number of evaluations (×10^6); curves: SAIGA, rational, manually tuned.]

[Figure 6: Transition of population size n on the Rastrigin function. Population size versus fitness; average, standard deviation and median over 300 trials.]

As shown in figure 2, the performance of our algorithm when applied to the minimization problem of the Rastrigin function is similar to that of the simple GA using the rational parameters.

4.4 Transition of adapted parameter values

The processes of parameter adaptation for each problem are contrasted as follows. For the Rastrigin function, population size, mutation rate, crossover rate and tournament size are shown in figures 6, 8, 10 and 12, respectively. For TSP, population size, mutation rate, crossover rate and tournament size are shown in figures 7, 9, 11 and 13, respectively. The vertical axis represents each parameter value, and the horizontal axis represents the fitness value. In these figures, the average, median and standard deviation of the elite parameter value over 300 trials are shown.

We can see that the population size converges to different values for each problem. Thus, we consider that SAIGA adapts the population size suitably to each problem. This seems to be because the degree of diversity required differs between the problems. As the diversity in the population becomes small, the population is more likely to get stuck in a local optimum, although the efficiency of local search becomes high.

[Figure 7: Transition of population size n on TSP. Population size versus fitness; average, standard deviation and median over 300 trials.]

[Figure 8: Transition of crossover rate c on the Rastrigin function. Crossover rate versus fitness; average, standard deviation and median over 300 trials.]

On TSP, the average population size momentarily decreases to 15, but the average elite population size when the fitness value is around 20000 is larger than 30; this seems to be because our algorithm is trying to increase diversity.

On the minimization problem of the Rastrigin function, the population size decreases in the early stages of the search. This seems to be because local search is effective in the early stages of the search. This is also seen in the transition of the parameters on the Griewank function.

On the Rastrigin function, the mutation rate decreases as the search proceeds. This seems to be because a better solution is found as the search proceeds, and local search becomes effective because of the property of the problem. In contrast, on TSP, the mutation rate does not change from 0.02. With this rate, two 2-opt operations are expected to be applied to one individual in one low level generation.

On TSP, the crossover rate does not change at the beginning of the search, but it gradually increases as the search proceeds. This seems to be because other parameter values have larger influences on the search efficiency at the beginning, while the influence of the crossover rate becomes high at a later stage. We consider that SAIGA successfully adapts the crossover rate.


[Figure 9: Transition of crossover rate c on TSP. Crossover rate versus fitness; average, standard deviation and median over 300 trials.]

[Figure 10: Transition of mutation rate m on the Rastrigin function. Mutation rate (log scale) versus fitness; average, standard deviation and median over 300 trials.]

On the Rastrigin function, the crossover rate is always around 0.6. Nevertheless, we consider that SAIGA successfully adapts the crossover rate, since if the adaptation failed and the crossover rate changed randomly, the average would be 0.5.

On TSP and the Rastrigin function, the tournament size ratio s_i' does not change from around 0.5. This value is close to the average value obtained when s_i' is decided randomly, which might suggest that our algorithm fails to adapt the tournament size. However, we observed the transition of s_i' when s_i' is initialized to 1.0e-5, and s_i' converged to 0.5 after the fitness value became 21500 or lower. Thus, we consider that SAIGA successfully adapts the tournament size.

In figure 14, the distributions of the adapted values of s_i' are given for fitness values around 40000 and around 19500. We can see that when the fitness value is around 40000, the value of s_i' is mostly 0.7 to 0.8, and when the fitness value is around 19500, the value of s_i' is mostly 0.2 to 0.3. This seems to be because search efficiency becomes high when the selection pressure is high at the beginning of the search; but diversity of the population is required after the fitness value becomes less than 25000, and thus the tournament size decreases.

[Figure 11: Transition of mutation rate m on TSP. Mutation rate (log scale) versus fitness; average, standard deviation and median over 300 trials.]

[Figure 12: Transition of the rate of tournament size s' on the Rastrigin function. Rate versus fitness; average, standard deviation and median over 300 trials, with a reference line at 0.5.]

[Figure 13: Transition of the rate of tournament size s' on TSP. Rate versus fitness; average, standard deviation and median over 300 trials, with a reference line at 0.5.]

[Figure 14: Distribution of the adapted values of the rate of tournament size s' on TSP, for fitness values around 40000 and around 19500. Frequency versus rate.]

5 Conclusions

We proposed a self adaptive island GA which adapts 4 parameter values simultaneously by giving different parameter values to each island and observing search efficiencies. Through our evaluation experiments, we confirmed that our algorithm outperforms a simple GA using De Jong's rational parameters, and has performance close to a simple GA using manually tuned parameter values. We also confirmed that SAIGA successfully adapts each parameter value.

In future work, we would like to improve the handling of the parameter values for the low level GA when the low level GA is stuck in a local optimum.

Bibliography

[1] Back, T. Self-adaptation in genetic algorithms. In F. J. Varela, P. B., editor, Proceedings of the 1st European Conference on Artificial Life, pp. 263-271, (1992).

[2] Espinoza, F., Minsker, B. S. and Goldberg, D. A Self-Adaptive Hybrid Genetic Algorithm. Proceedings of the Genetic and Evolutionary Computation Conference, San Francisco, Morgan Kaufmann Publishers, p. 759, (2001).




[3] Krink, T. and Ursem, R. K. Parameter Control Using the Agent Based Patchwork Model. Proceedings of the Congress on Evolutionary Computation, pp. 77-83, (2000).

[4] Kee, E., Airey, S. and Cye, W. An Adaptive Genetic Algorithm. Proceedings of the Genetic and Evolutionary Computation Conference, pp. 391-397, (2001).

[5] Murata, Y., Shibata, N., Yasumoto, K. and Ito, M. Agent Oriented Self Adaptive Genetic Algorithm. Communications and Computer Networks (CCN), November 4-6, 2002, Cambridge, USA, pp. 348-353, (2002).

[6] Weinberg, R. Computer simulation of a living cell. Dissertations Abstracts International, 31(9), 5312B, (1970).

[7] Miki, M., Hiroyasu, K., Kaneko, M. and Hatanaka, I. A Parallel Genetic Algorithm with Distributed Environment Scheme. IEEE Proceedings of Systems, Man and Cybernetics Conference SMC'99, pp. 695-700, (1999).

[8] De Jong, K. A. An analysis of the behavior of a class of genetic adaptive systems. Ph.D. thesis, University of Michigan, Ann Arbor, (1975).

[9] F. G. Lobo. The Parameter-Less Genetic Algorithm: Rational and Automated Parameter Selection for Simplified Genetic Algorithm Operation. PhD thesis, University of Lisbon, Portugal, (2000).

[10] Hinterding, R., Michalewicz, Z. and T. C. Peachey. Self-adaptive genetic algorithm for numeric functions. Proceedings of the 4th Conference on Parallel Problem Solving from Nature, pp. 420-429, (1996).

[11] Glicman, M. R. and Sycara, K. Reasons for Premature Convergence of Self-Adapting Mutation Rates. Proceedings of the Congress on Evolutionary Computation, pp. 62-69, (2000).

[12] Goldberg, D. Sizing populations for serial and parallel genetic algorithms. In J. David Schaffer, editor, Proceedings of the Third International Conference on Genetic Algorithms, Morgan Kaufmann Publishers, pp. 70-79, (1989).

[13] Goldberg, D., Deb, K. and Dirk, T. Toward a better understanding of mixing in genetic algorithms. Journal of the Society of Instrument and Control Engineers, 32(1): pp. 10-16, (1993).

[14] Goldberg, D. and Deb, K. A comparative analysis of selection schemes used in genetic algorithms. In Gregory J. E. Rawlins, editor, Foundations of Genetic Algorithms, Morgan Kaufmann Publishers, pp. 69-93, (1991).

[15] Tanese, R. Distributed Genetic Algorithms. In Proceedings of the Third International Conference on Genetic Algorithms, pp. 434-439, (1989).

[16] Tongchim, S. and Chongstitvatana, P. Parallel genetic algorithm with parameter adaptation. Information Processing Letters, Volume 82, Issue 1, pp. 47-54, (2002).

[17] http://www.iwr.uni-heidelberg.de/groups/comopt/software/TSPLIB95/

[18] Maekawa, K., Mori, N., Tamaki, H., Kita, H. and Nishikawa, H. Genetic Solution for the Traveling Salesman Problem by Means of a Thermodynamical Selection Rule. Proceedings 1996 IEEE International Conference on Evolutionary Computation, pp. 529-534, (1996).