

Animating The Inanimate

Genetic Algorithms In Complex Problem Space

Thorolf Horn Tonjum. School of Computing and Technology,

University of Sunderland, The Informatics Centre.


Introduction.

This paper describes a learning scenario in which a creature with three legs learns to walk.

The aim of this project is to see the genetic algorithm converge in a problem domain of high complexity.

To produce a learning scenario of suitable complexity, we use a physics engine to simulate a 3D world, in which we place a learning agent with three legs of two joints each. Movement becomes a functional of the functions of each limb, with respect to speed, gravity, weight, position, and rotation.

The Tripod is a learning agent; it is an autonomous entity which generates its actions from definitions expressed in its own gene structure. The Tripod learns from experience. Learning is the ability to perform better by using experience to alter one's behaviour. The experience is produced by the agent experimenting with random movements while being graded by a fitness function (tutor).

The fitness function grades the Tripod based on the distance the Tripod is able to travel from the initial starting coordinate, within a period of 30 seconds. The learning data is the patterns of movements generated by the Tripod, together with the result of the fitness function. The Tripod learns from its experience by using a genetic algorithm. The genetic algorithm is a machine learning algorithm from the class of random heuristic search based algorithms.
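As an illustration (not the author's Steve code), a minimal Python sketch of such a distance-based fitness evaluation could look as follows; the simulate() helper is hypothetical and stands in for one 30-second run of the physics engine:

import math

def fitness(genome, simulate, duration=30.0):
    # Hypothetical helper: simulate() runs the physics world for `duration`
    # simulated seconds with the genome controlling the Tripod and returns
    # the Tripod's final (x, y, z) position.
    x, y, z = simulate(genome, duration)
    # The grade is the straight-line distance from the starting coordinate
    # (taken here to be the origin).
    return math.sqrt(x * x + y * y + z * z)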

The genetic algorithm.


Genetic algorithms are widely used in machine learning; they are applicable in situations where specialised learning algorithms are not available, typically where the underlying function is not known. As this is the case in our learning problem, the genetic algorithm is suitable. Suitable but not perfect, as genetic algorithms are machine-time intensive and do not guarantee an optimal, or even a Pareto-optimal, solution as long as the search space is not completely searchable. (A Pareto-optimal solution is a solution in the complete hypothesis space where no sub-part of the solution is inferior to any sub-part of any other solution.)

Random heuristic search algorithms, like evolutionary algorithms, simulated annealing, and tabu search, are problem-independent optimization algorithms. They are robust, meaning they show good, but not optimal, behaviour on many problems.

One of the major drawbacks of using a simple genetic algorithm (SGA) is the lack of underlying mathematical theory.

“Analysis reveals that an exact mathematical analysis of SGA is possible only for small problems. For a binary problem of size n the exact analysis needs the computation of 2^n equations.” [Heinz Muhlenbein, 1993. "Mathematical Analysis of Evolutionary Algorithms for Optimization"]

One of the reasons for the inherent difficulty of mathematical analysis of genetic algorithms is that a coherent mathematical model of the genetic algorithm would have to encompass the varying effects of the different, and sometimes dynamic, mixes of the probabilistic genetic operators: mutation, recombination, crossover, and the selection scheme.

The consequence of this is that we are currently incapable of making mathematically sound a priori estimations; thus we cannot answer questions like:

How likely is it that the n'th iteration of the algorithm will find the optimum?
What is the approximate time until a global optimum is found?
How many generations must we wait before finding a solution that fits within a certain performance confidence interval?
How much variance is there in the measures from run to run?
How much are the results affected by changes in the genetic parameters, like population size, mutation rates, selection schemes, and crossover rates and types?

Defining convergence for genetic algorithms is not straightforward. Because genetic algorithms are parallel stochastic search procedures, the results will be stochastic and unpredictable.

A suitable definition of convergence for genetic algorithms is: “A SGA will normally produce populations with consecutively higher fitness until the fitness increase stops at an optimum. Whether this is a local or global optimum cannot be determined, unless one knows the underlying functions of the problem space.”

“Normally” signifies an algorithm with properly tuned genetic parameters.

How, then, can one properly tune the genetic parameters? The answer is through trial and error: by tuning the parameters and observing the convergence rates and the variance in the results, one can develop a hands-on understanding of the problem and thereby find good parameters. Finding good parameters involves producing a good mixture between the exploratory pressures of mutation and crossover and the exploitative pressure of selection. Too much exploitation leads to premature convergence (crowding). Too little exploitation reduces the algorithm to pure random search.

Due to the lack of underlying mathematical theory, no theory has yet been developed to compute the Vapnik-Chervonenkis dimension or other performance bounds. Consequently, convergence rates cannot be estimated, and confidence intervals for the probability of reaching Pareto-optimal results within a certain number of iterations cannot be established. Likewise, theory for defining halting criteria, for knowing when continuing the search is futile, is also lacking. As a consequence, there exists no algorithm to determine whether the search converges to a local or a global optimum.

Nevertheless, genetic algorithms have a remarkable ability to escape local optima and converge unhindered towards the global optimum [Ralf Salomon 1995. “Evaluating Genetic Algorithm performance” Elsevier Science].

Specialised genetic algorithm implementations, as opposed to the simple genetic algorithm (SGA), solve one or more of the theoretical shortcomings discussed above, but the specialisations restrict or modify the algorithm, or impose certain assumptions on the fitness landscape, the search space, or the fitness function, in ways not compatible with our model.

Three major approaches to standard performance measures for genetic algorithms have been undertaken: linkage, evolutionary multiobjective optimization, and genetic algorithms for function optimization. But none of these are compatible with our model.


Linkage. Holland (Holland, 1975) suggested that operators learning linkage information to recombine alleles might be necessary for genetic algorithm success. Linkage is the concept of chromosome sub-parts that have to stick together to secure the transfer of high fitness levels to the offspring. If linkage exists between two genes, recombination might result in low fitness if those two genes are not transferred together. A group of highly linked genes forms a linkage group, or a building block. An analytical model of time to convergence can then be derived from the linkage model.

Mapping out the linkages in our model is not trivial, as our model has nondeterministic characteristics. It would be possible to make a nondeterministic, probabilistic mapping of the linkage groups, but this could be detrimental to the mathematical model of time to convergence. It is not clear how this model should be modified to encompass nondeterministic linkage groups.

[Goldberg, Yassine, & Chen, 2003. “DSM clustering in DSMDGA” ] & [Goldberg & Miller, 1997. “The genetic ordering in LLGA” ]

Evolutionary Multiobjective Optimization. Evolutionary multiobjective optimization computes convergence bounds by assuming that a Pareto-optimal set can be defined. No Pareto-optimal set can be defined in our model, as the underlying functions are neither measurable nor approximable to a degree of accuracy where Pareto-optimal sets are definable.

Genetic Algorithms for Function Optimization. The theoretical understanding of the properties of genetic algorithms used for function optimization depends on transient Markov chain analysis and relies on the presence of a fitness function. In our model the fitness function is not mathematically definable in terms of the control input (limb movements), as the fitness function is merely a sample of the travelling distance.

Inductive bias. The inductive bias of the genetic algorithm is a result of the implementation in question; its determiners are: the fitness function, the representation of the problem as a search space, the probabilistic genetic operators (selection, crossover, and mutation), and the size of the population.

System description. The Tripod is a construct with a body and three legs; each leg has two joints. The Tripod lives in an OpenGL-based 3D world simulation, modelled with correct physics, including gravity, friction, time, force, and velocity. The 3D world simulator (Breve) and the programming language (Steve) are made by Jon Klein.

The problem domain.

* The definition of the problem space refers to the Tripod_B edition. See user guide.

The Tripod agent produces many different movement patterns; this is how the agent learns, by trial and error. Each movement pattern can be seen as one hypothesis out of the vast hypothesis space of all possible movement patterns. This space is the search space for the discovery of optimal movement patterns.

The hypothesis space is: 630^6 * 500^6 + 200^6 = 9.769 * 10^32. The Tripod has three legs with 2 joints each, and each joint is operated by 3 variables. The 3 variables are the joint angle [-/+ 360], the local velocity of joint movement [-/+ 200], and the global velocity of joint movement [-/+ 250]. These are represented as floats with 6 digits of precision in the program, 3.149879 being an example of the angle variable, but only the first two digits have a certain effect.
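The quoted size can be checked directly; the short Python snippet below only reproduces the arithmetic stated above:

# 3 legs x 2 joints = 6 joints; the three per-joint factors are taken
# from the text above.
size_6_joints = 630**6 * 500**6 + 200**6
print("%.3e" % size_6_joints)   # ~9.769e+32, matching the figure quoted above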

The problem space is dependent on a physics simulation which employs a random number generator and produces rounding errors. The simulation behaves in stochastic ways that cannot be predicted; a specific action can cause more than one result, thus our problem domain has nondeterministic characteristics.

The effect of a movement is dependent on the previous state space; therefore it can be seen as a transitional Markov chain.

Some movements are self-conflicting; thus the problem space is also disjunct.

The underlying functions are not known, ergo we deal with a black box system.

Many important parameters are not supplied to the algorithm, such as the angle to the ground and the speed and direction of the Tripod itself.

The problem space can be classified as a multiobjective optimization problem, where several interdependent functions affect the result.

Given that this is a complex problem domain (defined below), the genetic algorithm seems to be a valid choice, as it is classified as a general, robust learning algorithm (meaning it will work on nearly all learning problems). The "No Free Lunch" theorem [Wolpert & Macready. 1995] clearly states that a problem-domain-specific algorithm would be preferable, but in our instance no such algorithm is available, as the underlying functions are not known, the problem domain is nondeterministic, and this problem domain combines time dependency (transitional Markov chains) with multiobjective optimization complexity.

This problem domain can be classified as an NP-hard search problem. Reason: if we continued to add joints to the legs, the algorithm execution time would increase by more than O(n^k), where k is a constant and n is the complexity. Example: if n were the number of joints, k would not be constant. If you added a joint to each leg (n = n + 3), the search space would increase by a factor of roughly 10^17:

From 630^6 * 500^6 + 200^6 = 9.769 * 10^32 to 630^9 * 500^9 + 200^9 = 3.053 * 10^49

In other words, the growth rate of the search space is a function of the size of the input instance; in our example this function does not grow linearly, but exponentially, with the input instance.
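Again as a plain arithmetic check of the claim, the growth factor from 6 to 9 joints can be computed directly in Python:

six_joints  = 630**6 * 500**6 + 200**6   # ~9.769e+32
nine_joints = 630**9 * 500**9 + 200**9   # ~3.053e+49
print("%.3e" % (nine_joints / six_joints))   # ~3.1e+16, the order of magnitude cited above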

Complex. A conceptual whole made up of sub-parts that relate in intricate ways and are not readily analyzable.

NP-hard. Definition: the complexity class of decision problems that are intrinsically harder than those that can be solved by a nondeterministic Turing machine in polynomial time. A combinatorial optimization problem is NP-hard when a decision version of it is proved to belong to the class of NP-complete problems, which includes well-known problems such as the travelling salesman problem, the bin packing problem, etc.

Polynomial time. Definition: when the execution time of a computation, m(n), is no more than a polynomial function of the problem size, n. More formally, m(n) = O(n^k) where k is a constant.

Nondeterministic Turing machine. Definition: a Turing machine which has more than one next state for some combinations of contents of the current cell and current state.

GP-hard. Our problem domain can be defined as GP-hard, because it can be seen as a moving needle-in-the-haystack problem: the optimal movement is a result of the previous domain state combined with stochasticity, thus it has nondeterministic properties and differs as a function of time / domain state. Definition of GP-hard: a problem domain where no choice of genetic algorithm properties, representations, and operators exists that makes such a problem easy for a genetic algorithm to solve.

Implementation.

The learning algorithm. The genetic algorithm uses trial and error in intelligent ways, forming emergent patterns in global hyperspace by actions in local hyperspace. Hyperspace is the space of all possible permutations of the control variables (the variables that compose the hypothesis); in our example we have 13 elements, and our hyperspace therefore has 13 dimensions.

The Fitness function. The fitness function evaluates the result of 3 legs moving concurrently; each leg movement affects the effect of the other legs' movements, thus the moving Tripod is a system of nonlinear effects amid the legs. This is iterated through time, therefore our system can be described as a nonlinear dynamical system.

Because the physics engine that performs the simulation works in stochastic ways, the exact same movement of the legs from the exact same velocity and direction state is not guaranteed to produce the exact same effect. One reason for the stochasticity is rounding errors, another is the timing functions, and a third is the random number generator.

As there exists no determinable underlying function or functional, and the system is obviously too complex to approximate, the best way to produce a fitness function is to base it on some external factor with some relevance to the success or failure of walking; measuring the distance travelled is therefore an apt choice.

General Guideline:


The anticipated structure of a solution to a given problem is broken down into as many small parts as possible; the algorithm then searches for the optimal solution as the optimal combination of the different solution sub-parts. This search space can be vast, resulting in very long search times. The genetic algorithm is capable of performing these vast global searches more efficiently than random search.

How it works. Gradually some individuals pick up useful traits; these traits are picked up from all over the global search space. Whenever useful traits are discovered, the local search area they arose from is investigated further by keeping the gene string that contains the coordinates of this hyperlocation alive in future generations. The best traits are then recombined when the parents reproduce and make children with possibly even more successful hyperlocations; the population then slowly converges towards an optimal hyperlocation. The optimal hyperlocation is the gene string that contains the best candidate solution.

Exemplification.

GA = (F, P, R, M)

GA = The genetic algorithm.
F = Fitness function.
P = Population size.
H = Hypothesis, represented by a vector of floats of length N.
R = Replacement rate (in percent).
M = Mutation rate (in percent).
C = Crossover probability.
N = Chromosome length (a vector of N floats).

1. Initialise the population with P random vectors of length N.
2. Evaluate each H in P by F.
3. Delete the worst 30% of H in P.
4. Replace the worst 20% by new random chromosomes.
5. Reproduce the best 20% of H in P by crossover.
6. Let the rest of H in P reproduce by crossover at probability C.
7. Mutate the rest of H in P by probability C.
8. Tweak the upper 50-80% of H in P.

Functions:
Random vectors are produced by filling the 13 floats with random numbers using a random number generator.
Crossover is produced by randomly picking 6 of the 13 genes from parent 1 and substituting them with the genes from parent 2.
Mutation is produced by randomly choosing 1 gene and replacing it with a random number.
Selection is done by randomly picking 3 genomes and using 2 of them to reproduce, replacing the third.
Tweaking is done by randomly picking 3 of the 13 genes and changing them by some small random amount; this technique resembles what is known as “shaking the weights” in neural networks, and is meant as an aid to get candidates unstuck from local optima.

Originality. The reason for this somewhat original approach is that the simulation only allows testing one candidate at a time, and each test takes 30 simulated seconds, which is approximately 5 real seconds. This means that we are bound by processing time to restrict the population of candidates to an arbitrarily low number.

In the simulations we use a population size of 10. This creates a bias towards too small genetic diversity and pushes the population towards premature convergence in the direction of suboptimal local optima. To counter this effect we insert a high percentage of new random genes in each generation, and we also set the mutation rate unusually high. The crossover rate is lower than normal to ensure breeding space for evolving individuals. Tweaking is introduced as a means to counter stagnation.

This approach creates very interesting dynamics in our gene pool:

The best genome (nr 0) is ensured crossover with the second best (nr 1), replacing number 7. Genome x (randomly picked from the gene pool) is reproduced with a new random genome, replacing nr 8. Then genome nr 8, which contains 50% random genes, is reproduced with a random genome, replacing nr 9 with a genome containing 75% random genes.

This creates a system where genomes 7-9 are systematically more random.

Genomes 5 and 6 are targeted for mutation. Genomes 3 and 4 are targeted for tweaking. Genomes 0, 1, and 2 are kept unchanged.
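To make the operators and this ladder update concrete, here is a minimal Python sketch (not the Steve implementation from Appendix D); it follows the prose description above, and the gene ranges and the tweak scale are placeholder assumptions:

import random

N = 13  # chromosome length: 13 floats, as in the paper

def random_vector():
    # Random chromosome; a single placeholder range is used here, whereas
    # the Steve code uses different ranges for different genes.
    return [random.uniform(-3.15, 3.15) for _ in range(N)]

def crossover(parent1, parent2):
    # Child starts as a copy of parent 1; 6 randomly picked genes are
    # substituted with the corresponding genes from parent 2.
    child = list(parent1)
    for i in random.sample(range(N), 6):
        child[i] = parent2[i]
    return child

def mutate(genome):
    # Replace one randomly chosen gene with a new random value.
    genome[random.randrange(N)] = random.uniform(-3.15, 3.15)

def tweak(genome, scale=0.1):
    # Nudge 3 randomly chosen genes by a small random amount
    # ("shaking the weights"); scale is an assumed parameter.
    for i in random.sample(range(N), 3):
        genome[i] += random.uniform(-scale, scale)

def next_generation(population, fitness):
    # population: 10 genomes; fitness: genome -> float (in the real system
    # one call costs a 30-second simulation).
    population.sort(key=fitness, reverse=True)   # index 0 = fittest

    # Entropy ladder update, as described in the text:
    # the two best genomes breed into position 7,
    population[7] = crossover(population[0], population[1])
    # a randomly picked genome breeds with a fresh random genome into
    # position 8 (child is about 50% random genes),
    population[8] = crossover(population[random.randrange(10)], random_vector())
    # and position 8 breeds with another random genome into position 9
    # (child is about 75% random genes).
    population[9] = crossover(population[8], random_vector())

    # Positions 5-6 are mutated, 3-4 are tweaked, 0-2 are kept unchanged.
    mutate(population[5])
    mutate(population[6])
    tweak(population[3])
    tweak(population[4])
    return population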


This further extends the entropy ladder from 3 to 9, where each higher number is successively more random than the one before. This system creates a sliding effect that makes sure that the genomes that are not competitive fall down the ladder, being exposed to consecutively more and more randomness, which is the only thing that can help them regain competitiveness. The successful genomes, however, move up the ladder and thus hide from the entropy, preserving their fitness. This creates a king-of-the-hill effect.

Figure 2. Entropy Ladder.

Figure 3. Example Run.

Figure 3 shows an example run. The first number of each column is the fitness; the second number in the column is the id of the genome. The column position indicates genomes 0-9, 0 being the leftmost one: survival of the fittest. The items marked green have just made a big leap up the ladder, the red ones have fallen down, and the strength of the colour indicates the size of the change. This chart shows how the dynamics work by two streams of genes flowing through each other, namely the losers flowing out into oblivion, and the winners flowing up into position.

NB: Crossover between floats is performed by letting the child become a random number within the min-max range of the two parents:

            Parent 1    Parent 2    Child
Crossover:   3.020028   -0.382130    2.810501
Crossover:   1.631748    1.229667    1.245619
Crossover:   1.631748    1.229667    1.331516
Crossover:  -2.339981   -2.185591   -2.204014
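A minimal Python sketch of this float-blend crossover (an illustration, not the Steve code) is:

import random

def blend_crossover(gene1, gene2):
    # The child gene is drawn uniformly from the interval spanned by the
    # two parent genes, as in the example values above.
    lo, hi = min(gene1, gene2), max(gene1, gene2)
    return random.uniform(lo, hi)

# For example, blend_crossover(3.020028, -0.382130) yields a value
# between -0.382130 and 3.020028, such as 2.810501.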

Evaluating the results.

The model proves powerful and shows fast convergence, performing far better than random search: the fitness mean of 100 genetic generations reaches 150, while the fitness mean of 100 generations of random search is around 37.


But it seems the convergence is premature. The implementation is probably a victim of crowding, where above-average individuals spread their genes so fast to the rest of the population that new genes are blocked from evolving into competition. This is because the above-average individuals have taken the safe positions in the entropy ladder with a fitness slightly higher than what can normally be achieved by sheer randomness. The new individuals carrying vital new gene patterns do not get a chance to evolve, as they are demolished by the high-grade entropy at the unsafe lower levels of the ladder.

This suggests that the above-average individuals are far enough above average to suppress random competition, and supports the belief that the algorithm produces very competitive candidate solutions.

The implementation called for a method to directly control the amount of nondeterminism. This was achieved by building a support platform with legs surrounding the Tripod, keeping it upright and ensuring it does not fall over. Two main editions of the Tripod exist, Tripod_A and Tripod_B; Tripod_B is without the support structure and exhibits maximum nondeterminism (meaning the maximal nondeterminism in this model).

When comparing the convergence of models A and B, we clearly see that the maximum-nondeterminism model B behaves much more chaotically: it takes longer to train, it achieves inferior end results, and it does not quite develop coherent gait patterns. The degree of nondeterminism can be adjusted by lowering the support structure.

Exploring the effects of varying nondeterminism suggests that, within time constraints, genetic algorithms can only handle a certain level of nondeterminism before breaking into aimless chaos without converging at the necessary speed.

Given infinite time, anything even the tiniest fraction short of total nondeterminism is enough for the algorithm to eventually converge.

Improvements.

The results could be made far better by enlarging the gene pool a hundredfold, from 10 to 1000 individuals. Or we could use a tournament solution where the original approach is performed 50 times, followed by competitions between groups of 10 of the 50 best candidates, ultimately breeding populations of higher and higher fitness. However, this would increase running times from overnight to the duration of a month.

The problem of crowding can be amended by maintaining diversity within the current population. This can be done by incorporating density information into the selection process, where an individual's chance of being selected is decreased the greater the density of individuals in its neighbourhood.
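A minimal sketch of such density-based (fitness-sharing style) selection pressure, assuming a Euclidean distance over the 13 genes and a niche radius sigma that would have to be tuned:

import math

def shared_fitness(population, raw_fitness, sigma=1.0):
    # Each individual's fitness is divided by the number of neighbours
    # within distance sigma (itself included), so individuals in dense
    # clusters are selected against. sigma is an assumed parameter.
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    shared = []
    for i, genome in enumerate(population):
        niche = sum(1 for other in population if distance(genome, other) < sigma)
        shared.append(raw_fitness[i] / niche)
    return shared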

Still, good solutions can be lost due to random events like mutation and crossover. A common way to deal with this problem is to maintain a secondary population, the so-called archive, to which promising solutions in the population are stored after each generation. The archive can be integrated into the EA by including archive members in the selection process. Likewise, the archive can be used to store individuals with exotic gene combinations, conserving the individuals with genes farthest from the density clusters.
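A sketch of such an archive update, with max_size as an assumed parameter (the paper does not specify one):

def update_archive(archive, population, fitness, max_size=20):
    # After each generation, copy the current individuals (with their
    # fitness) into the archive, then keep only the best max_size entries.
    archive.extend((fitness(genome), list(genome)) for genome in population)
    archive.sort(key=lambda entry: entry[0], reverse=True)
    del archive[max_size:]
    return archive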

Appendix A. User guide.
Appendix B. Bibliography.
Appendix C. Test runs.
Appendix D. Code.

Appendix A. User guide.

On the CD:
The enclosed mpeg movies show the system in use.
The papers folder includes 50 research papers of relevance.
The code is contained in the text file: Tripod_A.tz

To run the software simulation:

Copy the folder breve to C:\, i.e. C:\breve\.


Type “cmd” at the Windows Run prompt, then paste in (use the right mouse button on the DOS prompt):

For Tripod_A:

cd C:\breve\bin
SET BREVE_CLASS_PATH=C:\breve\LIB\CLASSES
breve C:\breve\COMM2C\Tripod_A.tz pico -u

For Tripod_B:

cd C:\breve\bin
SET BREVE_CLASS_PATH=C:\breve\LIB\CLASSES
breve C:\breve\COMM2C\Tripod_B.tz pico -u

For Tripod_B random search :

cd C:\breve\bin
SET BREVE_CLASS_PATH=C:\breve\LIB\CLASSES
breve C:\breve\COMM2C\Tripod_B_Random_Search.tz pico -u

The .tz files are the code in text format.

Commands:

Keypress : 1 : Performs bullet time pan.

Mouse : click-n-hold-left-mouse-button & move mouse: Looks around.

Mouse : Press F2 & click-n-hold-left-mouse-button & push mouse up and down : Zooms.

Mouse : Right click : Brings up command popup.

Mouse : Left click : Selects tripod & draws with sketch style.

Appendix B. Bibliography.

Holland, (1975). “Adaptation in Natural and Artificial Systems.” University of Michigan.

T. C. Fogarty, (1989). “Varying the probability of mutation in the genetic algorithm.” In Schaffer, Proceedings of the Third International Conference on Genetic Algorithms and their Applications, pp. 104–109.

Grefenstette & Baker, (1989). “How genetic algorithms work: a critical look at implicit parallelism.” In Proceedings of the Third International Conference on Genetic Algorithms.

W. Hart & R. Belew, (1991). “Optimizing an arbitrary function is hard for the genetic algorithm.” In Proceedings of the Fourth International Conference on Genetic Algorithms.

Heinz Muhlenbein, (1993). “Mathematical Analysis of Evolutionary Algorithms for Optimization.” GMD – Schloss Birlinghoven, Germany.


Muehlenbein & Asoh, (1994). “On the mean convergence time of evolutionary algorithms without selection and mutation.” Parallel Problem Solving from Nature.

Mitchell, Holland, & Forrest, (1994). “When will a genetic algorithm outperform hill climbing?” Advances in Neural Information Processing.

Ralf Salomon, (1995). “Evaluating Genetic Algorithm performance.” Elsevier Science.

Wolpert & Macready, (1995). The “No Free Lunch” theorem.

Goldberg & Miller, (1997). “The genetic ordering in LLGA.”

Prugel-Bennett & J.L. Shapiro, (1997). “An analysis of a genetic algorithm for simple randomising systems.” Physica D, 104:75–114.

G. Harik, (1999). “Linkage learning via probabilistic modeling in the ECGA.” Technical Report IlliGAL 99010, University of Illinois, Urbana-Champaign.

Droste, Jansen, & Wegener, (1999). “Perhaps not a free lunch but at least a free appetizer.” In GECCO-99: Proceedings of the Genetic and Evolutionary Computation Conference.

Heckendorn & Whitley, (1999). “Polynomial time summary statistics for a generalization of MAXSAT.” In GECCO-99: Proceedings of the Genetic and Evolutionary Computation Conference.

M. Vose, (1999). “The Simple Genetic Algorithm: Foundations and Theory.” MIT Press, Cambridge.

Goldberg, (2001). “Genetic Algorithms in Search, Optimization and Machine Learning.”

Goldberg, Yassine, & Chen, (2003). “DSM clustering in DSMDGA.”

Gunter Rudolph & Alexandru Agapie. “Convergence Properties of Some Multi-Objective Evolutionary Algorithms.”

Appendix C. Test runs.

Test run of Tripod_A (low degree of nondeterminism):

: 58 5: 34 8: 30 9: 14 7: 13 3: 13 1: 13 2: 9 4: 3 0: 2 6: 79 5: 45 8: 39 9: 22 6: 21 3: 19 7: 18 4: 17 1: 14 2: 3 0: 86 5: 50 6: 50 8: 44 3: 43 9: 38 2: 18 4: 9 1: 9 7: 3 0: 89 5: 60 2: 53 6: 51 8: 45 9: 44 3: 28 7: 24 1: 22 4: 17 0: 90 5: 81 8: 54 6: 51 3: 48 4: 43 2: 43 9: 27 0: 22 1: 22 7: 90 5: 67 2: 67 8: 67 4: 53 6: 35 3: 28 9: 19 0: 15 1: 12 7: 89 8: 75 4: 57 2: 57 5: 52 6: 28 3: 28 1: 20 0: 17 9: 9 7: 82 4: 79 8: 67 2: 47 3: 44 6: 35 5: 34 1: 16 9: 11 0: 6 7: 90 5: 83 4: 46 2: 44 7: 40 6: 40 8: 36 9: 30 0: 22 1: 18 3: 81 5: 66 2: 65 7: 42 8: 42 9: 37 3: 35 4: 20 6: 19 1: 15 0: 100 5: 68 7: 61 4: 48 9: 42 2: 33 6: 32 3: 20 8: 8 1: 7 0: 124 5: 90 6: 71 7: 61 2: 53 4: 51 9: 43 1: 35 0: 24 3: 13 8: 123 5: 103 6: 72 2: 72 0: 70 7: 57 4: 54 1: 38 9: 35 3: 23 8: 87 0: 79 9: 70 1: 66 7: 55 4: 55 5: 54 2: 43 8: 38 6: 22 3: 109 0: 81 8: 77 9: 75 1: 58 4: 50 2: 48 7: 47 3: 23 5: 19 6


: 109 0: 93 8: 88 3: 75 9: 62 7: 59 1: 48 4: 35 6: 32 5: 23 2: 122 0: 101 3: 96 4: 87 8: 81 9: 64 1: 63 7: 58 5: 17 6: 16 2: 120 0: 107 3: 98 9: 95 5: 90 8: 65 7: 64 1: 44 4: 37 2: 15 6: 124 5: 114 0: 113 8: 105 3: 96 4: 67 6: 54 7: 42 2: 35 9: 31 1: 133 5: 113 0: 108 3: 107 4: 92 8: 89 1: 88 6: 85 2: 43 7: 32 9: 134 5: 114 0: 107 4: 107 3: 91 1: 89 8: 88 2: 60 9: 41 7: 31 6: 145 6: 136 5: 133 1: 115 0: 106 4: 89 2: 85 9: 60 8: 48 3: 28 7: 181 6: 165 0: 145 1: 134 5: 115 4: 71 3: 56 2: 39 8: 35 9: 12 7: 195 6: 164 0: 154 4: 150 1: 114 3: 101 5: 59 2: 27 9: 23 8: 6 7: 198 6: 168 4: 167 0: 124 3: 91 9: 85 5: 59 2: 52 1: 20 8: 4 7: 186 6: 135 3: 102 0: 67 4: 59 2: 43 9: 39 8: 37 1: 33 5: 22 7: 196 6: 95 3: 87 5: 66 0: 61 8: 54 2: 45 9: 31 4: 26 1: 15 7: 180 6: 120 3: 90 5: 53 0: 43 9: 36 8: 21 4: 21 2: 15 1: 7 7: 192 6: 64 3: 59 0: 34 5: 33 8: 25 9: 16 1: 16 4: 13 7: 12 2: 198 6: 104 3: 87 4: 85 5: 55 0: 44 2: 30 8: 24 7: 23 9: 19 1: 201 6: 121 4: 109 3: 89 0: 67 5: 51 2: 48 1: 33 8: 27 7: 13 9: 200 6: 137 3: 96 4: 92 8: 91 0: 75 5: 74 1: 56 2: 49 9: 21 7: 108 8: 103 0: 98 5: 87 6: 85 3: 68 2: 65 1: 62 4: 24 7: 22 9: 119 5: 115 8: 99 4: 98 0: 56 6: 55 1: 55 2: 51 3: 21 7: 17 9: 123 4: 117 8: 102 0: 71 6: 55 5: 46 2: 41 1: 23 3: 21 7: 10 9: 150 6: 120 4: 112 8: 98 3: 94 0: 91 7: 88 5: 60 1: 44 2: 6 9: 175 6: 120 4: 115 8: 115 7: 110 3: 110 2: 102 5: 93 0: 73 9: 38 1: 184 6: 126 5: 126 4: 121 2: 116 8: 58 3: 51 1: 49 7: 44 0: 44 9: 185 6: 125 4: 121 8: 114 2: 84 0: 48 5: 44 1: 42 3: 34 9: 25 7: 178 6: 122 2: 114 8: 61 1: 52 4: 51 7: 46 0: 28 3: 23 5: 13 9: 186 6: 116 8: 79 2: 61 1: 40 9: 31 0: 26 7: 25 4: 21 3: 18 5: 188 6: 119 8: 95 2: 53 1: 43 0: 42 7: 38 9: 30 5: 15 4: 9 3: 187 6: 121 4: 120 8: 96 2: 60 7: 51 1: 38 9: 35 0: 33 3: 24 5: 181 6: 170 4: 120 8: 90 0: 74 2: 64 1: 40 7: 36 9: 30 3: 19 5: 176 6: 125 4: 118 8: 108 0: 87 2: 60 9: 36 1: 31 7: 24 3: 16 5: 185 6: 123 8: 123 4: 108 0: 71 9: 69 2: 50 5: 40 1: 36 7: 18 3: 180 6: 165 4: 118 8: 97 1: 96 0: 46 2: 42 7: 36 9: 29 3: 28 5: 188 6: 181 4: 123 9: 119 8: 116 1: 91 0: 48 5: 27 2: 22 7: 18 3: 178 6: 173 4: 168 9: 137 2: 117 1: 87 8: 83 0: 39 7: 23 5: 23 3: 182 4: 165 6: 138 7: 126 1: 123 9: 66 2: 49 8: 36 0: 14 5: 11 3: 169 7: 135 0: 111 4: 97 9: 77 2: 63 6: 48 1: 41 3: 35 8: 21 5: 184 7: 169 0: 132 3: 131 2: 101 9: 91 4: 43 1: 31 6: 28 5: 16 8: 182 7: 176 3: 164 0: 136 2: 134 6: 93 9: 76 1: 50 8: 46 4: 18 5: 186 7: 177 3: 171 0: 154 2: 144 8: 133 9: 118 6: 48 1: 23 4: 8 5: 186 7: 167 0: 156 2: 143 8: 135 1: 68 3: 52 5: 49 9: 44 6: 20 4: 186 7: 182 0: 165 6: 157 2: 150 1: 124 9: 112 8: 55 5: 53 4: 31 3: 214 6: 190 0: 183 1: 163 9: 157 2: 152 8: 130 5: 124 4: 92 7: 64 3: 231 6: 216 2: 198 1: 195 0: 177 9: 176 8: 125 7: 88 5: 64 4: 43 3: 243 6: 240 2: 209 0: 191 5: 180 1: 176 9: 67 8: 54 3: 51 7: 47 4: 245 6: 244 2: 220 5: 204 0: 196 1: 180 3: 164 7: 75 8: 70 9: 18 4: 244 2: 240 6: 231 5: 225 1: 225 0: 204 7: 191 8: 68 3: 48 4: 31 9: 247 2: 235 5: 225 1: 213 7: 200 0: 190 3: 86 6: 66 8: 26 9: 24 4: 248 2: 236 5: 231 1: 190 3: 189 8: 77 0: 75 7: 31 6: 25 9: 19 4: 251 2: 235 1: 231 8: 159 6: 105 3: 101 5: 77 4: 45 0: 34 9: 29 7: 250 2: 240 1: 181 8: 167 6: 166 0: 106 4: 91 3: 52 5: 46 9: 36 7: 249 8: 248 2: 233 1: 206 6: 183 5: 159 0: 73 3: 57 4: 36 9: 29 7: 246 2: 241 1: 218 6: 197 0: 193 4: 149 8: 89 3: 67 5: 20 9: 11 7: 250 2: 242 1: 223 6: 205 0: 187 5: 118 4: 81 3: 68 8: 10 7: 10 9: 248 2: 241 1: 223 6: 221 5: 211 0: 185 8: 126 7: 58 4: 47 9: 30 3: 248 2: 242 1: 225 6: 198 8: 181 0: 169 4: 161 5: 72 3: 60 7: 42 9: 247 2: 242 1: 219 4: 
199 8: 188 3: 185 7: 171 6: 158 5: 89 0: 32 9: 248 2: 236 4: 229 3: 227 7: 219 8: 215 5: 213 6: 194 1: 58 0: 14 9: 242 7: 240 5: 239 4: 236 2: 229 1: 208 8: 198 6: 162 3: 23 0: 8 9: 242 7: 242 5: 238 4: 236 2: 215 3: 208 6: 189 1: 73 8: 28 0: 5 9: 242 2: 240 7: 238 5: 238 4: 229 3: 193 1: 185 8: 175 6: 36 0: 14 9: 243 5: 242 2: 240 4: 235 3: 206 6: 203 7: 190 1: 64 8: 23 9: 17 0: 243 2: 242 5: 239 4: 220 6: 219 3: 186 8: 183 1: 158 9: 72 7: 20 0: 247 2: 244 5: 237 4: 223 3: 217 6: 211 9: 165 0: 146 7: 84 8: 64 1: 245 2: 241 5: 237 9: 235 4: 230 3: 215 7: 213 6: 65 0: 32 8: 31 1

Test run of Tripod_B (high degree of nondeterminism):

: 15 4: 10 3: 8 9: 8 5: 7 8: 6 6: 4 7: 3 2: 1 0: 1 1: 15 5: 14 6: 10 3: 10 4: 10 7: 7 9: 6 1: 6 8: 5 2: 2 0: 25 5: 17 4: 10 6: 10 8: 8 0: 8 7: 8 1: 7 2: 5 9: 5 3: 24 8: 18 5: 17 6: 16 0: 14 9: 10 4: 8 2: 7 1: 5 7: 4 3: 32 5: 28 8: 17 6: 14 3: 13 7: 11 4: 9 9: 9 2: 7 0: 5 1: 26 8: 19 5: 17 6: 17 2: 16 7: 12 3: 11 9: 10 4: 8 1: 7 0: 33 7: 32 5: 23 2: 19 8: 11 9: 11 1: 10 3: 9 6: 8 4: 4 0: 38 5: 30 7: 24 8: 23 9: 21 6: 21 1: 20 0: 19 2: 17 4: 8 3: 52 6: 33 8: 28 7: 25 5: 24 1: 23 4: 20 2: 19 9: 15 0: 12 3: 79 5: 61 9: 46 8: 44 4: 37 6: 35 1: 29 7: 22 2: 19 0: 9 3: 61 7: 58 2: 51 4: 35 5: 32 0: 24 8: 24 9: 24 1: 21 6: 5 3: 73 4: 59 2: 47 7: 40 8: 24 6: 23 5: 22 0: 21 9: 11 1: 6 3: 76 4: 48 6: 37 7: 30 9: 28 1: 27 8: 26 2: 24 3: 19 0: 18 5: 57 4: 55 6: 53 7: 45 9: 36 2: 31 1: 27 8: 27 5: 22 0: 20 3: 89 5: 63 1: 57 7: 49 8: 47 4: 42 6: 33 2: 31 0: 21 9: 9 3: 68 1: 50 5: 44 9: 43 8: 37 4: 29 7: 27 0: 25 6: 17 2: 12 3: 104 9: 69 3: 68 8: 61 1: 61 5: 46 7: 38 2: 34 0: 27 6: 19 4: 89 8: 72 5: 54 2: 53 9: 42 3: 37 1: 32 0: 22 6: 22 4: 18 7: 94 5: 72 1: 64 9: 63 8: 60 3: 47 7: 42 4: 35 6: 34 2: 32 0: 53 4: 48 8: 43 6: 41 5: 39 1: 38 3: 30 9: 25 7: 23 2: 12 0: 63 1: 43 6: 42 7: 38 9: 37 5: 31 3: 23 2: 22 4: 18 8: 8 0: 89 5: 54 6: 48 1: 43 4: 22 9: 22 3: 20 7: 18 8: 14 0: 10 2: 107 5: 42 1: 40 7: 36 8: 35 4: 27 9: 26 6: 22 0: 15 3: 14 2: 94 5: 67 3: 41 6: 40 1: 27 8: 24 9: 22 2: 20 7: 16 4: 14 0: 104 5: 82 3: 74 6: 55 2: 44 7: 31 9: 28 1: 25 8: 17 4: 7 0: 64 7: 58 5: 53 8: 49 6: 40 9: 40 2: 35 3: 18 1: 14 4: 4 0: 73 3: 64 5: 54 7: 50 6: 50 8: 39 4: 32 9: 32 1: 25 2: 16 0: 84 3: 70 4: 61 7: 56 6: 54 9: 39 5: 32 1: 28 8: 23 2: 12 0: 114 3: 73 4: 56 0: 50 7: 46 8: 46 5: 37 9: 35 2: 26 1: 22 6: 80 3: 76 7: 59 9: 44 6: 41 0: 38 1: 36 5: 35 4: 32 2: 31 8: 89 2: 84 4: 83 3: 76 7: 64 8: 54 6: 53 0: 47 1: 45 5: 41 9: 90 3: 63 2: 57 5: 48 7: 47 4: 46 8: 38 6: 35 1: 30 9: 23 0: 76 5: 67 2: 55 1: 50 3: 49 7: 45 9: 40 8: 28 4: 19 0: 14 6: 128 8: 94 5: 90 1: 73 2: 72 0: 64 9: 48 3: 31 6: 25 7: 24 4: 113 1: 72 8: 70 5: 53 0: 41 9: 39 2: 37 6: 26 4: 23 3: 11 7: 133 1: 95 8: 70 4: 61 5: 55 2: 45 6: 43 3: 33 9: 30 0: 5 7: 95 4: 81 8: 71 1: 68 5: 62 2: 51 0: 46 9: 38 3: 16 6: 11 7: 113 1: 52 4: 51 5: 50 0: 50 8: 39 2: 36 9: 30 3: 24 7: 21 6: 95 5: 56 8: 47 1: 44 2: 42 0: 38 4: 33 3: 22 6: 18 7: 17 9: 118 5: 72 2: 70 0: 47 1: 46 6: 39 8: 35 4: 31 7: 23 3: 13 9: 72 8: 66 0: 63 5: 45 1: 43 2: 39 7: 25 4: 22 6: 19 3: 5 9: 96 7: 72 1: 55 6: 54 2: 37 8: 28 3: 26 5: 26 0: 11 4: 5 9: 52 0: 48 6: 42 7: 40 1: 35 2: 33 8: 30 5: 26 4: 23 3: 4 9: 64 0: 59 1: 44 2: 35 5: 32 8: 30 4: 28 7: 25 3: 23 6: 12 9: 98 4: 85 1: 62 0: 52 2: 47 3: 42 7: 41 5: 33 6: 22 8: 16 9: 80 1: 70 0: 62 2: 57 9: 56 4: 53 3: 49 6: 45 7: 34 5: 26 8: 93 7: 70 6: 64 2: 55 1: 41 0: 37 4: 35 9: 31 3: 26 5: 10 8: 65 7: 63 1: 57 0: 53 6: 51 2: 41 5: 37 4: 35 3: 13 8: 13 9: 93 6: 70 0: 59 5: 52 1: 41 2: 40 3: 36 7: 30 4: 23 9: 17 8: 70 4: 61 6: 58 7: 56 9: 53 0: 49 2: 45 1: 40 5: 20 3: 17 8: 79 6: 68 1: 65 3: 48 9: 48 4: 47 0: 42 5: 39 8: 36 2: 30 7: 68 1: 54 6: 50 3: 47 2: 47 4: 40 5: 39 0: 33 9: 28 8: 11 7: 65 1: 56 6: 41 3: 40 5: 40 0: 39 7: 39 9: 38 2: 32 4: 21 8: 79 4: 72 5: 59 7: 58 1: 54 9: 48 6: 44 8: 44 3: 23 2: 20 0: 64 5: 57 3: 53 4: 53 9: 48 8: 43 1: 34 0: 32 7: 29 6: 20 2: 98 7: 73 5: 72 3: 32 9: 32 1: 31 4: 29 0: 28 6: 25 8: 17 2: 82 2: 76 5: 69 9: 54 7: 42 4: 36 3: 32 1: 28 0: 19 6: 14 8: 87 9: 72 2: 70 1: 66 5: 64 3: 48 7: 48 8: 41 0: 40 6: 19 4: 71 5: 69 3: 69 1: 59 9: 52 8: 37 0: 35 2: 35 6: 29 7: 26 4: 71 3: 62 1: 62 9: 61 6: 40 2: 37 7: 37 8: 34 0: 
29 5: 19 4: 107 6: 87 5: 82 8: 78 2: 72 3: 59 1: 44 9: 24 4: 23 0: 18 7: 88 1: 86 6: 66 5: 56 4: 53 2: 43 0: 41 8: 36 3: 35 9: 13 7: 81 6: 81 8: 62 4: 61 9: 61 5: 47 3: 40 1: 34 0: 27 2: 10 7: 81 9: 77 0: 62 8: 60 6: 57 4: 47 3: 45 1: 41 5: 35 2: 12 7


: 99 0: 78 1: 71 4: 64 6: 57 9: 40 8: 36 3: 21 5: 17 2: 8 7: 118 0: 68 1: 58 9: 57 6: 54 3: 43 4: 34 8: 30 5: 17 2: 7 7: 100 3: 84 5: 83 2: 71 6: 58 1: 54 8: 52 9: 51 0: 41 4: 9 7: 79 2: 73 3: 71 1: 63 6: 45 5: 36 8: 28 0: 24 9: 21 4: 21 7: 67 6: 62 5: 61 3: 50 2: 38 7: 34 1: 30 4: 26 0: 20 9: 14 8: 89 0: 69 3: 59 5: 52 6: 50 4: 44 7: 42 9: 39 2: 32 1: 8 8: 64 0: 59 6: 53 3: 47 9: 39 4: 37 1: 34 5: 31 7: 22 2: 11 8: 92 3: 91 0: 54 7: 53 9: 46 6: 33 4: 31 1: 29 5: 22 8: 19 2: 92 5: 63 7: 61 6: 58 9: 57 3: 50 0: 42 4: 26 2: 24 1: 19 8: 108 3: 75 7: 64 5: 56 2: 51 0: 44 9: 30 4: 28 1: 25 6: 7 8: 113 3: 100 5: 71 2: 69 1: 60 9: 52 7: 37 6: 36 4: 31 0: 6 8: 105 1: 91 5: 74 6: 53 9: 49 4: 47 2: 42 3: 27 8: 25 0: 20 7: 86 1: 81 6: 54 8: 51 0: 42 2: 37 5: 36 9: 32 4: 24 3: 10 7: 130 6: 90 8: 64 1: 63 0: 49 5: 45 2: 36 3: 34 9: 29 4: 24 7: 109 6: 89 1: 81 8: 72 3: 60 4: 50 0: 39 2: 28 5: 26 9: 15 7: 88 3: 85 8: 68 5: 57 6: 54 1: 49 0: 36 4: 33 7: 28 9: 23 2: 103 6: 71 4: 65 1: 53 3: 35 0: 34 8: 32 7: 26 5: 22 9: 12 2: 86 7: 72 1: 60 6: 55 0: 54 9: 48 4: 42 3: 35 8: 34 5: 24 2: 107 7: 85 6: 79 1: 58 9: 49 4: 43 0: 29 3: 21 8: 21 5: 9 2: 82 6: 74 1: 65 5: 60 7: 60 0: 58 9: 45 8: 40 3: 34 4: 33 2: 86 7: 70 9: 64 4: 64 0: 57 6: 56 8: 42 5: 34 1: 30 3: 18 2: 68 9: 65 0: 55 7: 45 3: 40 4: 32 5: 27 6: 27 8: 23 1: 23 2: 59 4: 48 3: 46 7: 38 0: 36 6: 33 5: 31 9: 15 8: 10 2: 10 1: 68 2: 57 8: 55 4: 53 1: 48 3: 37 7: 36 0: 33 6: 21 9: 19 5: 55 8: 53 9: 50 4: 46 1: 39 0: 36 6: 33 7: 29 3: 28 2: 21 5: 67 8: 57 0: 56 1: 39 9: 36 2: 33 4: 22 3: 18 7: 14 5: 13 6: 56 1: 46 8: 45 0: 42 3: 39 2: 38 4: 35 7: 28 9: 18 5: 8 6: 56 3: 52 7: 50 8: 41 1: 26 4: 24 2: 24 0: 13 5: 13 9: 8 6: 58 7: 52 1: 38 8: 34 3: 31 4: 30 0: 27 9: 21 2: 20 5: 4 6: 102 7: 49 1: 43 8: 43 2: 35 5: 31 3: 25 6: 20 4: 12 0: 10 9: 72 7: 63 8: 56 4: 50 1: 45 3: 32 6: 23 0: 22 2: 21 5: 11 9: 94 9: 73 7: 55 4: 48 1: 47 8: 44 0: 36 2: 33 3: 18 6: 15 5: 63 9: 60 4: 50 7: 46 0: 34 2: 33 1: 28 6: 27 3: 27 8: 20 5: 77 7: 58 0: 47 4: 42 1: 40 5: 29 9: 27 3: 25 6: 24 8: 21 2: 95 1: 72 7: 57 0: 54 5: 29 8: 29 4: 24 3: 24 6: 15 2: 12 9: 68 0: 67 1: 48 6: 43 7: 36 4: 26 5: 23 8: 19 3: 15 9: 13 2: 65 7: 58 0: 57 1: 46 4: 42 3: 42 6: 36 2: 19 8: 18 5: 11 9: 75 3: 67 4: 54 5: 53 8: 53 0: 52 7: 31 6: 25 1: 18 2: 17 9: 68 3: 55 1: 47 0: 44 6: 39 7: 37 4: 37 5: 36 2: 21 9: 20 8: 64 4: 60 0: 58 1: 48 3: 30 6: 22 5: 18 2: 16 7: 12 9: 9 8: 61 3: 45 4: 37 1: 36 7: 30 0: 29 6: 20 9: 19 2: 14 5: 7 8: 88 5: 48 4: 47 2: 36 1: 32 3: 27 7: 19 6: 16 8: 15 9: 14 0: 95 5: 81 4: 52 6: 50 8: 29 2: 21 1: 20 9: 18 3: 15 7: 7 0: 76 4: 65 5: 64 3: 41 7: 40 8: 36 6: 34 2: 19 9: 19 1: 10 0: 145 3: 96 5: 72 4: 44 8: 36 1: 35 2: 33 6: 25 7: 18 9: 5 0: 117 3: 114 2: 95 4: 83 9: 61 7: 61 5: 49 8: 37 1: 18 0: 17 6: 98 3: 82 2: 56 4: 56 5: 48 0: 47 7: 44 1: 43 9: 39 6: 33 8: 123 1: 82 3: 75 4: 69 0: 57 2: 50 6: 41 7: 40 9: 25 5: 21 8: 135 3: 106 6: 102 9: 95 1: 69 7: 63 8: 54 2: 48 4: 40 0: 16 5: 135 8: 118 4: 92 1: 90 3: 73 6: 67 9: 54 0: 49 2: 25 7: 18 5: 151 3: 147 2: 102 8: 95 4: 79 0: 73 9: 42 6: 39 1: 11 5: 10 7: 225 2: 120 5: 98 3: 89 8: 72 4: 67 0: 30 7: 30 9: 16 1: 15 6: 138 2: 105 3: 87 5: 86 4: 75 8: 50 9: 23 0: 21 1: 20 7: 20 6: 118 4: 78 7: 70 8: 69 2: 61 9: 50 5: 49 3: 35 1: 30 6: 9 0: 130 7: 91 4: 84 9: 63 3: 57 8: 54 2: 44 1: 43 5: 14 6: 6 0: 82 7: 81 9: 60 3: 51 4: 47 5: 36 8: 32 2: 25 1: 24 6: 8 0: 130 9: 79 5: 61 1: 58 3: 56 7: 40 4: 36 8: 29 6: 14 2: 6 0: 129 9: 62 1: 57 4: 51 6: 43 5: 32 8: 31 0: 29 3: 26 7: 15 2

Appendix D. Code.

The code is written in the C++-like object-oriented language Steve. Most of the code involves setting up the physical simulation and constructing the Tripod.


The algorithm-specific code is the only code that is fully commented, as the other code is regarded as trivial.

File: Tripod_A.tz :

# Tripod

@use PhysicalControl.
@use Link.
@use File.
@use Genome.
@use Shape.
@use Stationary.
@use MultiBody.

@define SPEED_K 19.

Controller Walker.

PhysicalControl : Walker {
  + variables:
    SelectedGenomes, GenePopulation (list).
    currentSelectedGenome (int).
    Tripod (object).
    equalTime (float).
    locked (int).
    lockMenu (object).
    cloudTexture (int).

  + to init:
    floorShape (object). floor (object). number (int). item (object). file (object).

    equalTime = 0.

    self disable-freed-instance-protection.

    locked = 0.
    self set-random-seed-from-dev-random.
    self enable-lighting.
    self enable-smooth-drawing.
    self move-light to (0, 20, 0).

    # Create the floor for the Tripod to walk on.
    floorShape = (new Shape init-with-cube size (1000, 2, 1000)).
    floor = new Stationary.
    floor register with-shape floorShape at-location (0, 0, 0).
    floor catch-shadows.
    floor set-color to (1.0, 1.0, 1.0).

    cloudTexture = (self load-image from "images/clouds.png").

    self enable-shadow-volumes.
    self enable-reflections.
    self half-gravity.

    self set-background-color to (.4, .6, .9).
    self set-background-texture to cloudTexture.

    # Create the Tripod.
    Tripod = new TripodTemplate.
    Tripod move to (0, 6, 0).
    self offset-camera by (3, 13, -13).
    self watch item Tripod.


GenePopulation = 10 new genoms.

    # Create the list GenePopulation.
    foreach item in GenePopulation: {
      (item set-number to number).
      # print "Genome ", (GenePopulation{number} get-number), (GenePopulation{number} get-distance).
      number += 1.
    }

    # Start the program.
    self pick-Genomes.

    # Set up the menus...
    lockMenu = (self add-menu named "Lock Genome" for-method "toggle-Genome-lock").
    self add-menu-separator.
    self add-menu named "Save Current Genome" for-method "save-current-genome".
    self add-menu named "Load Into Current Genome" for-method "load-into-current-genome".

    # Schedule the first Genome change and we're ready to go.
    self schedule method-call "change-Genomes" at-time (self get-time) + 30.0.
    self display-current-Genome.

  + to display-current-Genome:
    currentNumber (int).
    currentNumber = (SelectedGenomes{currentSelectedGenome} get-number).
    self set-display-text to "Genome #$currentNumber" at-x -.95 at-y -.9.

  + to iterate:
    SelectedGenomes{currentSelectedGenome} control robot Tripod at-time ((self get-time) - equalTime + 1).
    super iterate.

+ to pick-Genomes:

sort GenePopulation with compare-distance.

    SelectedGenomes{0} = GenePopulation{0}.
    SelectedGenomes{1} = GenePopulation{1}.
    SelectedGenomes{2} = GenePopulation{2}.
    SelectedGenomes{3} = GenePopulation{3}.
    SelectedGenomes{4} = GenePopulation{4}.
    SelectedGenomes{5} = GenePopulation{5}.
    SelectedGenomes{6} = GenePopulation{6}.
    SelectedGenomes{7} = GenePopulation{7}.
    SelectedGenomes{8} = GenePopulation{8}.
    SelectedGenomes{9} = GenePopulation{9}.

currentSelectedGenome = 0.

  + to change-Genomes:
    newGenome (int). newOffset (vector). myMobile (list).
    Current_distance (float). Acummulated_distance (float). New_Acummulated_distance (float).

    # The fitness is the distance from the origin; it is blended with the
    # genome's previously accumulated distance (weighted 1:2 in favour of
    # the new measurement) to smooth out stochastic variation between runs.
    Current_distance = |(Tripod get-location)|.
    Acummulated_distance = SelectedGenomes{currentSelectedGenome} get-distance.
    New_Acummulated_distance = ((1 * Acummulated_distance) + 2 * Current_distance) / 3.

SelectedGenomes{currentSelectedGenome} set-distance to New_Acummulated_distance .

free Tripod.

Tripod = new TripodTemplate.


Tripod move to (0, 6, 0).

    self offset-camera by (3, 5, -23).
    self watch item Tripod.
    Tripod set-color.

equalTime = 0. equalTime = (self get-time).

currentSelectedGenome += 1.

    if currentSelectedGenome > 9: {
      self breed-new-genoms.
      self pick-Genomes.
    }

    newGenome = (SelectedGenomes{currentSelectedGenome} get-number).

    # Schedule a new Genome change in 30 seconds.
    self schedule method-call "change-Genomes" at-time (self get-time) + 30.0.
    self display-current-Genome.

  + to breed-new-genoms:
    Testnum (int). number (int). GenomeNr (int). Fitness (int). random_select (int). item (object). Da_Genome (object).

    sort SelectedGenomes with compare-distance.

    number = -1.

    foreach item in SelectedGenomes: {
      number += 1.
      GenePopulation{number} = SelectedGenomes{number}.
      Fitness = SelectedGenomes{number} get-distance.
      GenomeNr = SelectedGenomes{number} get-number.
      printf ":", Fitness, GenomeNr.
    }

    # Secure a flow of new genetic material.
    GenePopulation{9} randomize.
    GenePopulation{7} randomize.

    # Breed.
    SelectedGenomes{0} True_breed with GenePopulation{(random[9])} to-child GenePopulation{8}.
    GenePopulation{8} True_breed with GenePopulation{7} to-child GenePopulation{9}.
    SelectedGenomes{0} True_breed with SelectedGenomes{1} to-child GenePopulation{7}.
    random_select = random[9].
    SelectedGenomes{random_select} True_breed with SelectedGenomes{random[9]} to-child GenePopulation{(random[9])}.

    # Mutate.
    (GenePopulation{5} get-genome) mutate.
    (GenePopulation{6} get-genome) mutate.
    random_select = random[9].
    (GenePopulation{random_select} get-genome) mutate.

    # Tweak.
    (GenePopulation{3} get-genome) Tweak.
    (GenePopulation{4} get-genome) Tweak.
    random_select = random[9].
    (GenePopulation{random_select} get-genome) Tweak.


  + to compare-distance of a (object) with b (object):
    result (float).
    result = (b get-distance) - (a get-distance).
    return result.

  # The following methods are accessed from the simulation menu.

  + to toggle-Genome-lock:
    if locked == 1: {
      locked = 0.
      Tripod center.
      self schedule method-call "change-Genomes" at-time (self get-time) + 30.0.
      lockMenu uncheck.
    } else {
      locked = 1.
      lockMenu check.
    }

  + to save-current-genome:
    (SelectedGenomes{currentSelectedGenome} get-genome) save-with-dialog.

  + to load-into-current-genome:
    (SelectedGenomes{currentSelectedGenome} get-genome) load-with-dialog.

  + to catch-key-2-down:
    self save-as-xml file "world1.xml".

  + to catch-key-3-down:
    self save-as-xml file "world2.xml".

  + to catch-key-1-down:
    newOffset (vector).

    newOffset = random[(90, 10, 90)] + (-15, 1, -15).
    if |newOffset| < 14: newOffset = 14 * newOffset/|newOffset|.
    self bullet-pan-camera-offset by newOffset steps 100.

    newOffset = random[(90, 10, 90)] + (-15, 1, -15).
    if |newOffset| < 14: newOffset = 14 * newOffset/|newOffset|.
    self bullet-pan-camera-offset by newOffset steps 100.

    newOffset = random[(90, 10, 90)] + (-15, 1, -15).
    if |newOffset| < 14: newOffset = 14 * newOffset/|newOffset|.
    self bullet-pan-camera-offset by newOffset steps 100.
    # look at self from newOffset.

}

Object : genoms {
  + variables:
    distanceTraveled (float).
    genome (object).

number (int).

  + to set-number to n (int):
    number = n.

  + to get-number:
    return number.

  + to init:
    genome = new TripodGenome.
    self randomize.

+ to randomize:


genome randomize.

  + to get-genome:
    return genome.

  + to breed with otherGenome (object) to-child child (object):
    (child get-genome) crossover from-parent-1 (otherGenome get-genome) from-parent-2 (self get-genome).

  + to True_breed with otherGenome (object) to-child child (object):
    Genome_1 (object). Genome_2 (object). Genome_3 (object).
    Returnee (float). Returnee2 (float). Returnee3 (float).
    n, nn (int).

    Genome_1 = (otherGenome get-genome).
    Genome_2 = (self get-genome).
    Genome_3 = (child get-genome).

    # Copy Genome_1 to Genome_3.
    for n=0, n<13, n+=1: {
      Returnee = Genome_1 ReturnGeene to n.
      Genome_3 SetGeene to Returnee to n.
    }

    # Produce crossover: the child gene becomes a random value within the
    # range spanned by the two parent genes.
    for n=0, n<7, n+=1: {
      nn = (random[12]).
      Returnee = Genome_2 ReturnGeene to nn.
      Returnee2 = Genome_1 ReturnGeene to nn.
      Returnee3 = random[(Returnee - Returnee2)].
      if Returnee3 < 0 : Returnee3 = (Returnee3 * -1) + Returnee.
      if Returnee3 > 0 : Returnee3 = Returnee3 + Returnee2.
      Genome_3 SetGeene to Returnee3 to nn.
      # print "Crossover: ", Returnee, Returnee2, Returnee3.
    }

+ to control robot theRobot (object) at-time t (float):

    theRobot set-joint-velocity-1 to SPEED_K * (genome calculate-torque-1 at (t)).
    theRobot set-joint-velocity-2 to SPEED_K * -(genome calculate-torque-2 at (t)).
    theRobot set-joint-velocity-3 to SPEED_K * -(genome calculate-torque-3 at (t)).
    theRobot set-joint-velocity-4 to SPEED_K * -(genome calculate-torque-4 at (t)).
    theRobot set-joint-velocity-5 to SPEED_K * -(genome calculate-torque-5 at (t)).
    theRobot set-joint-velocity-6 to SPEED_K * (genome calculate-torque-6 at (t)).

  + to set-distance to value (float):
    distanceTraveled = value.

  + to get-distance:
    return distanceTraveled.
}

Genome : TripodGenome {
  + variables:
    genomeData (13 floats).

  + to randomize:
    genomeData[0] = random[5.0] - 2.5.
    genomeData[1] = random[2.0] - 1.0.
    genomeData[2] = random[2.0] - 1.0.
    genomeData[3] = random[2.0] - 1.0.
    genomeData[4] = random[2.0] - 1.0.


    genomeData[5] = random[2.0] - 1.0.
    genomeData[6] = random[2.0] - 1.0.
    genomeData[7] = random[6.3] - 3.15.
    genomeData[8] = random[6.3] - 3.15.
    genomeData[9] = random[6.3] - 3.15.
    genomeData[10] = random[6.3] - 3.15.
    genomeData[11] = random[6.3] - 3.15.
    genomeData[12] = random[6.3] - 3.15.

  + to PrintGenome:
    n (int).
    for n=0, n<13, n+=1: {
      print "Genome: ", n, " ", genomeData[n].
    }

  + to ReturnGeene to value (int):
    # print "return genomeData: ", genomeData[value].
    return genomeData[value].

  + to SetGeene to number (int) to value (float):
    # print "SetGeene : ", number, value.
    genomeData[number] = value.

  # Each joint control signal is a sine wave: genomeData[0] is a shared
  # frequency gene, genomeData[7..12] are per-joint phase genes, and
  # genomeData[1..6] are per-joint offset genes.
  + to calculate-torque-1 at time (float):
    return .5 * (sin(genomeData[0] * (time + genomeData[7])) - (genomeData[1])).

  + to calculate-torque-2 at time (float):
    return .5 * (sin(genomeData[0] * (time + genomeData[8])) - (genomeData[2])).

  + to calculate-torque-3 at time (float):
    return .5 * (sin(genomeData[0] * (time + genomeData[9])) - (genomeData[3])).

  + to calculate-torque-4 at time (float):
    return .5 * (sin(genomeData[0] * (time + genomeData[10])) - (genomeData[4])).

  + to calculate-torque-5 at time (float):
    return .5 * (sin(genomeData[0] * (time + genomeData[11])) - (genomeData[5])).

  + to calculate-torque-6 at time (float):
    return .5 * (sin(genomeData[0] * (time + genomeData[12])) - (genomeData[6])).

  + to Tweak:
    n (int).

    n = random[12].
    if n < 7: genomeData[n] = genomeData[n] + ((random[2.0] - 1.0)/30).
    if n > 6: genomeData[n] = genomeData[n] + ((random[6.3] - 3.15)/30).
    if n = 0: genomeData[n] = genomeData[n] + ((random[5.0] - 2.5)/30).

    n = random[12].
    if n < 7: genomeData[n] = genomeData[n] + ((random[2.0] - 1.0)/10).
    if n > 6: genomeData[n] = genomeData[n] + ((random[6.3] - 3.15)/10).
    if n = 0: genomeData[n] = genomeData[n] + ((random[5.0] - 2.5)/10).

  + to Shake:
    n (int).
    for n=0, n<12, n+=1: {
      if n < 7: genomeData[n] = genomeData[n] + ((random[2.0] - 1.0)/30).
      if n > 6: genomeData[n] = genomeData[n] + ((random[6.3] - 3.15)/30).
      if n = 0: genomeData[n] = genomeData[n] + ((random[5.0] - 2.5)/30).
    }

  + to mutate:
    n (int).


    n = random[12].
    if n < 7: genomeData[n] = random[2.0] - 1.0.
    if n > 6: genomeData[n] = random[6.3] - 3.15.
    if n = 0: genomeData[n] = random[5.0] - 2.5.
}

MultiBody : TripodTemplate {
  + variables:
    bodyLink (object).
    links (list).
    Color1 (vector). Color2 (vector).
    joints (list).

  + to get-root:
    return bodyLink.

  + to init:
    linkShape, FootShape, SupportShape, lowerLinkShape, bodyShape (object).

self add-menu named "Send to Center" for-method "center".

    SupportShape = new Shape.
    SupportShape init-with-cube size (.16, 1, .16).

    lowerLinkShape = new Shape.
    lowerLinkShape init-with-cube size (.26, 2.0, .26).

    linkShape = new Shape.
    linkShape init-with-cube size (.28, 1.0, .28).

    bodyShape = new Shape.
    bodyShape init-with-sphere radius (0.5).
    FootShape = new Shape.
    FootShape init-with-polygon-disk radius (0.5) sides (8) height (0.05).

    Color1 = random[(1.0, 1.0, 1.0)].
    Color2 = random[(1.0, 1.0, 1.0)].

    links = 6 new Links.
    joints = 6 new RevoluteJoints.

    links{0} set shape linkShape.
    links{0} set-color to Color1.
    links{2} set shape linkShape.
    links{2} set-color to Color1.
    links{4} set shape linkShape.
    links{4} set-color to Color1.

    links{1} set shape lowerLinkShape.
    links{1} set-color to (1.0, 1.0, 1.0).
    links{3} set shape lowerLinkShape.
    links{3} set-color to (0.0, 1.0, 1.0).
    links{5} set shape lowerLinkShape.
    links{5} set-color to (1.0, 0.0, 1.0).

    bodyLink = new Link.
    bodyLink set shape bodyShape.
    bodyLink set-color to Color2.

    joints{0} link parent bodyLink to-child links{0}
      with-normal (1.5, 0, 1)
      with-parent-point (0.722, 1, 0.552)


      with-child-point (0, 2.2, 0).
    joints{0} set-joint-limit-vectors min (0.7, 0, 0) max (1.3, 0, 0).
    joints{1} link parent links{0} to-child links{1}
      with-normal (-10, 0, 0)
      with-parent-point (0, -.5, 0)
      with-child-point (0, 1.0, 0).

    joints{2} link parent bodyLink to-child links{2}
      with-normal (-1.5, 0, 1)
      with-parent-point (-0.722, 1, 0.552)
      with-child-point (0, 2.2, 0).
    joints{2} set-joint-limit-vectors min (0.65, 0, 0) max (1.3, 0, 0).
    joints{3} link parent links{2} to-child links{3}
      with-normal (-10, 0, 0)
      with-parent-point (0, -.5, 0)
      with-child-point (0, 1.0, 0).

    joints{4} link parent bodyLink to-child links{4}
      with-normal (-1.5, 0, -1)
      with-parent-point (0, 1, -0.552)
      with-child-point (0, 2.2, 0).
    joints{4} set-joint-limit-vectors min (0.6, 0, 0) max (1.3, 0, 0).
    joints{5} link parent links{4} to-child links{5}
      with-normal (-10, 0, 0)
      with-parent-point (0, -.5, 0)
      with-child-point (0, 1.0, 0).

    self register with-link bodyLink.
    # self rotate around-axis (0, 1, 0) by 1.57.

    joints set-double-spring with-strength 30 with-max 1.5 with-min -2.
    joints set-strength-limit to 40.

+ to Re-set: self unregister.

  + to center:
    currentLocation (vector).
    currentLocation = (self get-location).
    self move to (0, currentLocation::y, 0).

  + to set-color:
    Color3 (vector). Color4 (vector).
    Color3 = random[(1.0, 1.0, 1.0)].
    Color4 = random[(1.0, 1.0, 1.0)].
    links{0} set-color to Color3.
    links{2} set-color to Color3.
    links{4} set-color to Color3.
    links{1} set-color to Color4.
    links{3} set-color to Color4.
    links{5} set-color to Color4.

bodyLink set-color to Color4.

+ to set-joint-velocity-1 to value (float):joints{1} set-joint-velocity to value.


+ to set-joint-velocity-2 to value (float):joints{2} set-joint-velocity to value.

+ to set-joint-velocity-3 to value (float):joints{3} set-joint-velocity to value.

+ to set-joint-velocity-4 to value (float): joints{4} set-joint-velocity to -value.

+ to set-joint-velocity-5 to value (float):joints{5} set-joint-velocity to -value.

+ to set-joint-velocity-6 to value (float):joints{0} set-joint-velocity to value.

  + to Re-set-shape:
    joints{0} set-joint-velocity to 2.
    joints{1} set-joint-velocity to 2.
    joints{2} set-joint-velocity to 2.
    joints{3} set-joint-velocity to 2.
    joints{4} set-joint-velocity to 2.
    joints{5} set-joint-velocity to 2.

+ to destroy:

free bodyLink.

    free links{0}. free links{2}. free links{4}.
    free links{1}. free links{3}. free links{5}.

    free joints{0}. free joints{1}. free joints{2}.
    free joints{3}. free joints{4}. free joints{5}.
}