Evolutionary Computational Intelligence
Lecture 9: Noisy Fitness
Ferrante Neri
University of Jyväskylä
Real world optimization problems
Many real world optimization problems are characterized by uncertainties
This means that the same solution takes different fitness values depending on the time at which it is evaluated
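A minimal sketch of this situation (all names are illustrative, not from the lecture): the true fitness is deterministic, but every measurement is corrupted by additive zero-mean noise, so repeated evaluations of the same solution return different values.

```python
import random

def true_fitness(x):
    # the "real" (unknown) fitness, e.g. a 1-D sphere function
    return x * x

def noisy_fitness(x, sigma=0.5):
    # what the optimizer actually observes: fitness plus zero-mean noise
    return true_fitness(x) + random.gauss(0.0, sigma)

# The same solution yields different fitness values at different times:
samples = [noisy_fitness(2.0) for _ in range(3)]
```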
Classification of uncertainties
Uncertainties in optimization can be categorized into three classes:
- Noisy fitness function
- Approximated fitness function
- Robustness
Noisy Fitness
Noise in fitness evaluations may come from many different sources such as sensory measurement errors or randomized simulations.
Example: optimization based on an experimental setup, e.g. a motor drive
Approximated Fitness Function
When the fitness function is very expensive to evaluate, or an analytical fitness function is not available, approximated fitness functions are often used instead.
These approximated models implicitly introduce a noise: the difference between the approximated value and the real fitness value, which is unknown.
Perturbation in the environment
Often, when a solution is implemented, the design variables or the environmental parameters are subject to perturbations or changes
Example, satellite problem: due to the movement of the Earth, the fitness value of the same solution changes over time
General formulation of uncertain problem
A classical formulation of a noisy/uncertain fitness is given by:

F(x) = f(x) + z,  z ~ N(0, σ²)

where f is the true fitness and z is an additive noise term. We are not really interested in whether the noise is Gaussian, but it is fundamental that the noise is zero-mean!!
Zero mean: Explicit Averaging
If the noise is zero-mean, the average over a certain number of samples gives a "good" estimate of the actual fitness value
Thus, the most classical approach is to compute the fitness of each solution a certain number of times (samples) and then take the average
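Explicit averaging can be sketched as follows (function names and the number of samples are illustrative): each solution is evaluated several times and the sample mean is used as the fitness estimate. With zero-mean noise, the standard error of the mean shrinks as σ/√n.

```python
import random

def noisy_fitness(x, sigma=0.5):
    # true fitness x² corrupted by zero-mean Gaussian noise
    return x * x + random.gauss(0.0, sigma)

def averaged_fitness(x, n_samples=30):
    # explicit averaging: re-sample the same solution and take the mean
    return sum(noisy_fitness(x) for _ in range(n_samples)) / n_samples
```

Note the cost implication: halving the estimation error requires four times as many fitness evaluations.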
Failing of deterministic algorithms
The noise introduces some "false optima" into the fitness landscape, so a method that employs implicit or explicit gradient information is likely to fail
The neighborhood cannot be properly estimated because the search is misled by the noise
Better success of EAs
Evolutionary algorithms, due to their inner structure, do not perform comparisons among neighbors and have thus been shown to perform better in noisy environments
Some recent papers in fact state that even rather standard EAs (e.g. self-adaptive ES) can lead to good results in noisy environments
Not universal success of EAs
This success is restricted to specific cases and strongly depends on the problem under examination
EAs, like all optimization algorithms, contain some comparisons among solutions in order to determine which one is better and which one is worse
In EAs this role is played by parent and survivor selection
Population based: Implicit Averaging
EAs are population-based algorithms, so another kind of averaging can be carried out
Many scientists have observed that a large population size is effective against noise, since it gives the algorithm a chance to evaluate several neighboring solutions and thus detect promising areas
Another kind of averaging
Explicit and Implicit Averaging both belong to the class of averaging over time
Branke proposed averaging over space: the fitness is calculated by averaging over the neighborhood of the point to be evaluated
Implicit assumptions: the noise in the neighborhood has the same characteristics as the noise at the point to be evaluated, and the fitness landscape is locally smooth. This is not always true!!! E.g. systems with unstable regions
High computational cost
Clearly, an averaging operation (especially over time) requires extra fitness evaluations and thus increases the computational overhead
In some cases, obtaining reliable results requires considerable computational effort
Adaptive Averaging
Example: the number of samples (explicit averaging) and the population size S_pop (implicit averaging) are adapted during the run on the basis of the best and average fitness values. [Adaptation formulas shown as equations on the slide.]
Prudent-daring survivor selection
If not all the individuals are re-sampled, two cooperative selection schemes can be applied:
- Prudent: selects individuals which are reliable (re-sampled) and fairly promising
- Daring: selects individuals which are unreliable (fitness calculated only once) but look very promising
This combines reliable solutions with computational saving
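A minimal sketch of the two cooperative schemes (the data layout and function names are assumptions, not the lecture's implementation): each individual carries a fitness estimate and a flag saying whether it has been re-sampled.

```python
def prudent_daring_selection(population, n_prudent, n_daring):
    # Prudent scheme: pick the best among reliable (re-sampled) individuals.
    reliable = [ind for ind in population if ind["reliable"]]
    # Daring scheme: pick the best among individuals evaluated only once.
    unreliable = [ind for ind in population if not ind["reliable"]]
    # Minimization is assumed: lower fitness is better.
    prudent = sorted(reliable, key=lambda i: i["fitness"])[:n_prudent]
    daring = sorted(unreliable, key=lambda i: i["fitness"])[:n_daring]
    return prudent + daring
```

The daring slots save the cost of re-sampling very promising newcomers, at the price of occasionally keeping an individual whose good fitness was just lucky noise.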
Adaptive Prudent Daring Evolutionary Algorithm
APDEA Results
Tolerance Interval 1/2
It is assumed that the noise is Gaussian and that its standard deviation has the same constant value over the whole decision space
Under this assumption, a tolerance interval can be constructed for the Gaussian distribution of the fitness estimates
Tolerance Interval 2/2
If solution A is better than B by a quantity equal to half of the width of the tolerance interval, it is surely better (with a certain confidence level)
If the difference in fitness is smaller, a re-sampling is required
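The comparison rule can be sketched as follows (parameter names and the confidence factor are illustrative): with Gaussian noise of known constant standard deviation sigma, the ordering of two estimates is trusted only when their difference exceeds half the tolerance-interval width; otherwise both solutions are re-sampled.

```python
def compare(fitness_a, fitness_b, sigma, k=1.96):
    # k ~ 1.96 corresponds to roughly 95% confidence for Gaussian noise
    half_width = k * sigma  # half the width of the tolerance interval
    if abs(fitness_a - fitness_b) > half_width:
        # difference exceeds the noise band: the ordering is trusted
        return "a" if fitness_a < fitness_b else "b"  # minimization assumed
    return "resample"  # too close to call: re-sample both solutions
```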
Adaptive Tolerant Evolutionary Algorithm
Comparison APDEA vs. ATEA
Comparative analysis
APDEA is more general since it requires only that the noise is zero-mean
APDEA performs better in terms of convergence speed
ATEA requires a preliminary analysis of the noise
APDEA requires a more extensive parameter setting