Analysis of Simulation Results
Chapter 25
Overview
Analysis of Simulation Results
Model Verification Techniques
Model Validation Techniques
Transient Removal
Terminating Simulations
Stopping Criteria: Variance Estimation
Variance Reduction
Model Verification vs. Validation
Model Verification Techniques
Would like the model output to be close to that of the real system. We made assumptions about the behavior of the real system.
1st step: test whether the assumptions are reasonable. This is validation, or checking the representativeness of the assumptions.
2nd step: test whether the model implements those assumptions. This is verification, or checking correctness.
The two steps are distinct, not mutually exclusive: a model can correctly implement invalid assumptions, or incorrectly implement valid ones.
Ex: what was your project 1?
Always assume that your assumption is invalid.
– Robert F. Tatman
Top Down Modular Design
Antibugging
Antibugging consists of including additional checks and outputs in the program that will point out the bugs, if any.
For example, the model counts the number of packets sent by the source nodes as well as the number received by the destination nodes. Check that the totals agree (e.g., packets sent = packets received + packets lost). If not, the simulation can halt or warn.
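The counter check just described can be sketched in code. This is a minimal illustration, not from the chapter; the class name, the counter fields, and the conservation rule (sent = received + dropped) are assumptions made for the sketch.

```python
# Antibugging sketch: keep redundant counters during the run and verify
# an invariant that must hold if the model is correct.
# The counters and the invariant here are illustrative assumptions.

class PacketCounters:
    def __init__(self):
        self.sent = 0       # incremented by every source node
        self.received = 0   # incremented by every destination node
        self.dropped = 0    # incremented whenever a packet is lost

    def check(self):
        # Conservation invariant: every packet sent must be accounted for.
        # A violation means a bug; halt (raise) or warn here.
        if self.sent != self.received + self.dropped:
            raise RuntimeError(
                f"antibugging check failed: sent={self.sent}, "
                f"received={self.received}, dropped={self.dropped}")

counters = PacketCounters()
for _ in range(100):        # stand-in for the simulation's event loop
    counters.sent += 1
    counters.received += 1
counters.check()            # passes; a miscounted packet would raise
```

Calling `check()` periodically (not just at the end) catches the bug close to the event that introduced it.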
Structured Walk-Through
Structured walk-through consists of explaining the code to another person or a group. The code developer explains what each line of code does.
Deterministic Models
The key problem in debugging simulation models is the randomness of variables: a deterministic program is easier to debug than a program with random variables.
By specifying constant (deterministic) distributions, the user can predict the output variables exactly and thus debug the modules.
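One way to apply this, sketched below, is to route every random draw through a single factory that can be switched to a constant; the function name, the trivial model, and the numbers are assumptions for illustration.

```python
import random

# Debugging sketch: route every random quantity through one factory so
# the whole model can be flipped to deterministic values at once.
# make_sampler and the example numbers are illustrative assumptions.

def make_sampler(mean, deterministic=False):
    if deterministic:
        return lambda: mean                        # constant "distribution"
    return lambda: random.expovariate(1.0 / mean)  # exponential otherwise

def total_busy_time(n_jobs, service_sampler):
    # Trivial stand-in for a model module: sum of n service times.
    return sum(service_sampler() for _ in range(n_jobs))

# With deterministic=True the output is exactly predictable by hand
# (10 jobs x 2.0 each = 20.0), so the module can be checked directly
# before the random distributions are switched back on.
busy = total_busy_time(10, make_sampler(2.0, deterministic=True))
```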
Run Simplified Cases
The model may be run with simple cases:
Only one packet
Only one source
Only one intermediate node
Of course, a model that works for simple cases is not guaranteed to work for more complex cases. Therefore, the test cases should be as complex as can be easily analyzed without simulation.
Trace
On-Line Graphic Displays
Continuity Tests
A slight change in input should yield only a slight change in output; otherwise there is a bug in the simulation.
[Figure: throughput vs. input for a debugged simulation (smooth curve) and an undebugged simulation (discontinuous jumps)]
Degeneracy tests
• Try extreme configurations and workloads, since the extremes (lowest and highest) may reveal bugs. Ex: one CPU, zero disks.
Consistency tests – similar inputs should produce similar outputs. Ex: 2 sources at 50 pkts/sec should produce the same total as 1 source at 100 pkts/sec.
Seed independence – the random number generator's starting value should not affect the final conclusion (individual outputs may differ, but not the overall conclusion).
More Model Verification Techniques
Ensure the assumptions used are reasonable: we want the final simulated system to be like the real system.
Unlike verification, the techniques used to validate one model may differ from those for another.
Three key aspects to validate:
1. Assumptions
2. Input parameter values and distributions
3. Output values and conclusions
Compare the validity of each to one or more of:
1. Expert intuition
2. Real system measurements
3. Theoretical results
Model Validation Techniques
This gives 3 × 3 = 9 combinations; not all are always possible, however.
Model Validation Techniques - Expert Intuition
• Most practical, most common
• “Brainstorm” with people knowledgeable in the area
• Present measured results and compare to simulated results (see if the experts can tell the difference)
[Figure: throughput vs. packet loss probability (0.2, 0.4, 0.8) for several design alternatives]
Which alternative looks invalid? Why?
Model Validation Techniques - Real System Measurements
Most reliable and preferred. May be infeasible because the system does not exist or is too expensive to measure. That could be why we are simulating in the first place!
But even one or two measurements add an enormous amount to the validity of the simulation.
Should compare input values, output values, and workload characterization. Use multiple traces for trace-driven simulations.
Can use statistical techniques (confidence intervals) to determine whether the simulated values differ from the measured values.
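Such a statistical comparison can be sketched as below: build a confidence interval on the paired differences between simulated and measured values and check whether it includes zero. The data are made up for illustration, and the normal quantile is an approximation; for samples this small a t quantile would be more appropriate.

```python
from statistics import mean, stdev, NormalDist

# Sketch: confidence interval on paired simulated-minus-measured
# differences. If zero lies inside the interval, the simulation is not
# significantly different from the measurements at this level.
# The data below are illustrative assumptions, not real measurements.

simulated = [4.1, 3.9, 4.3, 4.0, 4.2, 3.8, 4.1, 4.0]
measured  = [4.0, 4.1, 4.2, 3.9, 4.3, 3.9, 4.0, 4.1]

diffs = [s - m for s, m in zip(simulated, measured)]
n = len(diffs)
# Approximate 95% interval using the normal quantile (1.96);
# a Student-t quantile would widen it appropriately for small n.
half_width = NormalDist().inv_cdf(0.975) * stdev(diffs) / n ** 0.5
lo, hi = mean(diffs) - half_width, mean(diffs) + half_width

consistent = lo <= 0.0 <= hi   # True here: no significant difference
```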
Three pairwise comparisons:
Measurement and simulation
Analytical model and simulation
Analytical model and measurement
Theoretical Results
If analysis and simulation agree, that does not prove either correct: the agreement is also used to validate the analysis, and both may be invalid.
Use theory in conjunction with experts' intuition. E.g., apply theory to a large configuration.
Validation can only show that the model is not invalid.
Exercise: Imagine that you have been called as an expert to review a simulation study. Which of the following simulation results would you consider non-intuitive and would want carefully validated?
1. The throughput of a system increases as its load increases.
2. The throughput of a system decreases as its load increases.
3. The response time increases as the load increases.
4. The response time of a system decreases as its load increases.
5. The loss rate of a system decreases as the load increases.
Outline
Introduction
Common Mistakes in Simulation
Terminology
Selecting a Simulation Language
Types of Simulations
Verification and Validation
Transient Removal
Termination
Transient Removal
Most simulations only want steady-state results, so remove the initial transient state.
Trouble is, it is not possible to define exactly what constitutes the end of the transient state.
Use heuristics:
Long runs
Proper initialization
Truncation
Initial data deletion
Moving average of replications
Batch means
Long Runs
Use very long runs: the effects of the transient state will be amortized.
But … this wastes resources, and it is tough to choose how long is “enough”.
Recommendation … don’t use long runs alone.
Proper Initialization
Start the simulation in a state close to the expected steady state. Ex: a CPU scheduler may start with some jobs already in the queue.
Determine starting conditions from previous simulations or simple analysis.
May result in decreased run length, but still may not provide confidence that the simulation is in a stable condition.
Truncation
• Assume variability during steady state is less than during the transient state
• Variability measured in terms of range – (min, max)
• If a trajectory of the range stabilizes, then assume the simulation is in the stable state
• Method:
– Given n observations {x1, x2, …, xn}, ignore the first l observations
– Calculate the (min, max) of the remaining n−l
– Repeat for l = 1, …, n
– Stop when the (l+1)th observation is neither the min nor the max
(Example next)
Truncation Example
• Sequence: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 10, 9, 10, 11, 10, 9, …
• Ignore the first (l=1): range is (2, 11) and the 2nd observation (l+1) is the min
• Ignore the second (l=2): range is (3, 11) and the 3rd observation (l+1) is the min
• Finally, at l=9 the range is (9, 11) and the 10th observation is neither min nor max
• So, discard the first 9 observations
[Figure: observations vs. observation number, with the transient interval marked]
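The heuristic as stated can be sketched directly (the function name is illustrative); on the sequence above it discards the first 9 observations:

```python
def truncation_transient(xs):
    # Truncation heuristic: for l = 1, 2, ..., ignore the first l
    # observations and stop at the first l for which the (l+1)th
    # observation is strictly between the min and max of the remainder.
    for l in range(1, len(xs)):
        rest = xs[l:]                  # observations l+1 .. n
        if min(rest) < xs[l] < max(rest):
            return l                   # discard the first l observations
    return len(xs)                     # no steady state detected

seq = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 10, 9, 10, 11, 10, 9]
transient = truncation_transient(seq)  # 9, as in the example above
```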
Truncation Example 2 (1 of 2)
• Find the duration of the transient interval for: 11, 4, 2, 6, 5, 7, 10, 9, 10, 9, 10, 9, 10
Truncation Example 2 (2 of 2)
• Sequence: 11, 4, 2, 6, 5, 7, 10, 9, 10, 9, 10, 9, 10
• When l=3, the range is (5, 10) and the 4th observation (6) is neither min nor max
• So, only 3 observations are discarded instead of the 6 that actually form the transient
[Figure: observations vs. observation number, with the “real” transient (6 observations) and the assumed transient (3 observations) marked]
Replication
Repeat the simulation with different seeds; runs with the same seed are identical, not replications.
Average across the replications to smooth out the results.
Initial Data Deletion
Study the average after some initial observations are deleted from the sample. If the average does not change much, we must be deleting from the steady state.
However, since randomness can cause some fluctuations during steady state, multiple runs (with different seeds) are needed.
Given m replications of size n each, let xij be the jth observation of the ith replication. Note that j varies along the time axis and i varies across replications.
Initial Data Deletion (cont)
[Figures: (1) individual replications, xij vs. j; (2) mean across replications, x̄j vs. j; (3) mean of the last n−l observations, x̄l vs. l; (4) relative change (x̄l − x̄)/x̄ vs. l, whose knee marks the end of the transient interval]
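The relative-change curve just described can be sketched as follows (the function name and the sample data are illustrative assumptions); the curve rises from zero and flattens once the deletion length l passes the transient:

```python
def relative_change_curve(replications):
    # Initial data deletion: mean across replications at each time j,
    # then for each deletion length l the mean of the last n-l values
    # and its relative change from the overall mean. The knee of the
    # resulting curve marks the end of the transient interval.
    m = len(replications)
    n = len(replications[0])
    xbar_j = [sum(rep[j] for rep in replications) / m for j in range(n)]
    overall = sum(xbar_j) / n
    curve = []
    for l in range(n - 1):
        tail_mean = sum(xbar_j[l:]) / (n - l)
        curve.append((tail_mean - overall) / overall)
    return curve

# Three illustrative replications, each with a low-valued transient of
# roughly three observations at the start of the run.
reps = [[0.1, 0.5, 0.9, 1.0, 1.1, 0.9, 1.0, 1.1, 0.9, 1.0],
        [0.2, 0.4, 0.8, 1.1, 0.9, 1.0, 1.1, 0.9, 1.0, 1.1],
        [0.0, 0.6, 1.0, 0.9, 1.0, 1.1, 0.9, 1.0, 1.1, 0.9]]
curve = relative_change_curve(reps)  # rises, then flattens after l = 3
```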
Batch Means
Run a long simulation and divide it into equal-duration parts.
Part = batch = sub-sample.
Study the variance of the batch means as a function of the batch size.
[Figure: responses vs. observation number, divided into batches at n, 2n, 3n, 4n, 5n]
Batch Means (cont)
Plot the variance of the batch means against the batch size n.
Ignore peaks followed by an upswing.
[Figure: variance of batch means vs. batch size n, with the transient interval marked and a later peak labeled “(Ignore)”]
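The batch-means computation can be sketched as below (the function name and the synthetic run are assumptions); for each candidate batch size it returns the variance of the batch means, which would then be plotted against n to find the peak:

```python
import random

def batch_means_variance(xs, batch_size):
    # Split the run into consecutive full batches of equal size, take
    # each batch mean, and return the sample variance of those means.
    means = [sum(xs[i:i + batch_size]) / batch_size
             for i in range(0, len(xs) - batch_size + 1, batch_size)]
    mu = sum(means) / len(means)
    return sum((m - mu) ** 2 for m in means) / (len(means) - 1)

# Tabulate the variance for increasing batch sizes; the peak of this
# curve estimates the transient interval (ignoring peaks followed by
# an upswing, as noted above). The run here is a synthetic stand-in
# for real simulation output.
random.seed(1)
run = [random.gauss(10, 1) for _ in range(1024)]
curve = {n: batch_means_variance(run, n) for n in (2, 4, 8, 16, 32)}
```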