Uncertainty, Risk, and Information Value in Software Requirements and Architecture
DESCRIPTION
Slides from the ICSE 2014 presentation: E. Letier, D. Stefan, E. T. Barr, "Uncertainty, Risk, and Information Value in Software Requirements and Architecture", Proc. 36th International Conference on Software Engineering (ICSE 2014).

Uncertainty complicates early requirements and architecture decisions and may expose a software project to significant risk. Yet software architects lack support for evaluating uncertainty, its impact on risk, and the value of reducing uncertainty before making critical decisions. We propose to apply decision analysis and multi-objective optimisation techniques to provide such support. We present a systematic method allowing software architects to describe uncertainty about the impact of alternatives on stakeholders' goals; to calculate the consequences of uncertainty through Monte-Carlo simulation; to shortlist candidate architectures based on expected costs, benefits and risks; and to assess the value of obtaining additional information before deciding. We demonstrate our method on the design of a system for coordinating emergency response teams. Our approach highlights the need for requirements engineering and software cost estimation methods to disclose uncertainty instead of hiding it.

TRANSCRIPT
Uncertainty, Risk, and Information Value in Software Requirements and Architecture
Emmanuel Letier, David Stefan, Earl T. Barr (University College London, UK)
Hyderabad, 6 June 2014, ICSE 2014
Embrace uncertainty!
Software Design Decisions
Uncertainty is inevitable. We must decide without knowing everything.
What software to build? What functions? What quality level?
What architectural style? What components and interfaces? How to deploy them?
The Surfer’s Approach to Uncertainty
“Instead of learning to surf, conventional organizations try to control the waves. This almost never works.” — Allen Ward
Mary Poppendieck, “Learning to Surf”, industry keynote @ ICSE2013
The Scientific Approach to Uncertainty
Decision Analysis: a discipline for understanding, formalising, analysing, and communicating insights about situations in which important decisions must be made.
Ron Howard, Stanford
The Pseudo-Scientific Approach
Use formulae that resemble a scientific approach, except that
• the decision criteria are numbers without verifiable meaning
• the decision models are not falsifiable
• there is no retrospective evaluation of decisions and outcomes
The most widely used example is the Analytic Hierarchy Process (AHP).
What do we mean by uncertainty?
Uncertainty
Uncertainty is the lack of complete knowledge about a state or quantity: there is more than one possible value and the “true” value is not known.
Measurement of uncertainty: a set of possible values with a probability assigned to each.
[Figure: two example distributions. “Will I submit a paper to ICSE 2015?” (yes: 0.8, no: 0.2), and “How many papers will be submitted to ICSE 2015?” (a probability distribution over roughly 350 to 600)]
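As an illustration (a sketch, not part of the original slides), both examples can be written down and sampled as probability distributions; the probabilities come from the slide, while the distribution shape and its parameters are assumptions:

```python
# Sketch only: representing the slide's two examples of uncertainty.
import numpy as np

rng = np.random.default_rng(42)

# Discrete uncertainty: "Will I submit a paper to ICSE 2015?" (yes 0.8 / no 0.2)
submit = rng.choice(["yes", "no"], size=10_000, p=[0.8, 0.2])
print("P(yes) ≈", (submit == "yes").mean())

# Continuous uncertainty: "How many papers will be submitted to ICSE 2015?"
# The normal shape and its parameters are assumptions, chosen to span ~350-600.
n_papers = rng.normal(loc=475, scale=75, size=10_000)
print("90% interval ≈", np.percentile(n_papers, [5, 95]).round())
```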
Key observation
For any uncertain quantity, we always know something, even if our uncertainty is large.
How many auto rickshaw journeys in Hyderabad today? Somewhere between 100k and 10m.
Things Software Engineers Say ...
“Clients don’t know what they want.”
“Requirements change is inevitable.”
“It’s not possible to discover the true requirements before building the system.”
Things Academics Say ...
“Requirements are inherently unknowable!”
Linda Northrop, “Does Scale Really Matter? – Ultra-Large-Scale Systems Seven Years after the Study”, plenary keynote @ ICSE2013
What they mean ...
Requirements are uncertain.
We always know something about the requirements for a system, even if our uncertainty is large.
Software Design Decisions
Uncertainty is inevitable. We must decide without knowing everything.
What software to build? What functions? What quality level?
What architectural style? What components and interfaces? How to deploy them?
Our approach in 6 steps
1. Model the design decision problem
– Architecture decisions and their impact on software qualities
– Requirements decisions and their impact on business value
Our approach in 6 steps
1. Model the design decision problem
2. Define the decision risks
What do we mean by risk?
Risk is a state of uncertainty where some of the possibilities involve a loss, catastrophe, or other undesirable outcome.
Measurement of risk: a set of possibilities, each with quantified probabilities and quantified losses. (D. Hubbard)
Goal failure risk: the software not meeting a minimally acceptable level of goal satisfaction (e.g. unacceptable performance, reliability, availability, cost, etc.)
Project failure risk: the software not meeting the minimally acceptable level for at least one of its goals
In our approach
Paper novelty: first method to quantify goal failure risks and project failure risk to inform software design decisions
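A minimal Monte-Carlo sketch of these two risk measures, with assumed distributions and thresholds rather than the paper's case-study numbers:

```python
# Sketch only: goal failure risk and project failure risk by Monte-Carlo sampling.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Uncertain qualities of one candidate architecture (assumed distributions).
response_time = rng.lognormal(mean=np.log(0.8), sigma=0.4, size=N)  # seconds
reliability = rng.beta(a=50, b=2, size=N)                           # fraction

# Minimally acceptable levels (hypothetical requirements).
rt_goal_fails = response_time > 1.0   # goal: respond within 1 second
rel_goal_fails = reliability < 0.95   # goal: at least 95% reliability

print("P(response-time goal fails) ≈", rt_goal_fails.mean())
print("P(reliability goal fails)   ≈", rel_goal_fails.mean())

# Project failure: at least one goal misses its minimally acceptable level.
print("P(project failure) ≈", (rt_goal_fails | rel_goal_fails).mean())
```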
Our approach in 6 steps
1. Model the design decision problem
2. Define the decision risks
3. Elicit what decision makers already know (their prior probability distributions)
We rely on simple, effective methods to counter cognitive biases (overconfidence, anchoring, etc.)
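One simple elicitation technique, in the spirit of Hubbard's calibrated estimation (the lognormal fit below is our illustrative choice, not necessarily the paper's), is to ask an expert for a 90% confidence interval, such as the rickshaw question earlier, and fit a distribution to it:

```python
# Sketch only: turn an elicited 90% confidence interval into a prior.
import numpy as np

low, high = 100_000, 10_000_000  # expert's 90% CI (the rickshaw example)
z90 = 1.645                      # z-score of the 5th/95th percentiles

# Fit a lognormal whose 5th/95th percentiles match the elicited bounds.
mu = (np.log(low) + np.log(high)) / 2
sigma = (np.log(high) - np.log(low)) / (2 * z90)

rng = np.random.default_rng(1)
samples = rng.lognormal(mean=mu, sigma=sigma, size=100_000)
print("fitted 5th/95th percentiles ≈", np.percentile(samples, [5, 95]).round())
```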
Our approach in 6 steps
1. Model the design decision problem
2. Define the decision risks
3. Elicit what decision makers already know (their prior probability distributions)
4. Shortlist candidate architectures based on expected value, expected cost, and risks
[Figure: shortlisted candidates plotted by Expected Net Benefit (12 to 20) against Project Failure Risk (0.0 to 1.0)]
Shortlisting candidate architectures
• Monte-Carlo simulation to evaluate candidate architectures under uncertainty
• Multi-objective optimisation to find the shortlist (shortlist = set of Pareto-optimal candidates; see the sketch below)
• Paper novelties (for SBSE experts):
– Pareto strip = Pareto front with uncertainty margins
– Identification of closed and open decisions in the shortlist
Design space: ~7,000 candidates. Shortlist (in red): 10 candidates.
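A toy illustration of this step (not the paper's 'moda' implementation): score each hypothetical candidate by Monte-Carlo simulation, then keep the Pareto-optimal set over expected net benefit (maximise) and failure risk (minimise):

```python
# Sketch only: Monte-Carlo scoring plus a naive Pareto filter.
import numpy as np

rng = np.random.default_rng(2)
N = 10_000

# 100 hypothetical candidates, each with an uncertain net benefit model.
params = rng.uniform(low=[8.0, 1.0], high=[20.0, 6.0], size=(100, 2))

def evaluate(loc, scale):
    benefit = rng.normal(loc=loc, scale=scale, size=N)
    # Expected net benefit, and risk of missing a minimally acceptable benefit of 9.
    return benefit.mean(), (benefit < 9.0).mean()

scores = [evaluate(loc, scale) for loc, scale in params]

def pareto_front(scores):
    """Keep candidates not dominated on (benefit: higher, risk: lower)."""
    front = []
    for i, (b_i, r_i) in enumerate(scores):
        dominated = any(
            b_j >= b_i and r_j <= r_i and (b_j > b_i or r_j < r_i)
            for j, (b_j, r_j) in enumerate(scores) if j != i
        )
        if not dominated:
            front.append(i)
    return front

print("shortlist (Pareto-optimal candidate ids):", pareto_front(scores))
```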
Our approach in 6 steps
1. Model the design decision problem
2. Define the decision risks
3. Elicit what decision makers already know (their prior probability distributions)
4. Shortlist candidate architectures based on expected value, expected cost, and risks
5. Compute the expected value of information
Should we seek additional info about shortlisted candidates?
The Expected Value of Perfect Information (EVPI)
• Useful against the measurement inversion bias: measuring what we can be precise about rather than what is most valuable to the decision
• Paper novelty: computing the impact of perfect information on risk
EVPI(X) = the expected gain in business value ($) from obtaining perfect information about X to inform the decision (Ronald Howard, 1966)
How much would you be willing to pay for perfect information about X?
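A minimal worked EVPI example under invented payoffs, not the paper's case study: EVPI compares the expected value of choosing after X is revealed with the expected value of the best commitment made now.

```python
# Sketch only: EVPI(X) = E[max over designs | X known] - max over designs of E[value].
import numpy as np

rng = np.random.default_rng(3)
X = rng.lognormal(mean=np.log(500), sigma=0.5, size=100_000)  # uncertain workload

def value(design, x):
    # Hypothetical payoffs: design A is robust, design B wins at low workload.
    return 900 - 0.5 * x if design == "A" else 1100 - 1.0 * x

vals = np.stack([value("A", X), value("B", X)])  # shape (2, N)

best_without_info = vals.mean(axis=1).max()  # commit to one design now
best_with_info = vals.max(axis=0).mean()     # choose after learning X exactly
print("EVPI ≈", best_with_info - best_without_info)
```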
Our approach in 6 steps
1. Model the design decision problem
2. Define the decision risks
3. Elicit what decision makers already know (their prior probability distributions)
4. Shortlist candidate architectures based on expected value, expected cost, and risks
5. Compute the expected value of information
6. Seek additional info where valuable (creates posterior distributions)
Should we seek additional info about shortlisted candidates?
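As an illustrative sketch of this step (hypothetical numbers and a conjugate model of our choosing, not taken from the paper): new evidence turns an elicited prior into a posterior distribution.

```python
# Sketch only: a Beta-Binomial update for a component's failure probability.
from scipy import stats

prior = stats.beta(a=2, b=8)  # prior belief: failures are fairly rare

# New measurement campaign: 3 failures observed in 50 trial runs.
failures, trials = 3, 50
posterior = stats.beta(a=2 + failures, b=8 + trials - failures)

print("prior mean     ≈", prior.mean())
print("posterior mean ≈", posterior.mean())
print("posterior 90% interval ≈", posterior.ppf([0.05, 0.95]))
```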
Application to an example from ICSE’13
A mobile system for coordinating emergency rescue teams (Esfahani, Malek, & Razavi)
• Design space: 10 design decisions; around 7,000 candidate architectures
• Objectives: cost, response time, reliability, battery life, ...
• Models given by the design team: utility score defined as a weighted sum of objectives satisfaction (see the sketch after this list)
• Lessons learnt:
– Measuring risks leads to a different shortlist
– Need to reason about model uncertainty in addition to parameter uncertainty
– Decision models must be falsifiable
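A tiny sketch of such a weighted-sum utility model, with invented weights and satisfaction scores:

```python
# Sketch only: weighted-sum utility over normalised objective satisfaction scores.
weights = {"cost": 0.30, "response_time": 0.30, "reliability": 0.25, "battery_life": 0.15}
scores = {"cost": 0.7, "response_time": 0.9, "reliability": 0.6, "battery_life": 0.8}

utility = sum(weights[k] * scores[k] for k in weights)
print("utility ≈", utility)  # 0.3*0.7 + 0.3*0.9 + 0.25*0.6 + 0.15*0.8 = 0.75
```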
Conclusion
Research Roadmap
[Roadmap diagram: Scientific Approach to Software Decisions]
• Parameter uncertainty
• Model uncertainty: quantifying “good enough”
• Incremental value delivery
• Incremental evidence-based model tuning
• Showing applicability
• Showing cost-effectiveness
• Overcoming cultural barriers
• ???
Please, help us!
We need more research on Software Decision Analysis
– Our ‘moda’ R package is available to facilitate uptake
– Related work: CBAM @ ICSE’01, ICSE’03; GuideArch @ ICSE’13; Fenton et al. @ ICSE’04
We need industry partners willing to try a scientific approach to decision making
– portfolio decisions
– release planning
– architecture decisions
– risk-based testing
– process decisions
A Call to Action
Uncertainty will be at the heart of many important decisions for the 21st century.
Who do you want to inform critical IT decisions: the Surfers, the Scientists, or the Pseudo-Scientists?