
How Can Multicriteria Decision Making (MCDM) Help in Integrated Resource Planning?

Lessons from Practice
Benjamin F. Hobbs, [email protected]

Dept. of Geography & Environmental Engineering, Whiting School of Engineering, The Johns Hopkins University

Part 1: Workshop on Multicriteria Methods in IRP
Dublin, Eire, 1 Feb. 2005

Sources: Hobbs & Meier, Energy Decisions & The Environment: A Guide to the Use of Multicriteria Methods, Kluwer, 2000; IEEE Trans. on Power Systems, 9(4), Nov. 1994

Outline, Part 1: General Lessons
• Products:
– Tradeoff descriptions
– Quantified & documented valuations
• MCDM complements economics
• Lessons from practice:
1. MCDM can help build consensus
2. Focus on simplicity, clarity, & user control
– MCDM methods can be biased, & different people like different approaches
3. Avoid oversimplistic analysis
4. Use >1 method: greater confidence, validity

Part 2: Detailed Case Studies

Thumbnail Sketch
• The Multicriteria Problem:
– Many objectives ...
– Many alternatives ...
– Uncertainty ...
– Disagreeing interests ...
• MCDM Analysis: formal comparison of alternatives considering multiple criteria:
– Information (tradeoffs, risks) display
– Value judgments (acceptable tradeoffs, risks)
– Computation
– Negotiation

MCDM can facilitate understanding and negotiation

Tradeoffs & Uncertainties
[Figure: two plots of $ Cost vs. Impact for competing alternatives, contrasting point-estimate tradeoffs with tradeoffs under uncertainty]

Display Approaches: Tradeoff Displays
• Tables of Numbers
• Consumers' Reports Tables
• X-Y Plots
• Value Paths

Challenges:
• Finding efficient ("Pareto") alternatives (especially for large policy models); see the sketch below
• Displaying tradeoffs in many dimensions
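As a rough illustration of the "finding efficient alternatives" step, here is a minimal Python sketch (not from the slides; the plan names and scores are made up) that screens out dominated alternatives when every criterion is to be minimized:

```python
# Minimal sketch: keep only Pareto-efficient alternatives when lower cost and
# lower impact are both preferred. All criteria values are to be minimized.

def pareto_efficient(alternatives):
    """Return names of alternatives that no other alternative dominates."""
    efficient = []
    for name_a, crit_a in alternatives:
        dominated = any(
            all(cb <= ca for ca, cb in zip(crit_a, crit_b))
            and any(cb < ca for ca, cb in zip(crit_a, crit_b))
            for name_b, crit_b in alternatives
        )
        if not dominated:
            efficient.append(name_a)
    return efficient

# Hypothetical resource plans scored on ($ cost, CO2 emissions):
plans = [("A", (100, 50)), ("B", (90, 60)), ("C", (95, 65))]
print(pareto_efficient(plans))  # C is dominated by B -> ['A', 'B']
```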

Example Display: Impact Statement for the Puget Sound (Seattle, WA, USA) Electric Reliability Plan
Challenges:
• Many alternatives, criteria
• Qualitative

Example Cartesian Plot Display: Seattle City Light Cost vs. CO2 Tradeoffs
Challenge:
• Can't handle more than 2 or 3 criteria

Simple Example of Quantitative Value Judgments
The 3 steps of Additive Value Functions ("Rating & Weighting"):
1. Value Scaling: rescale each criterion onto a 0-1 value function, e.g., v(O3)
2. Weighting: e.g., wCost = 70, wJobs = 90, wO3 = 20
3. Amalgamation: Overall Value = 70*v(Cost) + 90*v(Jobs) + 20*v(O3) + ...
(a small sketch of the three steps follows)
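A minimal Python sketch of those three steps, using the illustrative weights from the slide; the raw scores and ranges are made up, and linear value functions are assumed:

```python
# Minimal sketch of "Rating & Weighting": value scaling, weighting, amalgamation.

def linear_value(x, worst, best):
    """Scale a raw criterion score onto [0, 1], where 1 is the best outcome."""
    return (x - worst) / (best - worst)

# Step 2: weights as elicited (e.g., by point allocation or swing weighting).
weights = {"cost": 70, "jobs": 90, "o3": 20}

# Hypothetical raw scores and (worst, best) ranges for one resource plan.
scores = {"cost": 120.0, "jobs": 450, "o3": 0.08}
ranges = {"cost": (200.0, 80.0), "jobs": (0, 600), "o3": (0.12, 0.05)}

# Step 1 + Step 3: scale each criterion, then take the weighted sum.
overall = sum(w * linear_value(scores[c], *ranges[c]) for c, w in weights.items())
print(round(overall, 1))
```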

Weight Selection Examples
• 1995 IRP
– 12 criteria using both nonhierarchical & hierarchical Point Allocation (100 points)
• 1995 DSM Plan
– 9 criteria using Ratio questioning/Swing weighting ("which criterion would you rather swing from its worst to its best value?"); see the sketch below
– Tradeoff weighting
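A minimal sketch of swing weighting (the numbers are hypothetical): the respondent scores each criterion's worst-to-best swing relative to the most valued swing, which is set to 100, and weights follow by normalization:

```python
# Minimal sketch of swing weighting with hypothetical elicited scores:
# the most-valued worst-to-best swing gets 100; others are scored relative to it.

swing_scores = {"cost": 100, "jobs": 60, "o3": 25}   # elicited judgments
total = sum(swing_scores.values())
weights = {c: s / total for c, s in swing_scores.items()}
print({c: round(w, 2) for c, w in weights.items()})  # {'cost': 0.54, 'jobs': 0.32, 'o3': 0.14}
```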

Weighting Exercise

Weighting Exercise 2

Examples of Amalgamation: Additive Value Functions for Environmental Impact
• Montana Power Company
– "Resource Environmental Assessment Matrix": 16 criteria (each on a 0-4 scale) weighted and summed; used for resource comparison in IRP
– "Environmental Performance Index": measure of performance towards four environmental goals (compliance, releases, resource consumption, remediation); 25 criteria (each on a [-1,0] or [-1,1] scale) weighted and summed

Sample Valuation Approaches: Tradeoff Valuation
• "American" Approach (Value Functions): Simple Rating & Weighting, Tradeoff-based Weighting, Nonlinear Value Functions
• "European" Approach (Pairwise Comparisons): ELECTRE, Analytic Hierarchy Process
• Goal Programming (see the sketch below)

Challenges:
• Promoting learning and confidence (when people are unsure of what they want)
• Reliably eliciting preferences (when people do know what they want)
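Goal programming, one of the approaches listed above, scores alternatives by their weighted deviations from stated goals rather than by a value function. A minimal sketch, with hypothetical goals, penalty weights, and plans:

```python
# Minimal sketch of weighted goal programming: pick the alternative whose
# criteria overshoot the stated goals by the least, in weighted terms.

goals   = {"cost": 90.0, "co2": 40.0}        # aspiration levels (stay below)
weights = {"cost": 1.0,  "co2": 2.0}         # penalty per unit of overshoot

plans = {"A": {"cost": 100.0, "co2": 50.0},
         "B": {"cost": 95.0,  "co2": 38.0}}

def goal_penalty(scores):
    # Only overshooting a goal is penalized ("one-sided" deviations).
    return sum(w * max(0.0, scores[c] - goals[c]) for c, w in weights.items())

best = min(plans, key=lambda p: goal_penalty(plans[p]))
print(best)  # B (penalty 5.0) beats A (penalty 30.0)
```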

Why Not Monetize?
Advantages of Monetization:
– Well-developed theory of social preference
– Refined methods that, in theory, can be validated & repeated
– Considers preferences of entire population
– Promotes cost-effective mitigation by imposing consistency across jurisdictions

Disadvantages of Monetization
• Many impacts difficult to monetize
• Monetization hides assumptions & fundamental nature of process (negotiation among interests); puts value judgments in hands of analysts
• Premises of Benefit/Cost analysis questioned:
– distribution unimportant?
– only what people want matters?

“Policy makers and regulators should not use monetary values to disguise issues, either from themselves or from the public. When they face difficult moral choices, they should not pretend they are making objective financial decisions or misuse or manipulate monetary values to justify them.”

(D. Dodds and J. Lesser, "Monetization and Quantification of Environmental Impacts," Washington State Energy Office, 1992)

MCDM Analysis: A Complement to Monetization
Quantitative consideration of multiple criteria & risks can help in several ways:
– Display tradeoffs (understand problem)
– Make value judgments easier, more consistent (psychologists' findings)
– Quickly screen "losers," highlight possible "winners"
– Communicate priorities of interests, aid search for consensus
– Document assumptions, results; facilitate sensitivity analysis

Monetization vs. MCDM Recommendations

Emphasize monetization when:
– environmental costs are mainly internal
– defensible damage estimates exist
– uniformity among jurisdictions is desired
But still show tradeoffs, sensitivity to values

Emphasize multicriteria analysis if:
– fundamental value conflicts exist among groups
– public decision problems arise in unique circumstances
– $ estimates are problematic
But use $ estimates as "reality" checks

Dangers of Careless MCDM Analysis
• Overemphasize the quantifiable
• Unrepresentative
• Oversimplify, distort tradeoffs & preferences
• Opaqueness, jargon, loss of insight
• Doesn't recognize that people are not sure what they want
• Too explicit for the political process

4 Lessons From Practice
1. MCDM can help build consensus
• Focus on basic objectives helps bargaining, compromise
• E.g., US Bureau of Reclamation Central AZ Water Control Study (C. Brown, Water Resources Bulletin, 1984)

Congressman Mo Udall: “The Orme Dam was a critically important issue to Arizona. But we finally ended up, to my utter amazement, with the whole Arizona establishment agreeing we really didn’t want the dam”

Jackson Lake Dam Safety Study (C. Brown, in Managing Water Related Conflicts: The Engineer's Role, ASCE, 1989)

Consensus achieved: 7 of 8 groups rated reconstruction best of 7 options under 14 criteria

Holistic (unaided) evaluations greatly disagree

Rating & weighting: balanced consideration yields more agreement, clarification of remaining disagreements

[Figure: ranks of alternatives, from best to worst, as evaluated by each group]

Lessons, Cont.
2. Focus on clarity, openness, user control: users distrust black boxes, hurried processes
BC Gas resource ranking successful because:
• Sufficient resources & time
• Responsive to stakeholder questions, concerns
• Focus was on objectives
• Methods were simple, and link between value judgments and recommendations was clear
(Hobbs & Horn, Energy Policy, 1997)

Einstein: “Make things as simple as possible, but no simpler”

Second Weighting Exercise

Lessons Learned, Cont.
Results from experiments in realistic contexts (Hobbs & Meier, Energy Decisions & the Environment: A Guide to the Use of Multicriteria Methods, Kluwer, 2000; Hobbs et al., Water Resources Research, 1992)
• People prefer different methods
– 6 of 11 participants in the Seattle City Light study preferred Goal Programming, but most Army Corps of Engineers planners preferred value functions
• Value elicitation methods can be biased, distort preferences, and give different results
– Managers, stakeholders exhibit classic biases
– How you value can matter as much as who values
– USEPA Climate Policy Workshop participants: all cost weights based on $/ton were higher than all weights based on a 0-100 rating of a 10% change in objectives (p < 0.03)

What are Valid Weights?

• Weights should be ratio scaled
• Weights should be based on willingness to trade off criteria
• Weights should therefore be sensitive to the ranges of the criteria (illustrated in the sketch below)
• Experiments at Seattle City Light, Centerior Energy, the US Army Corps of Engineers, and elsewhere show that no one method for choosing weights is valid and appropriate for everyone
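To make "willingness to trade off over a range" concrete, here is a minimal sketch (all numbers hypothetical) of tradeoff-based weighting with linear 0-1 value functions: a stated dollar value per ton of emissions pins down the weight ratio, and widening a criterion's range raises its weight:

```python
# Minimal sketch: with linear 0-1 value functions, the stated willingness to
# trade dollars for emissions fixes the weight ratio, which therefore depends
# on the ranges over which each criterion is scaled.

cost_range = 50.0     # $M spanned by the alternatives (worst - best)
co2_range = 20.0      # ktons of CO2 spanned by the alternatives
value_of_co2 = 2.0    # stated tradeoff: $M of extra cost accepted per kton avoided

# Indifference between small changes in cost and CO2 implies
#   w_co2 / w_cost = value_of_co2 * co2_range / cost_range
ratio = value_of_co2 * co2_range / cost_range
w_cost = 1.0 / (1.0 + ratio)
w_co2 = ratio / (1.0 + ratio)
print(round(w_cost, 2), round(w_co2, 2))  # 0.56 0.44; doubling co2_range would raise w_co2
```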

E.g., Weights Should Not Depend on Hierarchy

[Figure: mean weight on criterion 12 was 5.8% under one elicitation structure vs. 16% under the other]

E.g., Point Allocation Lacks Validity

• Criteria weights should not depend on the hierarchy's structure
• Nonhierarchical weights are "flatter" than hierarchical weights. Why? We tend to give somewhat the same weight to all criteria in a group
• Hierarchical weights are smaller for criteria belonging to groups with many criteria (see the sketch below)
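A small sketch of why the two structures diverge (allocations are hypothetical): in hierarchical point allocation the weight of a bottom-level criterion is the product of its group weight and its within-group weight, so criteria in large groups get squeezed, while nonhierarchical allocations tend to spread points more evenly:

```python
# Minimal sketch: hierarchical vs. nonhierarchical point allocation over
# 5 criteria in two groups ("econ" with 2 criteria, "env" with 3).

# Hierarchical: allocate weight across groups, then within each group;
# a bottom-level weight is the product of the two.
group_weights = {"econ": 0.5, "env": 0.5}
within_weights = {"econ": {"cost": 0.6, "jobs": 0.4},
                  "env": {"co2": 0.4, "o3": 0.35, "land": 0.25}}
hierarchical = {c: group_weights[g] * w
                for g, members in within_weights.items()
                for c, w in members.items()}
# -> cost 0.30, jobs 0.20, co2 0.20, o3 0.175, land 0.125
#    (the three environmental criteria must split their group's 0.5)

# Nonhierarchical: points allocated directly across all 5 criteria tend to
# come out "flatter", e.g. something like the allocation below.
nonhierarchical = {"cost": 0.25, "jobs": 0.20, "co2": 0.20, "o3": 0.20, "land": 0.15}

print(hierarchical)
print(nonhierarchical)
```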

Lessons Learned, Cont.

Experimental results imply lessons 3 & 4:

3. Avoid simplistic valuations that duck “hard questions”

Tradeoffs are hard to make: if a method is easy, you won’t think & learn

“0-100” scales of “importance” yield weights with predictable biases

Time required: 3 hours (Cleveland Hazmat Routing study) to 4 days (BC Gas) for informed groups

Lessons Learned, Cont.

4. Use two or more valuation methods, and resolve inconsistencies:

• Makes you think
• Builds confidence in the results
• Is recommended by managers & stakeholders

Conclusion, Part 1

Multicriteria methods can help IRP by:
• Focusing discussion on fundamental objectives and tradeoffs, which facilitates negotiation and compromise
• Providing a systematic way to explore and express values, which promotes consistency, confidence, & full consideration of all impacts

Lessons:
• Clarity & user control are crucial
• Don't oversimplify
• Use more than one method