
Page 1: INTARESE Uncertainty Training 17-18 Oct 2007 Knowledge Quality Assessment an introduction

Universiteit Utrecht / Copernicus Institute

Knowledge Quality Assessment: an introduction

Dr. Jeroen van der Sluijs
Copernicus Institute for Sustainable Development and Innovation, Utrecht University
&
Centre d'Economie et d'Ethique pour l'Environnement et le Développement, Université de Versailles Saint-Quentin-en-Yvelines, France

Page 2

Haugastøl group:
Jeroen van der Sluijs; Ragnar Fjelland; Jerome Ravetz; Anne Ingeborg Myhr; Roger Strand; Silvio Funtowicz; Kamilla Kjølberg; Kjellrun Hiis Hauge; Bruna De Marchi; Andrea Saltelli

Page 3

Complex, uncertain risks. Typical characteristics (Funtowicz & Ravetz):
• Decisions need to be made before conclusive scientific evidence is available
• Potential impacts of 'wrong' decisions can be huge
• Values are in dispute
• The knowledge base is characterized by large (partly irreducible, largely unquantifiable) uncertainties, multi-causality, knowledge gaps, and imperfect understanding
• More research does not necessarily mean less uncertainty; it often reveals unforeseen complexities!
• Assessment dominated by models, scenarios, assumptions, extrapolations
• Many (hidden) value loadings reside in problem frames, indicators chosen, and assumptions made

Page 4

Model structure uncertainty...

Five consultants, each using a different model, were given the same question: "Which parts of this particular area are most vulnerable to pollution and need to be protected?"

(Refsgaard et al., 2006)

Page 5

Three paradigms of uncertain risks

'Deficit view'
• Uncertainty is provisional
• Reduce uncertainty; make ever more complex models
• Tools: quantification, Monte Carlo, Bayesian belief networks

'Evidence evaluation view'
• Comparative evaluation of research results
• Tools: scientific consensus building; multidisciplinary expert panels
• Focus on robust findings

'Complex systems / post-normal view'
• Uncertainty is intrinsic to complex systems
• Uncertainty can be a result of the production of knowledge
• Acknowledge that not all uncertainties can be quantified
• Openly deal with deeper dimensions of uncertainty (problem framing, indeterminacy, ignorance, assumptions, value loadings, institutional dimensions)
• Tools: Knowledge Quality Assessment; deliberative, negotiated management of risk

Page 6

Practical / technical problems

- Practical problems: problems for which the solution consists of the achievement of human purposes.
- Technical problems: problems defined in terms of the function to be performed.

In modern societies, practical problems are reduced to sets of technical problems.

(Ravetz, 1971: Scientific Knowledge and its Social Problems)

Page 7

Uncertainty in knowledge based society: the problems

Keepin & Wynne, 1984:

“Despite the appearance of analytical rigour, IIASA’s widely acclaimed global energy projections are highly unstable and based on informal guesswork. This results from inadequate peer review and quality control, raising questions about political bias in scientific analysis.”

Page 8

Crossing the disciplinary boundaries

Once environmental numbers are thrown over the disciplinary fence, important caveats tend to be ignored, uncertainties compressed, and numbers used at face value.

e.g. climate sensitivity: 1.5-4.5 °C ?! (see Van der Sluijs, Wynne, Shackley, 1998)

Resulting misconception: worst case = 4.5 °C

Page 9

The certainty trough (MacKenzie, 1990)

Page 10

Insights on uncertainty
• More research tends to increase uncertainty
– it reveals unforeseen complexities
– complex systems exhibit irreducible uncertainty (intrinsically or practically)
• Omitting uncertainty management can lead to scandals, crises, and loss of trust in science and institutions
• In many complex problems, unquantifiable uncertainties dominate the quantifiable uncertainty
• High quality is not the same as low uncertainty
• Quality relates to fitness for function (robustness, PP)
• A shift in focus is needed, from reducing uncertainty towards reflective methods to explicitly cope with uncertainty and quality

Page 11

Clark & Majone 1985

Critical Appraisal of Scientific Inquiries with Policy Implications

1. Criticism by whom? Critical roles:
• Scientist
• Peer group
• Program manager or sponsor
• Policy maker
• Public interest groups

Page 12

Clark & Majone 1985

2. Criticism of what? Critical modes:
• Input
– data, methods, people, competence, (im)maturity of the field
• Output
– problem solved? hypothesis tested?
• Process
– good scientific practice, procedures for review, documentation, etc.

Page 13

(Clark & Majone, 1985)

Page 14

Clark & Majone 1985. Meta quality criteria:
• Adequacy
– reliability, reproducibility, uncertainty analysis, etc.
• Value
– internal: how well is the study carried out?
– external: fitness for purpose, fitness for function
– personal: subjectivity, preferences, choices, assumptions, bias
• Effectiveness
– does it help to solve practical problems?
• Legitimacy
– numinous: natural authority, independence, credibility, competence
– civil: agreed procedures

Page 15

KQA tools
• Quantitative methods
– SA/UA, Monte Carlo
• Uncertainty typology (matrix)
• Quality assessment
– Pedigree analysis (NUSAP)
– Assumption analysis
– Model Quality Checklist
– MNP Uncertainty Guidance
– Extended Peer Review
– Argumentative Discourse Analysis (ADA); Critical Discourse Analysis (CDA)
– ...

Page 16

NL Environmental Assessment Agency (RIVM/MNP) Guidance: Systematic reflection on uncertainty & quality in:

Page 17

Systematic reflection on uncertainty issues in:
• Problem framing
• Involvement of stakeholders
• Selection of indicators
• Appraisal of knowledge base
• Mapping and assessment of relevant uncertainties
• Reporting of uncertainty information

Page 18

Problem framing and context

• Explore rival problem frames
• Relevant aspects / system boundary
• Typify problem structure
• Problem lifecycle / maturity
• Role of study in policy process
• Uncertainty in socio-political context

Page 19

Type III error: assessing the wrong problem by incorrectly accepting the false meta-hypothesis that there is no difference between the boundaries of a problem, as defined by the analyst, and the actual boundaries of the problem (Dunn, 1997).

Context validation (Dunn, 1999): the validity of inferences that we have estimated the proximal range of rival hypotheses.

Context validation can be performed by a participatory bottom-up process to elicit from scientists and stakeholders rival hypotheses on causal relations underlying a problem and rival problem definitions.

Page 20

What is the role of the assessment in the policy process?

• ad hoc policy advice
• to evaluate existing policy
• to evaluate proposed policy
• to foster recognition of new problems
• to identify and/or evaluate possible solutions
• to provide counter-expertise
• other

Page 21

In different phases of problem lifecycle, different uncertainties are salient

Page 22

Different problem-types need different uncertainty management strategies

Page 23

Problem types (consensus about knowledge x consensus about values):

• Unstructured (no consensus on knowledge, no consensus on values)
– Salient uncertainties: ignorance; value-ladenness; problem framing; scenario uncertainty
– Strategies: public debate; conflict management; reflexive science
• Moderately structured, ends (no consensus on knowledge, consensus on values)
– Salient uncertainties: unreliability; scenario uncertainty; ignorance
– Strategies: stakeholder involvement; extended peer review
• Moderately structured, means (consensus on knowledge, no consensus on values)
– Salient uncertainties: value-ladenness; strategic knowledge use
– Strategies: accommodation; reflexive science
• Structured (consensus on knowledge, consensus on values)
– Salient uncertainty: statistical uncertainty
– Strategies: normal scientific procedures; statistical approaches

Page 24

Systematic reflection on uncertainty issues in:
• Problem framing
• Involvement of stakeholders
• Selection of indicators
• Appraisal of knowledge base
• Mapping and assessment of relevant uncertainties
• Reporting of uncertainty information

Page 25

Involvement of stakeholders

• Identify relevant stakeholders
• Identify areas of agreement and disagreement among stakeholders on value dimensions of the problem
• Recommendations on when to involve different stakeholders in the assessment process

Page 26

Roles of stakeholders

• (Co-)definer of the problems to be addressed
– What knowledge is relevant?
• Source of knowledge
• Quality control of the science (for instance: review of assumptions)

Page 27

Systematic reflection on uncertainty issues in:
• Problem framing
• Involvement of stakeholders
• Selection of indicators
• Appraisal of knowledge base
• Mapping and assessment of relevant uncertainties
• Reporting of uncertainty information

Page 28

Indicators

• How well do the indicators used address key aspects of the problem?
• Use of proxies
• Alternative indicators?
• Limitations of indicators used?
• Scale and aggregation issues
• Controversies in science and society about these indicators?

Page 29

Example: imagine the inference is Y, the logarithm of the ratio between two pressure-on-decision indices, PI1 and PI2:

Y = log(PI1 / PI2)

[Figure: frequency of occurrence of Y, showing the region where incineration is preferred and the region where landfill is preferred]

High uncertainty is not the same as low quality
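A minimal Monte Carlo sketch of this kind of decision-robustness analysis (all distributions and parameter values below are invented for illustration, not taken from the slides): sample the two pressure-on-decision indices, compute Y = log(PI1/PI2), and report how often each option comes out preferred.

```python
import random, math

# Hypothetical inputs: PI1 and PI2 as independent lognormal indices.
random.seed(1)
N = 50_000
ys = [math.log(random.lognormvariate(0.2, 0.5) / random.lognormvariate(0.0, 0.5))
      for _ in range(N)]

# Suppose Y > 0 means incineration is preferred, Y < 0 means landfill.
frac_incineration = sum(y > 0 for y in ys) / N
print(f"P(incineration preferred) ≈ {frac_incineration:.2f}")
```

Even with wide spread in Y, the fraction of samples falling on each side of zero can be a robust quantity, which is the slide's point: high uncertainty in Y is not the same as low quality of the inference.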

Page 30

High uncertainty is not the same as low quality, but methodological uncertainty can be dominant.

(slide borrowed from Andrea Saltelli)

Page 31

Systematic reflection on uncertainty issues in:
• Problem framing
• Involvement of stakeholders
• Selection of indicators
• Appraisal of knowledge base
• Mapping and assessment of relevant uncertainties
• Reporting of uncertainty information

Page 32

Adequacy of the available knowledge base?
• What are strong and weak points in the knowledge base?
– use of proxies, empirical basis, theoretical understanding, methodological rigour, validation: NUSAP pedigree analysis
• What parts of the knowledge are contested (scientific and societal controversies)?
• Is the assessment feasible in view of available resources? (limitations implied)

Page 33

Dimensions of uncertainty

• Technical (inexactness)
• Methodological (unreliability)
• Epistemological (ignorance)
• Societal (limited social robustness)

Page 34

Reliability intervals for normal distributions:
• ±1σ = 68%
• ±2σ = 95%
• ±3σ = 99.7%
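These coverage probabilities can be verified with the error function, since for a normal variable P(|X - μ| < kσ) = erf(k/√2):

```python
import math

def coverage(k: float) -> float:
    """Probability that a normal variable falls within ±k standard deviations."""
    return math.erf(k / math.sqrt(2))

for k in (1, 2, 3):
    print(f"±{k}σ covers {coverage(k):.1%}")
```

The exact values are 68.3%, 95.4%, and 99.7%; the slide's "95%" is the usual rounding of the 2σ interval (the exact 95% interval is ±1.96σ).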

Page 35

Page 36

[Figure: Total NH3 emission in 1995 (in mln kg ammonia) as reported in successive State of the Environment reports, 1996-2002, with 95% confidence intervals]

Page 37

NUSAP: Qualified Quantities
• Numeral
• Unit
• Spread
• Assessment
• Pedigree

(Funtowicz and Ravetz, 1990)
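The five NUSAP categories can be pictured as a small record type. The sketch below is purely illustrative (the class, field names, and example values are ours, not a standard NUSAP API):

```python
from dataclasses import dataclass

@dataclass
class NusapQuantity:
    """Illustrative container for a NUSAP-qualified quantity."""
    numeral: float    # N: the number itself
    unit: str         # U: its unit
    spread: float     # S: e.g. a standard deviation or half-range
    assessment: str   # A: qualitative judgement of reliability
    pedigree: tuple   # P: pedigree scores, e.g. (proxy, empirical, method, validation)

    def __str__(self) -> str:
        return (f"{self.numeral} {self.unit} ± {self.spread} "
                f"({self.assessment}; pedigree {self.pedigree})")

# Hypothetical example values, for illustration only:
q = NusapQuantity(4.2, "mg/m3", 1.1, "conservative", (3, 2, 2, 1))
print(q)
```

The point of the notation is that the quantitative part (N, U, S) never travels without the qualitative part (A, P) that qualifies its strength.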

Page 38

NUSAP: Pedigree

Evaluates the strength of the number by looking at:

• Background history by which the number was produced

• Underpinning and scientific status of the number

Page 39

Example pedigree matrix for parameter strength:

Code | Proxy | Empirical | Theoretical basis | Method | Validation
4 | Exact measure | Large sample, direct measurements | Well-established theory | Best available practice | Compared with independent measurements of the same variable
3 | Good fit or measure | Small sample, direct measurements | Accepted theory, partial in nature | Reliable method, commonly accepted | Compared with independent measurements of a closely related variable
2 | Well correlated | Modelled / derived data | Partial theory, limited consensus on reliability | Acceptable method, limited consensus on reliability | Compared with non-independent measurements
1 | Weak correlation | Educated guesses / rule-of-thumb estimates | Preliminary theory | Preliminary methods, unknown reliability | Weak / indirect validation
0 | Not clearly related | Crude speculation | Crude speculation | No discernible rigour | No validation

Page 40

Example pedigree results:

Parameter | Proxy | Empirical | Method | Validation | Strength
NS-SHI | 3 | 3.5 | 4 | 0 | 0.66
NS-B&S | 3 | 3.5 | 4 | 0 | 0.66
NS-DIY | 2.5 | 3.5 | 4 | 3 | 0.81
NS-CAR | 3 | 3.5 | 4 | 3 | 0.84
NS-IND | 3 | 3.5 | 4 | 0.5 | 0.69
Th%-SHI | 2 | 1 | 2 | 0 | 0.31
Th%-B&S | 2 | 1 | 2 | 0 | 0.31
Th%-DIY | 1 | 1 | 2 | 0 | 0.25
Th%-CAR | 2 | 1 | 2 | 0 | 0.31
Th%-IND | 2 | 1 | 2 | 0 | 0.31
VOS % import | 1 | 2 | 1.5 | 0 | 0.28
Attribution import | 1 | 1 | 2 | 0 | 0.25

Traffic-light analogy on the average pedigree score: <1.4 red; 1.4-2.6 amber; >2.6 green.

This example concerns VOC emissions from paint in the Netherlands, calculated from national sales statistics (NS) in five sectors (Ship, Building & Steel, Do-It-Yourself, Car refinishing, and Industry), together with assumptions on additional thinner use (Th%), a lump sum for imported paint, and an assumption for its VOC percentage. See the full research report on www.nusap.net for details.
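Consistent with the table above, each Strength value equals the average of the four pedigree scores divided by the maximum score of 4, and the traffic light follows from the average score. A small sketch of that calculation (the function names are ours, and the normalization is inferred from the table rather than stated on the slide):

```python
def pedigree_strength(scores):
    """Normalized parameter strength: mean pedigree score / max score of 4
    (normalization inferred from the example table)."""
    mean = sum(scores) / len(scores)
    return mean / 4.0

def traffic_light(scores):
    """Traffic-light class from the average pedigree score:
    <1.4 red; 1.4-2.6 amber; >2.6 green (thresholds as on the slide)."""
    mean = sum(scores) / len(scores)
    if mean < 1.4:
        return "red"
    if mean <= 2.6:
        return "amber"
    return "green"

ns_car = (3, 3.5, 4, 3)   # NS-CAR row: Proxy, Empirical, Method, Validation
print(round(pedigree_strength(ns_car), 2), traffic_light(ns_car))  # prints: 0.84 green
```

Reproducing, say, the Th%-SHI row (2, 1, 2, 0) gives a strength of 0.31 and a red light, matching the table.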

Page 41

Example: Air Quality

Page 42

Similar to a patient information leaflet alerting the patient to risks and unsuitable uses of a medicine, NUSAP enables the delivery of policy-relevant quantitative information together with the essential warnings on its limitations and pitfalls. It thereby promotes responsible and effective use of science in policy processes.

Page 43

Systematic reflection on uncertainty issues in:
• Problem framing
• Involvement of stakeholders
• Selection of indicators
• Appraisal of knowledge base
• Mapping and assessment of relevant uncertainties
• Reporting of uncertainty information

Page 44

Mapping and prioritization of relevant uncertainties

• Highlight uncertainties in the typology relevant to this problem
• Set priorities for uncertainty assessment
• Select uncertainty assessment tools from the tool catalogue

Page 45

Typology of uncertainties
• Location
• Level of uncertainty: statistical uncertainty, scenario uncertainty, recognised ignorance
• Nature of uncertainty: knowledge-related uncertainty, variability-related uncertainty
• Qualification of knowledge base (pedigree): weak, fair, strong
• Value-ladenness of choices: small, medium, large

Page 46

Locations of uncertainties:
• Context: ecological, technological, economic, social and political representation
• Expert judgement: narratives, storylines, advice
• Model: model structure, technical model, model parameters, model inputs
• Data: measurements, monitoring data, survey data
• Outputs: indicators, statements

Page 47

Page 48

Tool catalogue. For each tool:
• Brief description
• Goals and use
• What sorts and locations of uncertainty does this tool address?
• What resources are required to use it?
• Strengths and limitations
• Guidance on application & complementarity
• Typical pitfalls of each tool
• References to handbooks, example case studies, websites, experts, etc.

Page 49

Tool catalogue

• Sensitivity analysis
• Error propagation equations
• Monte Carlo analysis
• Expert elicitation
• Scenario analysis
• NUSAP
• PRIMA
• Checklist for model quality assistance
• Assumption analysis
• ...
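Two of these quantitative tools can be contrasted in a few lines. The toy example below (our own, not from the slides) propagates uncertainty through y = a·b both with the first-order error propagation equation and with Monte Carlo sampling, assuming independent normal inputs:

```python
import random, math

random.seed(42)
mu_a, sd_a = 10.0, 1.0   # hypothetical input a ~ N(10, 1)
mu_b, sd_b = 5.0, 0.5    # hypothetical input b ~ N(5, 0.5)

# Error propagation equation (first order): relative variances add for a product.
sd_y_analytic = (mu_a * mu_b) * math.sqrt((sd_a / mu_a) ** 2 + (sd_b / mu_b) ** 2)

# Monte Carlo: sample the inputs, evaluate the model, summarize the output.
samples = [random.gauss(mu_a, sd_a) * random.gauss(mu_b, sd_b) for _ in range(100_000)]
mean_y = sum(samples) / len(samples)
sd_y_mc = math.sqrt(sum((y - mean_y) ** 2 for y in samples) / (len(samples) - 1))

print(f"analytic sd ≈ {sd_y_analytic:.2f}, Monte Carlo sd ≈ {sd_y_mc:.2f}")
```

For this mildly nonlinear model the two estimates agree closely; Monte Carlo becomes the more reliable choice when models are strongly nonlinear or inputs are non-normal, which is where the first-order equation breaks down.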

Page 50

Systematic reflection on uncertainty issues in:
• Problem framing
• Involvement of stakeholders
• Selection of indicators
• Appraisal of knowledge base
• Mapping and assessment of relevant uncertainties
• Reporting of uncertainty information

Page 51

Reporting
• Make uncertainties explicit
• Assess robustness of results
• Discuss implications of uncertainty findings for different settings of the burden of proof
• Relevance of results to the problem
• Progressive disclosure of information -> traceability and backing

Page 52

Conclusions. The uncertainty guidance checklist:
• Structures the tasks of uncertainty management
• Can be used flexibly
– quick & dirty / quick-scan / full mode
– before / during / after
• Promotes reflection and forces deliberate choices on how uncertainties are handled
• Helps to avoid pitfalls
• Its development and introduction at RIVM constitutes an institutional innovation