expert elicitation

by C. M. Helgeson
on 11 December 2014

Transcript of expert elicitation

psychology
business management
education
counseling
cognitive science
linguistics
philosophy
knowledge engineering
anthropology
Format:
interview, passive observation, questionnaires; interview aids: diagrams, card-sorting, cross-examination ...

Data collected:
verbal responses, response time, eye-movement ...

Structure of knowledge:
flow charts, concept hierarchies, semantic maps, causal diagrams, numerical estimates, probabilities, utilities ...


Recurring considerations

Fit between type of knowledge, elicitation format, and knowledge representation
Time and expense
Tractability of elicitation output
Need for interviewer expertise
Shift from mining to construction metaphors
knowledge elicitation
knowledge acquisition
training (medical, legal, aviation ...)
knowledge-based systems
job analysis
safety and regulatory decisions
industrial engineering
military planning
education research
Nuclear safety
Cooke, R. M. (2013). Uncertainty analysis comes to integrated assessment models for climate change. . . and conversely. Climatic Change 117(3), 467–479.
Garthwaite, P. H., J. B. Kadane, and A. O’Hagan (2005). Statistical methods for eliciting probability distributions. Journal of the American Statistical Association 100(470), 680–701.
Horton, B. P., S. Rahmstorf, S. E. Engelhart, and A. C. Kemp (2014). Expert assessment of sea-level rise by AD 2100 and AD 2300. Quaternary Science Reviews 84, 1–6.
Bamber, J. L. and W. Aspinall (2013). An expert judgement assessment of future sea level rise from the ice sheets. Nature Climate Change 3(4), 424–427.
McDaniels, T., T. Mills, R. Gregory, and D. Ohlson (2012). Using expert judgments to explore robust alternatives for forest management under climate change. Risk Analysis 32(12), 2098–2112.
Zickfeld, K., M. G. Morgan, D. J. Frame, and D. W. Keith (2010). Expert judgments about transient climate response to alternative future trajectories of radiative forcing. Proceedings of the National Academy of Sciences 107(28), 12451–12456.
Kriegler, E., J. W. Hall, H. Held, R. Dawson, and H. J. Schellnhuber (2009). Imprecise probability assessment of tipping points in the climate system. Proceedings of the National Academy of Sciences 106(13), 5041–5046.
Lenton, T. M., H. Held, E. Kriegler, J. W. Hall, W. Lucht, S. Rahmstorf, and H. J. Schellnhuber (2008). Tipping elements in the Earth’s climate system. Proceedings of the National Academy of Sciences 105(6), 1786–1793.
Zickfeld, K., A. Levermann, M. G. Morgan, T. Kuhlbrodt, S. Rahmstorf, and D. W. Keith (2007). Expert judgements on the response of the Atlantic meridional overturning circulation to climate change. Climatic Change 82(3–4), 235–265.
Morgan, M. G., P. J. Adams, and D. W. Keith (2006). Elicitation of expert judgments of aerosol forcing. Climatic Change 75(1–2), 195–214.
Vaughan, D. G. and J. R. Spouge (2002). Risk estimation of collapse of the West Antarctic Ice Sheet. Climatic Change 52(1–2), 65–91.
Morgan, M. G., L. F. Pitelka, and E. Shevliakova (2001). Elicitation of expert judgments of climate change impacts on forest ecosystems. Climatic Change 49(3), 279–307.
Titus, J. G. and V. Narayanan (1996). The risk of sea level rise: A delphic Monte Carlo analysis in which twenty researchers specify subjective probability distributions for model coefficients within their respective areas of expertise. Climatic Change 33(2), 151–212.
Morgan, M. G. and D. W. Keith (1995). Subjective judgments by climate experts. Environmental Science & Technology 29(10), 468A–476A.
Morgan, M. G. (2014). Use (and abuse) of expert elicitation in support of decision making for public policy. Proceedings of the National Academy of Sciences 111(20), 7176–7184.

Shapiro, H. T., R. Diab, C. de Brito Cruz, M. Cropper, J. Fang, L. Fresco, S. Manabe, G. Mehta, M. Molina, P. Williams, et al. (2010). Climate change assessments: Review of the processes and procedures of the IPCC. Technical report, InterAcademy Council, Amsterdam.

Oppenheimer, M., B. C. O’Neill, M. Webster, and S. Agrawala (2007, September). The limits of consensus. Science Magazine’s State of the Planet 2008–2009: with a Special Section on Energy and Sustainability, 317(5844), 1505–1506.

Reilly, J., P. H. Stone, C. E. Forest, M. D. Webster, H. D. Jacoby, and R. G. Prinn (2001). Uncertainty and climate change assessments. Science 293(5529), 430–433.
Fig. 3 in: Speirs-Bridge, A., F. Fidler, M. McBride, L. Flander, G. Cumming, and M. Burgman (2010). Reducing overconfidence in the interval judgments of experts. Risk Analysis 30(3), 512–523.
Several decades of research say that overconfidence:

increases with the availability of information
increases with the difficulty of the question
increases in the absence of feedback
can be influenced by cognitive style
(there is some evidence that women are better calibrated)

More recent literature has focused on question format, and specifically on interval elicitation, e.g.:

Soll, J. B. and J. Klayman (2004). Overconfidence in interval estimates. Journal of Experimental Psychology: Learning, Memory, and Cognition 30(2), 299.
Teigen, K. H. and M. Jørgensen (2005). When 90% confidence intervals are 50% certain: On the credibility of credible intervals. Applied Cognitive Psychology 19(4), 455–475.
Speirs-Bridge, A., F. Fidler, M. McBride, L. Flander, G. Cumming, and M. Burgman (2010). Reducing overconfidence in the interval judgments of experts. Risk Analysis 30(3), 512–523.
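These interval studies all rest on the same calibration check: when realizations later become available, count how often the truth falls inside each stated 90% interval. A minimal sketch in Python (the intervals and realizations below are invented for illustration):

```python
# Hit rate of elicited 90% credible intervals against later realizations.
# A well-calibrated expert should capture the truth about 90% of the time;
# the literature above reports rates well below that.

def hit_rate(intervals, realizations):
    """Fraction of (low, high) intervals that contain the realized value."""
    hits = sum(lo <= x <= hi for (lo, hi), x in zip(intervals, realizations))
    return hits / len(intervals)

# Hypothetical elicitation: five stated 90% intervals and the observed values.
intervals = [(2.0, 4.0), (10.0, 12.0), (0.5, 0.9), (100.0, 140.0), (7.0, 9.0)]
realizations = [3.1, 13.5, 0.7, 150.0, 8.2]

print(hit_rate(intervals, realizations))  # 0.6 -> overconfident for 90% intervals
```

A hit rate far below the nominal coverage is the operational signature of the overconfidence findings above.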
* from Goossens et al. (2008)
Cooke, R. M. (2014). Deep and shallow uncertainty in messaging climate change. In R. Steenbergen, P. van Gelder, S. Miraglia, and A. Vrouwenvelder (Eds.), Safety, Reliability and Risk Analysis: Beyond the Horizon, pp. 13–25. CRC Press.

"Following the introduction of expert systems in the 1970s, the artificial intelligence community experienced an explosion of “alternative representations of uncertainty” through the 1980s, including certainty factors, degrees of possibility, fuzzy sets, belief functions, random sets, imprecise probabilities, non-monotonic logic, among many others."

Imprecise probability: too imprecise, leads to unrealistic ranges when propagated through analysis.

“Shallow uncertainty is uncertainty resulting from undefined terms, careless formulation, lack of operational definitions and overall intellectual sloth.” Social discount rates are an example: if you don’t know what it means, then you can’t be uncertain about it.

Talk of deep uncertainty in climate science “seems to have morphed into apology for not quantifying uncertainties.”

“However useful the alternative representations of uncertainty may be, they should not crowd out active research within the probabilistic approach.”

Among interesting future developments: (1) “high dimensional dependence modelling” (coming from finance) and (2) training scientists to be better at giving probabilities.
Cooke, R. M. (1991). Experts in Uncertainty: Opinion and Subjective Probability in Science.

Cooke, R. and L. Goossens (2000). Procedures guide for structured expert judgment. Project report, European Commission Nuclear Science and Technology.

Goossens, L., R. Cooke, A. Hale, and L. Rodíc-Wiersma (2008). Fifteen years of expert judgement at TUDelft. Safety Science 46(2), 234–244.

from Cooke (2014)
AR5 Uncertainty Guidance Note:
'When appropriate, consider using formal elicitation methods to organize and quantify these judgments.'

Did any chapter authors take up the offer?


Cooke, N. J. (1994). Varieties of knowledge elicitation techniques. International Journal of Human-Computer Studies 41(6), 801–849.
Morgan, M., M. Henrion, and M. Small (1990). Uncertainty: A Guide to Dealing with Uncertainty in Quantitative Risk and Policy Analysis.

Morgan, M. G. (lead author), H. Dowlatabadi, M. Henrion, D. Keith, R. Lempert, S. McBride, M. Small, and T. Wilbanks (contributing authors) (2009). Best practice approaches for characterizing, communicating and incorporating scientific uncertainty in climate decision making. A report by the Climate Change Science Program and the Subcommittee on Global Change Research, National Oceanic and Atmospheric Administration.

anchoring
availability
overconfidence
Budnitz, R. J., G. Apostolakis, D. M. Boore, L. S. Cluff, K. J. Coppersmith, C. A. Cornell, and P. A. Morris (1998). Use of technical expert panels: Applications to probabilistic seismic hazard analysis. Risk Analysis 18(4), 463–469.
Budnitz, R., G. Apostolakis, D. Boore, S. Cluff, J. Coppersmith, and A. Morris (1997). Recommendations for probabilistic seismic hazard analysis: Guidance on uncertainty and use of experts. Technical Report NUREG/CR-6372, Lawrence Livermore National Laboratory.
Kaplan, S. (1992). ‘Expert information’ versus ‘expert opinions’: Another approach to the problem of eliciting/combining/using expert knowledge in PRA. Reliability Engineering & System Safety 35(1), 61–72.
Keeney, R. L. and D. von Winterfeldt (1989). On the uses of expert judgment on complex technical problems. IEEE Transactions on Engineering Management 36(2), 83–86.
Post-DELPHI developments in deliberation:
Heuristics and biases
"Whether the elicitation is done to obtain a prior distribution for some Bayesian analysis, obtain expert judgments for inputs of some decision model, or for some other purpose, sensitivity analysis has the same objective. It is to determine whether, when the elicited distribution is varied to other distributions that might also be consistent with the expert's knowledge, the results derived from that distribution change appreciably. If not, then the elicitation has adequately represented the expert's knowledge." (Garthwaite et al. 2005)
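Garthwaite et al.'s criterion can be sketched numerically: swap the elicited distribution for alternatives that are also roughly consistent with the expert's stated quantiles, propagate each through the analysis, and check whether the output moves appreciably. Everything below (the stated quantiles, the candidate distributions, the toy decision quantity) is invented for illustration:

```python
# Sensitivity analysis for an elicited distribution (sketch).
# Suppose the expert stated median 2.0 and 90% interval (1.0, 4.0) for a
# damage multiplier; several distributions are roughly consistent with that.
import random

random.seed(0)

def expected_loss(draw, n=50_000):
    """Toy decision quantity: E[min(1.5 * multiplier, cap)] by Monte Carlo."""
    cap = 5.0
    return sum(min(draw() * 1.5, cap) for _ in range(n)) / n

# Alternative distributions roughly consistent with the stated quantiles.
candidates = {
    "triangular": lambda: random.triangular(1.0, 4.0, 2.0),
    "uniform":    lambda: random.uniform(1.0, 4.0),
    "lognormal":  lambda: random.lognormvariate(0.7, 0.4),
}

results = {name: expected_loss(draw) for name, draw in candidates.items()}
for name, val in results.items():
    print(f"{name:10s} {val:.3f}")

# If these outputs differ appreciably, the elicitation has not pinned down
# what the analysis needs; refine the elicitation, per Garthwaite et al.
```

The check is about the downstream quantity, not the distributions themselves: distributions that look different can still agree on what the decision model needs.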
Picking the variables/events/parameters
Use well-defined events
In Central and Eastern Europe, summer precipitation is projected to decrease, causing higher water stress. Health risks due to heatwaves are projected to increase. Forest productivity is expected to decline and the frequency of peatland fires to increase.
(High confidence; IPCC, 2007b, p. 14)
Nearly all European regions are anticipated to be negatively affected by some future impacts of climate change, and these will pose challenges to many economic sectors. (Very high confidence; IPCC, 2007b, p. 14)
Clairvoyant test
Cooke's alternative: 'Every term in a model must have operational meaning, that is, the modeler should say how, with sufficient means and license, the term would be measured'
Be explicit about what is to be conditioned upon, and what is to be 'integrated over'
Aggregate, deliberate, neither?
From last week:
Genest, C. and J. V. Zidek (1986). Combining probability distributions: A critique and an annotated bibliography. Statistical Science, 114–135.

Additional option:
don't do anything; report the diversity of opinion
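The simplest aggregation rule in the Genest and Zidek survey is the linear opinion pool: a weighted average of the experts' probabilities for the same well-defined event. A minimal sketch (the probabilities and weights are hypothetical; equal weights and Cooke-style performance weights are the usual choices):

```python
# Linear opinion pool: the combined probability is a weighted average of
# the experts' probabilities for the same well-defined event.

def linear_pool(probs, weights=None):
    """Combine expert probabilities; weights default to equal, must sum to 1."""
    if weights is None:
        weights = [1 / len(probs)] * len(probs)
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(w * p for w, p in zip(weights, probs))

# Hypothetical: three experts on P(some well-defined threshold event).
probs = [0.10, 0.30, 0.50]
print(linear_pool(probs))                   # ~0.3 with equal weights
print(linear_pool(probs, [0.5, 0.3, 0.2]))  # ~0.24 with performance weights
```

The "report the diversity of opinion" option amounts to presenting the unaggregated list instead of any such pooled number.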
Testing the adequacy of elicited probabilities
"There is one simple conclusion: uncertainty quantification of the consequences of climate change should be resourced at levels at least comparable to major projects in the area of nuclear safety."


"Compared to the resources available for nuclear safety, the uncertainty analyses of integrated assessment models are a cottage industry. All the greater is our debt to those who have pursued this work. The social priorities expressed by this fact merit sober reflection."


The fourth generation in risk assessment & expert elicitation should address model uncertainty.

Three 'generations' of risk assessment & expert elicitation methodology.

First and third associated with particularly big investments in nuclear reactor risk assessment:

Reactor Safety Study of 1975 (70 person-years and four million 1975 dollars)

The US–European Joint Study program, 1990–2000 ($7 million budget included $15,000 compensation to each of 69 experts)
Choice 1: Urn of known composition: 50/50 red/blue
A: Win $50 if red is drawn (and zero otherwise).
B: Win $50 if blue is drawn (and zero otherwise).

Choice 2: Urn of unknown composition, red/blue
C: Win $50 if red is drawn (and zero otherwise).
D: Win $50 if blue is drawn (and zero otherwise).

Choice 3
A: Win $50 if red is drawn from the urn of known composition (and zero otherwise).
C: Win $50 if red is drawn from the urn of unknown composition (and zero otherwise).

Choice 1: Urn of known composition: 50/50 red/blue
A: Win $50 if red is drawn (and zero otherwise).
B: Win $50 if blue is drawn (and zero otherwise).

Choice 2: The subjective distribution you just gave
C: Win $50 if S > S0.5 (and zero otherwise).
D: Win $50 if S < S0.5 (and zero otherwise).

Choice 3
A: Win $50 if red is drawn from the known-composition urn (and zero otherwise).
C: Win $50 if S > S0.5 (and zero otherwise).
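These bets make the stated median operational: an expert who really holds S0.5 as a subjective median should be indifferent between C and D in Choice 2 and, setting aside Ellsberg-style ambiguity aversion from the urn choices above, between A and C in Choice 3. A strict preference constrains P(S > S0.5) away from 0.5. A minimal sketch of that inference, with hypothetical preference data:

```python
# Consistency check on a stated median via the betting questions above.
# If S0.5 is the expert's subjective median, then p = P(S > S0.5) = 0.5.

def implied_constraint(choice, preferred):
    """Map a revealed strict preference to a constraint on p = P(S > S0.5)."""
    if choice == 2:   # C: win if S > S0.5  vs  D: win if S < S0.5
        return "p > 0.5" if preferred == "C" else "p < 0.5"
    if choice == 3:   # A: 50/50 known urn  vs  C: win if S > S0.5
        return "p < 0.5" if preferred == "A" else "p > 0.5"
    raise ValueError("only Choices 2 and 3 test the stated median")

# Hypothetical expert: prefers D in Choice 2 and A in Choice 3. The two
# answers agree with each other, but both imply P(S > S0.5) < 0.5, so the
# stated S0.5 sits above the expert's working median.
print(implied_constraint(2, "D"))  # prints: p < 0.5
print(implied_constraint(3, "A"))  # prints: p < 0.5
```

The same construction works for any stated quantile by adjusting the known urn's composition to match the nominal probability.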
Comments

Compare Cooke's qualitative option and the Garthwaite et al. (2005) approach

Next steps
Cooke, R. M. (2013). Uncertainty analysis comes to integrated assessment models for climate change. . . and conversely. Climatic Change 117(3), 467–479.
Morgan, M. G. (2014). Use (and abuse) of expert elicitation in support of decision making for public policy. Proceedings of the National Academy of Sciences 111(20), 7176–7184.

"Done well, expert elicitation can make a valuable contribution to informed decision making. Done poorly it can lead to useless or even misleading results that lead decision makers astray, alienate experts, and wrongly discredit the entire approach."

Worries that Aspinall (2010) overplays anonymity (perhaps leading participants to take it less seriously), and encourages ‘quick and dirty’ elicitations by talking up how quickly an elicitation can be done.

Skeptical of elicitations without a trained interviewer.

It is tempting to want to combine the judgments of multiple experts to obtain the answer. Sometimes this makes sense. However, if different experts base their judgments on very different models of the way the world works, or if they produce quite different judgments that will be used as the input to a non-linear model, then combining judgments does not make sense.
Millner, A., R. Calel, D. A. Stainforth, and G. MacKerron (2013). Do probabilistic expert elicitations capture scientists’ uncertainty about climate change? Climatic Change 116(2), 427–436.
1. Climate services & Expert elicitation





2. Questions
relevant seed questions?
Why do the IPCC uncertainty framework and the standard elicitation formalism (probability) differ?
'Known inadequacies of best model' and 'known inadequacies of all models' (Spiegelhalter and Riesch 2011).
'Relevant Dominant Uncertainty' (Smith and Petersen 2014) (Morgan et al. 2001, Morgan 2006)
Likely impact on uncertainty from further research, e.g. how might 15 years of well-funded research change (esp. narrow) your pdf? (Zickfeld et al. 2007; Morgan et al. 2006)