SAFETYLIT WEEKLY UPDATE


Journal Article

Citation

Friedman JA, Lerner JS, Zeckhauser R. Int. Organ. 2017; 71(4): 803-826.

Copyright

(Copyright © 2017, Cambridge University Press)

DOI

10.1017/S0020818317000352

PMID

unavailable

Abstract

National security is one of many fields where experts make vague probability assessments when evaluating high-stakes decisions. This practice has always been controversial, and it is often justified on the grounds that making probability assessments too precise could bias analysts or decision makers. Yet these claims have rarely been submitted to rigorous testing. In this paper, we translate behavioral concerns about probabilistic precision into falsifiable hypotheses, which we evaluate through survey experiments involving national security professionals. Contrary to conventional wisdom, we find that decision makers responding to quantitative probability assessments are less willing to support risky actions and more receptive to gathering additional information. Yet we also find that when respondents estimate probabilities themselves, quantification magnifies overconfidence, particularly among low-performing assessors. These results hone wide-ranging concerns about probabilistic precision into a specific and previously undocumented bias that training may be able to correct.

COPYRIGHT: © The IO Foundation 2017


Language: en
