SAFETYLIT WEEKLY UPDATE

Journal Article

Citation: Patterson ES, Roth EM, Woods DD. Cogn. Technol. Work 2001; 3(4): 224-237.

Copyright: (Copyright © 2001, Holtzbrinck Springer Nature Publishing Group)

DOI: 10.1007/s10111-001-8004-y

PMID: unavailable

Abstract

Data overload is a condition in which a practitioner, supported by artefacts and other practitioners, finds it extremely challenging to focus in on, assemble and synthesise the significant subset of data for the problem context into a coherent situation assessment, when that subset is a small portion of a vast data field. In order to predict vulnerabilities in intelligence analysis that might arise when traditional strategies for coping with data overload are undermined, we conducted an observational study in a simulated setting. Ten professional intelligence analysts analysed the causes and impacts of the Ariane 501 accident. When study participants performed a time-pressured analysis outside their base of expertise based on sampling reports from a large set, some made inaccurate statements in verbal briefings. Participants who made no inaccurate statements spent more time during the analysis, read more documents, and relied on higher-quality documents than participants who made inaccurate statements. All participants missed potentially available relevant information and had difficulty detecting and resolving data conflicts. Sources of inaccurate statements were: (1) relying upon default assumptions, (2) incorporating inaccurate information, and (3) incorporating information that was considered accurate at one point in time. These findings have design implications and point to evaluation criteria for systems designed to address the data overload problem in intelligence analysis.


Language: English
