SAFETYLIT WEEKLY UPDATE


Journal Article

Citation

Gomes de Andrade NN, Pawson D, Muriello D, Donahue L, Guadagno J. Ethics and artificial intelligence: suicide prevention on Facebook. Philos. Technol. 2018; 31(4): 669-684.

Copyright

(Copyright © 2018, Holtzbrinck Springer Nature Publishing Group)

DOI

10.1007/s13347-018-0336-0

PMID

unavailable

Abstract

There is a death by suicide in the world every 40 seconds, and suicide is the second leading cause of death for 15-29-year-olds. Experts say that one of the best ways to prevent suicide is for those in distress to hear from people who care about them. Facebook is in a unique position--through its support for networks and friendships on the site--to help connect a person in these difficult situations with people who can support them. Connecting people with the resources they need is part of Facebook's ongoing efforts to help build a safe community inside and outside of Facebook. This article provides a brief overview of how Facebook's work to develop suicide prevention tools started and evolved, and of the ethical considerations that surfaced during the process in the form of concrete product decisions around the implementation of these tools. The article is structured into three sections. Section 1 reviews what has been done in this space, listing and briefly describing other suicide prevention apps and tools. Section 2 describes Facebook's overall approach to suicide prevention. Here, we first delve into how that approach originated and how it was influenced by the external community's proactive interactions with Facebook, highlighting our unique position to help address the problem. We then explain how that approach evolved, describing its various stages and iterations: understanding, reactive reporting, queue prioritization, and proactive reporting. This section describes the tools and resources Facebook has developed for people who may be at risk. Particular attention is devoted to the use of Artificial Intelligence (AI) and Machine Learning (ML) to detect posts or live videos where someone might be expressing thoughts of suicide. Section 3 elaborates on the ethical questions addressed when developing our approach and when making concrete product decisions to implement our suicide prevention tools. In this last section, we expound on the competing values and interests that were at stake during the product development process, and how we reached ethical balances between them.


Language: en

Keywords

Suicide; Ethics; Facebook; AI; Artificial Intelligence; Machine Learning
