SAFETYLIT WEEKLY UPDATE


Journal Article

Citation

Lennon JC. Gen. Psychiatr. 2020; 33(6): e100269.

Copyright

(Copyright © 2020, BMJ Publishing Group)

DOI

10.1136/gpsych-2020-100269

PMID

33089067

Abstract

Introduction

Machine learning (ML) techniques1 are becoming a major area of study in psychiatry, as ML possesses the theoretical capacity to draw conclusions from a broad range of data once the system has been trained through a process of trial and error. Specifically, and arguably most importantly, ML can do so more quickly and accurately than clinicians.2 Several studies have demonstrated accuracy through the use of ML in various populations and geographic locations, further perpetuating the perceived need to incorporate ML into ongoing studies. However, there are many considerations that must be accounted for when discussing ML in the context of data collection and clinical implementation, many of which have done little to thwart the ongoing pursuit of this type of research.

Given the promises and overall potential of ML techniques, suicide is one global public health crisis that could benefit greatly from their use, given repeated failures of prevention efforts.3 First, however, how one defines and views suicide will determine one's perception of ML's ability to be used globally at present. Second, the limitations of ML in the context of suicide are critical to understanding its utility in terms of both operation and timeliness. The use of ML techniques in suicide research is potentially premature, risking the allocation of funding to a solution for psychiatric translational issues before the field is ready. Torous and Walker4 reported that ML can serve as a practical tool to augment current knowledge and assist in overcoming translational issues. However, several considerations specific to suicide are neither discussed nor given commensurate attention. While ML need not be considered a panacea to be perceived as fallacious, the underlying concern is that ML holds greater potential as a secondary measure to large-scale prospective studies than as a current psychiatric tool.

Defining suicide for machine learning

Views on suicide require a substantial paradigm shift,5 much as depression is in desperate need of reconsideration due to its biological and clinical heterogeneity.6 Based on current diagnostic criteria, suicide is not viewed as a distinct disorder or trajectory, despite vast literature supporting differences between those who are depressed, those who ideate and those who die by suicide.7 Instead, suicide is viewed as a cause of death: the result of brain dysfunction that may or may not have included depression. If this is the initial premise of one's syllogism, initial and ongoing conditions will hold less value than determining an ultimate outcome through...


Language: en

Keywords

suicide; risk assessment; psychiatry; models; research design; statistical
