Health Technologies

AI tool can instantly assess self-harm and suicide risk

A new assessment tool that leverages powerful artificial intelligence was able to predict whether participants exhibited suicidal thoughts and behaviours using a quick and simple combination of variables.

Developed by researchers at Northwestern University, the University of Cincinnati (UC), Aristotle University of Thessaloniki and Massachusetts General Hospital/Harvard School of Medicine, the system focuses on a simple picture-ranking task along with a small set of contextual/demographic variables rather than extensive psychological data.

The tool was on average 92 per cent effective at predicting four variables related to suicidal thoughts and behaviours.

“A system that quantifies the judgment of reward and aversion provides a lens through which we may understand preference behaviour,” said first author Shamal Shashi Lalvani, a doctoral student at Northwestern University.

“By using interpretable variables describing human behaviour to predict suicidality, we open an avenue toward a more quantitative understanding of mental health and make connections to other disciplines such as behavioural economics.”

The study, published in the journal Nature Mental Health, concludes that a small set of behavioural and social measures play a key role in predicting suicidal thoughts and behaviours. The current work details the components of a tool that could be deployed as an app for medical professionals, hospitals or the military to assess who is most at risk of self-harm.

Hans Breiter, contact PI for the study, and a professor in computer science and biomedical engineering at UC, said: “It’s reported we have about 20 suicides daily among veterans in the U.S., and a salient number of students. We all can cite statistics to how the American medical system is at a breaking point. I wish we’d had this technology sooner. The data strongly argues it would change outcomes.

“People have developed good techniques with big data, but we have problems interpreting the meaning of many predictions based on big data. Having a small number of variables grounded in mathematical psychology appears to get around this issue and is needed if current machine learning is ever going to approach the issue of artificial general intelligence.”

Data was collected from surveys completed in 2021 by 4,019 participants aged 18 to 70 across the United States. Participants' identities were protected and not shared with researchers, and all participants gave informed consent.

Participants were asked to rank a random sequence of 48 pictures on a seven-point like-to-dislike scale from +3 to -3 in six categories: sports, disasters, cute animals, aggressive animals, nature and adults in bathing suits. Researchers also collected a limited set of demographics: age, sex assigned at birth, race or ethnicity, highest education level achieved and handedness.

“The usage of a picture-rating task may seem simple but understanding individual preferences and how one evaluates reward and aversion plays a large role in shaping personality and behavior,” said co-PI for the study and co-senior author Aggelos Katsaggelos, the Joseph Cummings Professor of Electrical and Computer Engineering at McCormick and director of the AI in Multimedia-Image and Video Processing Lab at Northwestern.

“We find that our results in predicting suicidality exceed typical methods of measurement without using extensive electronic health records or other forms of big data,” Katsaggelos said.

Along with the picture ratings, participants completed a limited set of mental health questions and were asked to rank perceived loneliness on a five-point scale.

When the data was plugged into an artificial intelligence system developed by Northwestern and University of Cincinnati, the software was able to predict four measures of suicidal thoughts and behaviours: passive suicidal ideation (desire without a plan); active ideation (current and specific thoughts); planning for suicide; and planning coping strategies to prevent self-harm.
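The study's actual model is not described in detail here, but the inputs it relies on are: 48 picture ratings on the -3 to +3 scale, a handful of demographics and a five-point loneliness score. As a rough illustration only, those survey responses could be encoded into a flat numeric feature vector for a downstream classifier. The function name, the assumed 8-pictures-per-category split, and every encoding choice below are assumptions for the sketch, not the researchers' pipeline:

```python
# Illustrative sketch only: encodes the survey inputs described in the
# article into a numeric feature vector. Field names, the per-category
# split and the encodings are assumptions, not the study's method.

CATEGORIES = ["sports", "disasters", "cute_animals",
              "aggressive_animals", "nature", "bathing_suits"]
PICTURES_PER_CATEGORY = 8  # 6 categories x 8 = 48 pictures (assumed split)

def encode_participant(ratings, age, sex, loneliness):
    """Build a feature vector from one participant's survey responses.

    ratings    -- list of 48 integers, each on the -3..+3 scale
    age        -- integer, 18..70 per the study's sampling range
    sex        -- "female" or "male" (sex assigned at birth)
    loneliness -- integer 1..5, perceived loneliness
    """
    if len(ratings) != len(CATEGORIES) * PICTURES_PER_CATEGORY:
        raise ValueError("expected 48 picture ratings")
    if any(r < -3 or r > 3 for r in ratings):
        raise ValueError("ratings must be on the -3..+3 scale")

    features = [r / 3.0 for r in ratings]          # scale ratings to [-1, 1]
    features.append((age - 18) / (70 - 18))        # min-max scale age
    features.append(1.0 if sex == "female" else 0.0)
    features.append((loneliness - 1) / 4.0)        # scale 1..5 to [0, 1]
    return features

# One hypothetical participant: neutral ratings, age 44, loneliness 3.
vec = encode_participant([0] * 48, age=44, sex="female", loneliness=3)
```

A vector like this would then feed a classifier producing the four outputs above; the appeal of such a small, interpretable input set is exactly the point the researchers make about avoiding big-data feature spaces.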

Researchers noted that respondents in other countries could have unique cultural influences that might affect prediction success, although race and gender effects were the least predictive of the measures used.

Another potential limitation, the researchers said, is that the surveys were self-reported rather than based on clinical assessments; they added that it is difficult to see how a prospective study of suicide might be performed. Lastly, the cohort was sampled during the COVID-19 pandemic, a period that has seen higher-than-normal rates of loneliness and self-harm.
