Artificial intelligence (AI) is increasingly being applied to some of humanity's hardest problems, and new research suggests machine learning could help identify people at risk of taking their own lives – a potentially vital tool against rising suicide rates.
In fact, almost 800,000 people die by suicide every year, and unless they forewarn friends, family, or their therapist, those deaths are very difficult to predict.
But researchers say biological signs do exist, buried in hidden patterns of brain activity.
“Our latest work is unique insofar as it identifies concept alterations that are associated with suicidal ideation and behaviour,” explains psychologist Marcel Just from Carnegie Mellon University.
“This gives us a window into the brain and mind, shedding light on how suicidal individuals think about suicide and emotion related concepts,” Just added.
In previous research, Just and his team used computational models to map how the brain processes complex thoughts, whether that’s things like scientific concepts or the tangled combinations of ideas that represent human action.
Now, the team has applied the same techniques to isolate what suicidal tendencies might look like in terms of brain activity, searching for neural signatures that give away emotional responses such as sadness, shame, anger, and pride.
The researchers recruited 34 young adults – 17 patients with suicidal ideation (roughly half of whom had previously attempted suicide) and 17 neurotypical controls – and had each participant undergo brain imaging in an fMRI machine.
During the scans, participants were presented with 10 words related to suicide (such as ‘desperate’, ‘hopeless’, and ‘lifeless’), 10 positive words (such as ‘carefree’), and 10 negative words (such as ‘trouble’).
From the brain activity recorded and the emotional responses they indicated, the researchers isolated six terms – ‘death’, ‘cruelty’, ‘trouble’, ‘carefree’, ‘good’, and ‘praise’ – and five brain areas that most clearly distinguished the suicidal ideators from the control group.
Using this subset of the data, a machine-learning algorithm trained on the brain responses was able to correctly identify suicidal patients and controls 91 percent of the time: recognizing 15 of 17 patients as belonging to the suicide group, and 16 of 17 healthy individuals as belonging to the control group.
Meanwhile, in a separate experiment, where the algorithm was trained exclusively on the 17 participants from the suicidal ideation group, the software was able to distinguish between patients who had previously attempted suicide and those who had not, getting it right in 94 percent of cases.
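The study's small-sample classification setup can be illustrated in code. The sketch below is hypothetical: it uses scikit-learn, a Gaussian Naive Bayes classifier, and synthetic stand-in data rather than the study's actual fMRI responses or its exact algorithm, but it shows the general pattern of training on all participants but one and testing on the held-out individual (leave-one-out cross-validation), which is how accuracy is typically reported for groups this small.

```python
# Hypothetical sketch of leave-one-out classification on a small sample.
# Features are synthetic stand-ins for fMRI responses; the classifier and
# library choices are assumptions, not the study's published pipeline.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)

n_per_group = 17   # 17 suicidal ideators, 17 controls
n_features = 30    # e.g. responses to 6 concepts across 5 brain regions

# Each row is one participant's vector of neural responses; the two
# groups are given slightly different means to simulate a group effect.
ideators = rng.normal(loc=0.5, scale=1.0, size=(n_per_group, n_features))
controls = rng.normal(loc=-0.5, scale=1.0, size=(n_per_group, n_features))

X = np.vstack([ideators, controls])
y = np.array([1] * n_per_group + [0] * n_per_group)  # 1 = ideator

# Leave-one-out: train on 33 participants, test on the held-out one,
# then average the 34 per-participant results into one accuracy figure.
scores = cross_val_score(GaussianNB(), X, y, cv=LeaveOneOut())
accuracy = scores.mean()
print(f"leave-one-out accuracy: {accuracy:.0%}")
```

An accuracy like the study's "15 of 17 patients and 16 of 17 controls" corresponds to 31 of 34 held-out predictions being correct, which is where the 91 percent figure comes from.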
The team noticed that the brain responses to the terms ‘death’, ‘lifeless’, and ‘carefree’ in particular were the most discriminating.
According to David Brent, senior researcher from the University of Pittsburgh, the approach needs to be tested on a larger sample to validate the generality of the results.
“Further testing of this approach in a larger sample will determine its generality and its ability to predict future suicidal behavior, and could give clinicians in the future a way to identify, monitor and perhaps intervene with the altered and often distorted thinking that so often characterises seriously suicidal individuals,” says Brent.
But as to that claim of intervention applications, other experts have considerable doubts.
Aside from the small number of participants examined in the study – which the researchers acknowledge as a limitation of the findings at this point – technological shortcomings could prevent this kind of testing from practically identifying those at risk of taking their own lives.
“There are many challenges to routine use of their method in a healthcare setting,” says medical imaging researcher Derek Hill from University College London.
“The sort of functional brain scanning that the researchers employed is only available in advanced research institutions, and requires cooperative patients, so wouldn’t be widely available to mental health patients in the near future.”
As for the illustration of how suicidal thinking can be identified by discrete patterns of brain activity, commentators are more accepting – to a point.
“There is undoubtedly a biological basis for whether someone is going to commit suicide,” neuroscientist Blake Richards from the University of Toronto in Canada told The Verge.
“There’s a biological basis for every aspect of our mental lives, but the question is whether the biological basis for these things are sufficiently accessible by fMRI to really develop a reliable test that you could use in a clinical setting.”
The full findings are published in Nature Human Behaviour.