This Mind-Reading AI Could Prevent Suicide

Wednesday, 08 November 2017 - 11:45AM
Medical Tech
Image credit: Wikimedia Commons

Could AI be the psychiatrist of the future?

Our brains are complex systems, relying on different chemicals and fuel sources in order to stay active and well-balanced. In a continued effort to find ways to better diagnose mental illnesses, researchers have devised a deep-learning AI that can tell what a person is thinking by scanning their brain, and can spot suicidal thoughts—at least in theory.

The AI can't yet tell exactly what a person is thinking, but it can spot trends and commonalities among mental health patients who suffer from depression and thoughts of suicide. From those patterns, the AI can see which parts of a patient's brain are active while they're being scanned, and make an educated guess about whether that person is contemplating ending their life.

The AI has only been used in a single clinical trial, and the pool of volunteers was far too small for any of this to be taken at face value. At present, this is just an initial study, hinting at the benefits of further research.

Researchers took a small sample of just 17 participants with mental health issues—all of whom had previously attempted suicide—and mixed them in with another 17 healthy participants with no history of suicidal thoughts.

Each participant's brain was scanned by an fMRI machine, during which the participants were presented with a series of positive and negative words (such as "desperate," "carefree," "hopeless," and "trouble"), and a note was made of how their brains responded to the stimuli.
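To give a rough sense of what "how their brains responded" might look like as data, here is a toy Python sketch that turns one participant's word-evoked responses into a single feature vector. The words are taken from the article, but the regions of interest, the signal values, and the averaging step are assumptions made purely for illustration - this is not the researchers' method.

```python
# Toy sketch: reduce one participant's word-evoked fMRI responses to a feature vector.
# Words come from the article; regions, timepoints, and values are invented for illustration.
import numpy as np

WORDS = ["desperate", "carefree", "hopeless", "trouble"]
N_REGIONS = 5       # hypothetical regions of interest
N_TIMEPOINTS = 8    # hypothetical scan volumes recorded per word presentation

rng = np.random.default_rng(42)

def response_features(raw_scans):
    """Average each word's response over time, then flatten to one vector.

    raw_scans has shape (words, timepoints, regions); the result contains one
    mean-activation value per (word, region) pair.
    """
    return raw_scans.mean(axis=1).ravel()

# Simulated raw data for a single participant.
raw = rng.normal(size=(len(WORDS), N_TIMEPOINTS, N_REGIONS))
features = response_features(raw)
print(features.shape)  # (len(WORDS) * N_REGIONS,) -> (20,)
```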

Ultimately, six words proved the most effective at inducing a notable response in the participants' brains, and the scans taken while the participants were confronted with the words "death," "cruelty," "trouble," "carefree," "good," and "praise" were all fed into an AI that had been built to detect suicidal thoughts.

The results were promising.

The AI managed to correctly sort the participants according to whether they had suicidal thoughts 91 percent of the time, misclassifying only two participants. If this study can be built on and developed, it may one day be possible to gauge a person's emotional and mental state simply by scanning their brain.
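To make that workflow a little more concrete, here is a minimal Python sketch of this kind of participant-level classification, evaluated with leave-one-out cross-validation. The synthetic feature vectors and the choice of a Gaussian Naive Bayes classifier are assumptions for illustration only; this is not the researchers' code, data, or reported pipeline.

```python
# Illustrative sketch: classify participants from per-word fMRI response features,
# scored with leave-one-out cross-validation. All data here are synthetic.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)

WORDS = ["death", "cruelty", "trouble", "carefree", "good", "praise"]
N_PER_GROUP, N_REGIONS = 17, 5  # group sizes from the article; regions are hypothetical

# Hypothetical feature vectors: one row per participant, built by concatenating
# the brain response to each of the six words (e.g. mean activation per region).
suicidal_group = rng.normal(0.5, 1.0, size=(N_PER_GROUP, len(WORDS) * N_REGIONS))
control_group = rng.normal(0.0, 1.0, size=(N_PER_GROUP, len(WORDS) * N_REGIONS))

X = np.vstack([suicidal_group, control_group])
y = np.array([1] * N_PER_GROUP + [0] * N_PER_GROUP)  # 1 = reported suicidal thoughts

# Leave-one-out cross-validation: hold out each participant in turn,
# train on the remaining 33, and test on the held-out scan.
scores = cross_val_score(GaussianNB(), X, y, cv=LeaveOneOut())
print(f"Classified {scores.sum():.0f} of {len(y)} participants correctly "
      f"({scores.mean():.0%} accuracy on this synthetic data)")
```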

Of course, there are problems with this method of diagnosis. It involves convincing someone to willingly climb into an fMRI machine, which can be difficult when dealing with people who, for legitimate reasons, don't want doctors reading their thoughts.

Regular use of this AI would constitute a gross breach of ethics, especially if it were put to work diagnosing people who want to keep their own thoughts private. There are also questions about who should be able to access people's thoughts using this technique - should employers, for example, be allowed to force prospective hires to undergo a brain scan in order to weed out anyone who might make things difficult for the company with their pesky mental illnesses?

There's a lot to consider here, and research will no doubt continue to explore just how this technology can be used in future - and whether or not it'll actually be of any use.

After all, what's the point of a test that can only tell whether someone is suicidal if they're willing to take part in a mind-reading experiment in the first place? This AI would really only be useful for catching people who are lying about not having suicidal thoughts, and in those cases, the patients probably deserve the right to keep their thoughts private.

For cooperative patients, there's a much easier way to find out whether they're suicidal. You can simply ask them.
