Israeli researchers have developed AI-based technology they say could contribute to the development of a more effective screening tool for the early detection of suicidal tendencies and risks.
The technology, based on the automatic text analysis of social network content, was detailed in research published in the academic journal Scientific Reports last month. The research was led by Technion researchers Prof. Roi Reichart, Dr. Yaakov Ophir, and PhD candidate Refael Tikochinski with Hebrew University’s Prof. Christa Asterhan and Dr. Itay Sisso.
The researchers created a system that combines machine learning and natural language processing (NLP) algorithms with theoretical and analytical tools from the fields of psychology and psychiatry, and uses layered neural networks. The tools developed by the group can be used for the early detection of people at risk of suicidal thoughts or inclinations, and are not limited to those already being treated for mental health issues, a study from both universities said.
Researchers analyzed over 80,000 Facebook posts written by adults in the US and compared the language used with the writers' scores on various psychological indices.
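In practical terms, that pairing looks something like the minimal sketch below: each participant's posts are aggregated and joined with their score on the screening questionnaire. The column names, values, and risk cutoff are illustrative assumptions, not details taken from the study.

```python
# A minimal sketch of the data pairing described above. All column names,
# values, and the risk cutoff are illustrative assumptions, not study details.
import pandas as pd

posts = pd.DataFrame({
    "user_id":   [1, 1, 2, 3, 3],
    "post_text": ["so sick of the pain", "another bad week",
                  "great hike today", "feeling blessed", "family dinner tonight"],
})
scores = pd.DataFrame({"user_id": [1, 2, 3], "risk_score": [21, 3, 1]})  # from the questionnaire

# One document per user, paired with the externally obtained psychological index
docs = posts.groupby("user_id")["post_text"].apply(" ".join).reset_index()
dataset = docs.merge(scores, on="user_id")

# Binarize risk with an illustrative cutoff to set up a supervised learning problem
dataset["at_risk"] = (dataset["risk_score"] >= 10).astype(int)
print(dataset)
```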
“The biggest problem with building such models is getting high-quality data,” Prof. Roi Reichart tells NoCamels. “It’s like trying to predict anything. You have to have a good sample of user activity on social media, but you also have to have the psychological profile of each of the users. Naturally, when you have someone active on Facebook, for example, you want to know his or her psychological profile.”
Prof. Reichart explains that the researchers wanted to assess different aspects of the participants’ personality, psychological state, and psychiatric state. A total of 1,002 Facebook users completed a “well-established, clinically valid screening tool of suicide risk” and volunteered to disclose a year of their Facebook activity, according to the study.
“We used indicators for suicidal ideations and behaviors that were obtained from outside the participants’ social network activity (external indicators),” adds Prof. Christa Asterhan. “In other words, the algorithms were developed with and validated against external indicators of suicidal behavior and ideation,” she tells NoCamels. “The textual features of a person’s Facebook posts that the NLP-based algorithms picked up on proved to be significantly better predictors of suicide risk than a combination of known risk factors.”
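To illustrate what such a comparison involves, here is a toy sketch of scoring a text-based model and a baseline built on known risk factors against the same external label, using ROC AUC. The data, features, and models are invented for illustration and are not the study's actual evaluation setup.

```python
# Illustrative comparison (not the study's evaluation): text-based predictions
# versus known risk factors, scored against the same external indicator.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.pipeline import make_pipeline

texts = ["feeling sick and in pain again", "great day at the beach with family",
         "can't stop crying, everything hurts", "grateful for so many blessings"]
risk_factors = np.array([[1, 0], [0, 0], [1, 1], [0, 1]])  # e.g. prior diagnosis, recent loss
labels = np.array([1, 0, 1, 0])  # external indicator from the screening questionnaire

text_model = make_pipeline(TfidfVectorizer(), LogisticRegression()).fit(texts, labels)
factor_model = LogisticRegression().fit(risk_factors, labels)

# In the study the comparison is made on held-out data; here we score on the
# toy training set purely to show the mechanics.
text_auc = roc_auc_score(labels, text_model.predict_proba(texts)[:, 1])
factor_auc = roc_auc_score(labels, factor_model.predict_proba(risk_factors)[:, 1])
print(f"text-based AUC: {text_auc:.2f}, risk-factor AUC: {factor_auc:.2f}")
```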
SEE ALSO: 8 Social Initiatives Helping Israelis Cope With Coronavirus And Quarantine
Once they had the data, the team went on to develop the algorithm, Prof. Reichart said. “We built on some knowledge from the natural language processing world. Our algorithm is based on different, recently developed, deep learning algorithms that can extract information from the text very effectively,” he says.
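One common pattern for this kind of deep-learning text analysis is to encode each post with a pretrained transformer and train a lightweight classifier on the pooled representations. The sketch below follows that pattern; the specific model (DistilBERT), the classifier, and the toy examples are illustrative choices, not the study's exact setup.

```python
# A minimal sketch of deep text-feature extraction with a pretrained
# transformer plus a simple classifier; model choice is illustrative only.
import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.linear_model import LogisticRegression

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
encoder = AutoModel.from_pretrained("distilbert-base-uncased")

def embed(texts):
    """Mean-pool the transformer's last hidden states into one vector per text."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**batch).last_hidden_state          # (batch, tokens, dim)
    mask = batch["attention_mask"].unsqueeze(-1)             # ignore padding tokens
    return ((hidden * mask).sum(1) / mask.sum(1)).numpy()

# Toy labeled examples (hypothetical); in the study, labels come from the
# clinically validated screening questionnaire, not from the text itself.
texts = ["everything hurts and nobody cares", "lovely hike this morning",
         "another night in the hospital", "so thankful for my friends"]
labels = [1, 0, 1, 0]

clf = LogisticRegression().fit(embed(texts), labels)
print(clf.predict_proba(embed(["i feel so sick and tired of all this"]))[:, 1])
```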
The group first conducted a word search for explicit suicide-related content among users at general risk, using obvious variations of words such as “suicide,” “kill,” and “die.” The researchers discovered that people with real suicidal tendencies rarely use such “explicitly alarming” language.
“More often, they use negative descriptive words (“bad,” “worst”), curse words (“f***ing,” “b**ch”), expressions of emotional distress (“sad,” “hurt,” “cry,” “mad”), and descriptions of negative physiological states (“sick,” “pain,” “surgery,” “hospital”),” says Prof. Asterhan. “People who do not have suicidal tendencies tend to express more positive emotions and experiences, and more references to religion and positive outlooks on life – a correlation that matches many studies that identified these factors as representing immunity to mental and emotional distress.”
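The contrast between those two signals can be sketched in a few lines of code: an explicit keyword check versus counts of the broader word categories the researchers describe. The word lists below are small illustrative samples drawn from the quote above, not the study's actual lexicons.

```python
# A simplified sketch of the two signals: explicit keywords vs. broader
# word-category counts. Word lists are illustrative, not the study's lexicons.
import re

EXPLICIT = {"suicide", "suicidal", "kill", "die"}
CATEGORIES = {
    "negative_descriptors": {"bad", "worst"},
    "curse_words": {"fucking", "bitch"},
    "emotional_distress": {"sad", "hurt", "cry", "mad"},
    "physiological_state": {"sick", "pain", "surgery", "hospital"},
}

def profile(post: str) -> dict:
    """Tokenize a post and count explicit keywords and category hits."""
    tokens = re.findall(r"[a-z*]+", post.lower())
    counts = {name: sum(t in words for t in tokens) for name, words in CATEGORIES.items()}
    counts["explicit_terms"] = sum(t in EXPLICIT for t in tokens)
    return counts

print(profile("Back in the hospital, sick of the pain, so sad and hurt."))
# A post like this contains no explicit keywords but scores high on the
# distress and physiological categories.
```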
The team also made some unique observations and conclusions about groups of users. While Prof. Reichart says he has read in the research literature and in popular newspapers that people who are more religious tend to be at lower risk for various diseases, the team also found in their study that “people that talk more about religion are less inclined to be at risk for suicide.”
Suicide prevention efforts involve at least two steps, Prof. Asterhan explains. The first is to identify those who suffer from suicidal thoughts and ideation, and then to treat them. However, since only a fraction of individuals at risk are identified through the regular channels and seek professional help, one of the main challenges in combating suicide is the timely, early detection of individuals at risk.
“In this research project, we sought to explore novel ways of improving detection efforts by focusing on a person’s everyday online behavior,” she says. “In other words, we sought to find digital footprints of suicide risk, whether these were left by individuals intently (that is, explicitly phrased references to suicidal behavior or help-seeking) or without intent (nonexplicit patterns of online behavior that correlate with suicide risk).”
Suicide prevention in 2020
Suicide is a significant cause of death in Israel and around the world, with close to one million suicides worldwide annually, about 400 of which occur in Israel. Although it is not a leading cause of death in the general population, it is the number one cause of death among young people under the age of 24.
Mental health clinic and day program directors for Israeli children and adolescents reported a 71.2 percent increase in referrals of patients with serious suicidal thoughts during the second wave of the coronavirus pandemic, according to a report from the Health Ministry’s Mental Health Division published earlier this month in The Jerusalem Post.
Dr. Udi Sasser, the director of the clinical department at the Mental Health Division, and Dr. Danny Budowski, director of outpatient services, wrote the report after they received responses at the end of October from 31 mental health directors, 23 of whom run clinics and eight of whom run day programs.
SEE ALSO: Will A Simple Blood Test Be Able To Predict A Person’s Suicidal Tendencies?
The report also revealed that almost 39 percent of patients reported some increase in suicidal thoughts, while another 32 percent said that they had had “a meaningful increase in suicidal thinking.”
“We believe that this knowledge contributes to the development of more effective detection tools for suicide risk. Such tools would ideally combine information from different sources and technologies, in addition to the textual cues,” Prof. Asterhan tells NoCamels. “A prerequisite for improved prevention is detection/identification, but it is only the start of the process. The next step is to figure out how to approach, reach out, and offer support and assistance effectively, once an individual is identified by such a system.”
If you or someone you know needs help, please contact the ERAN (Emotional First Aid By Telephone & Internet) hotline by dialing 1201 from any Israeli phone or +972-76-8844400 from outside Israel. In the US, the National Suicide Prevention Lifeline is at 1-800-273-TALK (8255).