Today, when people mention mental health and Facebook in the same sentence, it almost never bears a positive connotation.
But at least for one young woman in Lucknow, India, there’s a very good reason to make such a connection, as an AI working for Meta alerted local police that she was about to commit suicide.
The 23-year-old was distraught over what Indian media report was abandonment by her husband, and on Sunday night she uploaded a video to Facebook akin to a suicide note. A noose hung ominously behind her, fastened around her neck as she spoke.
While the video was going viral, the Social Media Center at the office of the Director General of Police in Lucknow received an automated alert from Meta AI that included the approximate location of her phone.
The female police officer stationed closest to the woman’s village was alerted, and she arrived before the woman could act. The officer spoke with her, and she eventually removed the noose. The Print reports that rather than taking her into protective custody, police have remained in constant contact with her.
The victim had been in a love affair with a young man from another village. The two eloped but never formed a legal union, and when he eventually ended their pseudo-marriage, she was left in despair. The man has been arrested and is being questioned.
Meta is not alone in using artificial intelligence to scan social media for signs of suicidal intent; whole companies are developing AI for this purpose. One of them, Sentinet, says its algorithms flag 400 social media posts every day that display an actionable level of suicidal intent.
Meta’s AI division uses similar technology to keep track of such posts, which, once identified, can trigger several actions, one of which is a direct message to the user.
“We stumbled upon your post…and it looks like you are going through some challenging times,” begins one such message from Samurai Labs, another AI firm working to prevent suicide by monitoring social media.
“We are here to share with you materials and resources that might bring you some comfort.” A list of resources and stories of people who overcame suicidal thoughts follows, and the message concludes with a virtual hug.
According to Time magazine, a human supervisor reviews posts flagged by Samurai Labs’ AI and decides whether to message the user with instructions on how to get help. About 10% of people who received these messages contacted a suicide helpline, and the company’s representatives have worked with first responders to complete four in-person rescues similar to the one in Lucknow.
We often see discussions about AI’s disruptive power in society, up to and including how it might reshape human social interaction. This story is a nice reminder that in many cases AI is simply a tool, and it can be used for good just as readily as for anything else.
SHARE This Positive Use Of AI With Your Friends Worried About Its Impact…