Facebook, the most popular social media network, is currently testing AI tools to reduce the number of suicides.
The social media giant is trialling tools that use artificial intelligence to identify posts from potentially suicidal users, which it hopes will make it easier for other people to report them.
An article on newscientist.com said that Facebook will use pattern-recognition algorithms to spot posts that could indicate someone is suicidal, and will help their friends flag this content through an option to report posts about “suicide and self-injury”. The algorithms are trained on posts that have previously been reported.
It added that Facebook will also use pattern recognition to flag posts “very likely to include thoughts of suicide” so that its community operations team can take action even if the post is not reported.
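As a rough illustration of what “algorithms trained on posts that have previously been reported” could look like in practice, here is a minimal sketch of a text classifier. The example posts, labels and choice of libraries are my own assumptions; Facebook has not published how its system actually works.

```python
# Illustrative sketch only: a simple classifier of the kind the article describes,
# trained on posts that have previously been reported. The tiny dataset below is
# invented; Facebook's real system and training data are not public.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: 1 = previously reported as concerning, 0 = not reported.
posts = [
    "I can't see any way out of this anymore",
    "Nobody would miss me if I was gone",
    "Had a great day at the beach with friends",
    "Just finished a new book, highly recommend it",
]
labels = [1, 1, 0, 0]

# TF-IDF features plus logistic regression: a common baseline for text classification.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, labels)

# Score a new post; a high probability could prompt the "report" option to be shown
# or route the post to a human review team, as the article describes.
new_post = ["I don't think I can keep going like this"]
print(model.predict_proba(new_post)[0][1])
```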
Why don’t I think this will work?
I don’t think Facebook’s AI will help reduce the number of people killing themselves, because some of the people who write these posts will be hard to track down.
There is also the risk of false positives: posts that merely sound suicidal, such as exaggerations or dark jokes, might be picked up too.
The new initiative also aims to pick up on concerning videos posted to Facebook and to encourage friends of the at-risk person to refer them to helplines.
But many people suffering from mental health issues are in denial, so even if their friends try to get them help, there is a chance the person will reject the offer.
The system is currently being tested on a small number of people in the US.
What are your thoughts on Facebook’s AI helping to reduce suicide rates?
Comment below or join the discussion on our Twitter, Facebook or Instagram.