Will Facebook's AI help reduce suicide rates? The company is currently testing artificial intelligence tools in a bid to stop people from killing themselves.
The social media giant is testing tools that use artificial intelligence to identify concerning posts from potentially suicidal users, which it hopes will make it easier for other people to report them.
An article on newscientist.com said that “Facebook will use pattern algorithms to spot posts that could indicate someone is suicidal and help their friends to flag this content by making the option to report posts about “suicide and self-injury” more prominent for those that are considered potentially concerning. The algorithms are trained on posts that have previously been reported.”
It added that “Facebook will also use pattern recognition to flag posts “very likely to include thoughts of suicide” so that its community operations team can take action even if the post is not reported.”
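The quoted description boils down to a simple idea: compare the wording of a new post against posts that were previously reported. The toy sketch below is purely illustrative and my own construction; Facebook's actual models, features, and training data are not public, and the example posts and threshold here are invented for demonstration.

```python
# Toy illustration of pattern-based flagging: score a post by how much
# its wording overlaps with posts that were previously reported.
# NOT Facebook's actual method -- a hypothetical sketch only.
from collections import Counter
import re

def tokenize(text):
    """Lowercase a post and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

# Hypothetical training data: posts previously reported as concerning.
reported_posts = [
    "i can't go on any more",
    "no one would miss me if i was gone",
    "i just want it all to end",
]

# Build a vocabulary of words seen in reported posts.
reported_vocab = Counter()
for post in reported_posts:
    reported_vocab.update(tokenize(post))

def concern_score(post):
    """Fraction of the post's words that also appear in reported posts."""
    words = tokenize(post)
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in reported_vocab)
    return hits / len(words)

def should_flag(post, threshold=0.5):
    # A high overlap with previously reported language would make the
    # "report this post" option more prominent, per the article.
    return concern_score(post) >= threshold
```

A real system would use far richer features and context than word overlap, which is exactly why, as the article notes, flagged posts still go to a human community operations team rather than triggering action automatically.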
Why don’t I think this will work?
I don’t think that Facebook's AI will help to reduce the number of people killing themselves, because some people who write these posts will be difficult to track down, and because people whose comments merely sound suicidal might be flagged when they are only exaggerating.
The new initiative also aims to pick up on people posting videos to Facebook, and to encourage friends of the at-risk person to refer them to helplines.
Lots of people suffering from mental health issues are in denial, so even if their friends try to get them help, there’s a chance they will reject the offer.
The system is currently being tested on a small number of people in the US.