Last Wednesday, Facebook announced that it would extend its array of anti-suicide tools to users all over the world.

Facebook users can now flag posts from people they believe are at risk of harming themselves or taking their own lives.

The tools were first tried in the U.S., where users were prompted to flag posts that referred to suicidal thoughts or plans to inflict self-harm. Each report is then analyzed by a specialized team, which acts according to the particular situation.

Last Wednesday, Facebook announced that it would extend its array of anti-suicide tools to users all over the world. Photo credit: The Verge

Depressing posts are not uncommon

Many people use social media to vent their frustrations and, perhaps, to find help. Facebook has become the first major website to acknowledge this and to propose a solution that could alleviate its users' pain: allowing friends and acquaintances to flag posts that may reveal an intent of self-harm.

The monitoring team reportedly works 24 hours a day, every day of the week. Facebook had previously experimented with its users' feeds to see whether they preferred posts highlighted as positive or as negative, as expressed by the emoticon on the lower part of the post. Facebook appears to place a heavier emphasis on happier messages, especially those in which users congratulate each other on birthdays, weddings, graduations, and the like.

Negative posts tended to attract longer comments, suggesting a need for communication with the poster. The first sign of distress would often be the user sharing a depressing or concerning post that appeared in all of their friends' feeds.

The data behind Facebook’s suicide prevention tools

Suicide rates in the United States are at an alarming 30-year high, coinciding with the ongoing opioid crisis. The increase has been sharpest among women and middle-aged Americans. It was only last year that President Barack Obama recognized September 10 as World Suicide Prevention Day, in an effort to encourage people to keep an eye on the mental health of their friends and family.

Facebook is viewed as having a significant role in suicide prevention and, more broadly, in social and psychological influence. Besides using post reactions to research user response to specific types of content, Facebook has also been accused of promoting political bias. This should come as no surprise, as it would be remarkably easy to modify the Facebook feed algorithm so that, for example, one presidential candidate benefits from greater exposure on social media than another.

Data collected by Facebook revealed that at least a third of all posts expressed negative feelings. Because Facebook typically connects friends, family, and coworkers rather than strangers, it is a natural stance for the company to promote the well-being of its users by allowing them to flag contacts who seem likely to harm themselves.

Facebook now offers the reporting person several options. One can send a direct message to the afflicted friend or coordinate with a mutual friend. Customizable text messages can also be sent. Users should not, however, expect to be harassed or abused through a tool created out of good will and concern for other people.

Source: NY Times