Weekly New/Digital Media (75)

Facebook will let users livestream self-harm, leaked documents show



https://www.theguardian.com/news/2017/may/21/facebook-users-livestream-self-harm-leaked-documents?page=with:img-2

Summary:
"Internal manual shows how site tries to strike balance between allowing cries for help and discouraging copycat behaviour" 

Facebook will allow users to livestream attempts to self-harm because it “doesn’t want to censor or punish people in distress who are attempting suicide”, according to leaked documents. However, the footage will be removed “once there’s no longer an opportunity to help the person” – unless the incident is particularly newsworthy.


Key facts/stats:


  • The Guardian has been told that concern within Facebook about the way people are using the site has increased in the last six months.
  • [Image: a slide from Facebook’s guidance on self-harm and suicide]
  • For instance, moderators were recently told to “escalate” to senior managers any content related to 13 Reasons Why – a Netflix drama about the suicide of a high school student – because of fears it could inspire copycat behaviour.
  • Figures circulated to Facebook moderators appear to show that reports of potential self-harm on the site are rising. One document drafted last summer says moderators escalated 4,531 reports of self-harm in two weeks.
  • Sixty-three of these had to be dealt with by Facebook’s law enforcement response team – which liaises with police and other relevant authorities.
  • Figures for this year show 5,016 reports in one two-week period and 5,431 in another.
  • The documents show how Facebook will try to contact agencies to trigger a “welfare check” when it seems someone is attempting, or about to attempt, suicide.
  • It says: “We’re now seeing more video content – including suicides – shared on Facebook. We don’t want to censor or punish people in distress who are attempting suicide. Experts have told us what’s best for these people’s safety is to let them livestream as long as they are engaging with viewers.
  • “However, because of the contagion risk [ie some people who see suicide are more likely to consider suicide], what’s best for the safety of people watching these videos is for us to remove them once there’s no longer an opportunity to help the person. We also need to consider newsworthiness, and there may be particular moments or public events that are part of a broader public conversation that warrant leaving up.”
  • Moderators have been told to “now delete all videos depicting suicide unless they are newsworthy, even when these videos are shared by someone other than the victim to raise awareness”.
  • Any threat by a user to kill themselves more than five days in the future can also be ignored, the files say.
  • One of the documents says: “Removing self-harm content from the site may hinder users’ ability to get real-world help from their real-life communities.”
  • A Facebook spokeswoman added: “In instances where someone posts about self-injury or suicide, we want to be sure that friends and family members can provide support and help.”


My opinion:
Social media companies have to understand the sheer impact and influence they can have on users' mental health, and they need to take much greater social responsibility for protecting their users. Before, I believe, it fell to the main sources of information, e.g. TV and school, to raise awareness and prevent mental health issues; now that awareness has to be everywhere, all over the internet, because social media is having such a negative impact on users. Facebook can't just sit back like this. No matter what the company says, it is really trying to save money and encourage use of the platform, even if that usage comes from content such as shared self-harm experiences.
