Weekly New/Digital Media (63)

Facebook killing video puts moderation policies under the microscope, again

One expert asked, ‘at what point do we transfer some of the responsibility for these acts to the platform?’

https://www.theguardian.com/technology/2017/apr/17/facebook-live-murder-crime-policy

Summary: "Question of the social network’s role in amplifying crime has intensified after it took several hours to remove a brutal video seen millions of times"

The first-person video of the shooting of Robert Godwin Sr, a grandfather, was uploaded to Facebook and remained there for around two hours before being taken down; by that time it had been saved, shared and viewed millions of times. This caused a great deal of distress to his grandson, who took to Twitter to ask everyone to stop sharing the video and show his grandfather some respect.

This once again called into question social media's ability to moderate content, especially when that content is a crime as it unfolds.
"The incident comes on the eve of Facebook’s F8, an annual event for developers, and at a time when the company is working hard to promote its role as an enabler of civic engagement. Two months ago, CEO Mark Zuckerberg penned a 5,700-word manifesto outlining measures the social network was taking to address several challenges faced by humanity." 

The letter explained how artificial intelligence looks at photos and flags content. Last month a 15-year-old girl was raped by multiple people in Chicago, an attack that was streamed on Facebook Live. Last year 23-year-old Korryn Gaines used Facebook to broadcast a standoff with police in Baltimore, which ended with the mother of one being shot and killed. Facebook has also hosted videos showing the torture of a young man with disabilities in Chicago, the musings of a spree killer being chased by police, child abuse and now murder.

In terms of the audience response, the attention from online peers, combined with immediate feedback in the form of comments, reactions and shares, can be intoxicating. The fact that the footage is self-incriminating doesn’t matter to some offenders.
This prompted a response from Facebook in a blog post. “As a result of this terrible series of events, we are reviewing our reporting flows to be sure people can report videos and other material that violates our standards as easily and quickly as possible,” said Justin Osofsky, vice-president of operations. “We disabled the suspect’s account within 23 minutes of receiving the first report about the murder video, and two hours after receiving a report of any kind. But we know we need to do better.”

Key facts/statistics:

  • “This is a horrific crime and we do not allow this kind of content on Facebook.”
  • “This is still very early in development, but we have started to have it look at some content, and it already generates about one-third of all reports to the team that reviews content for our community,” he said.
  • “There have been beatings, rapes, suicide … other incidents seemed to be building to this,” said Sarah T Roberts
  • Terrorists, protestors and narcissistic criminals have always used the media to ensure that “performance” crimes make maximum impact. 
  • What’s different now is the access people have to tools – via their smartphone – to create, publish and distribute content at the touch of the button. Committing a crime for an audience has never been easier.
  • “Social media removes the gatekeepers between performance and distribution,” said Raymond Surette
  • “Being famous for being a bad person is more acceptable for some people than being an unknown good person,” said Surette, adding that if the 9/11 terrorists “had the capability to live-stream their hijackings and plane explosions they would have done it”.
  • Surette doesn’t think there’s much Facebook can do to prevent footage of these crimes from being uploaded (“If you get an obscene phone call you don’t blame the phone company,” he said) but does believe the company has a responsibility to take videos and live streams down as quickly as possible.
  • “The less time it’s up there, the less likely it’s going to generate a copycat,” he said.
  • Beyond Facebook’s responsibility, the fact that footage of the murder has been viewed so widely – with one version of the video seen more than 1.6m times – highlights an ugly side of human nature
  • “It’s the same reason people slow down to watch a car crash,” Surette said. “The dark side is an attraction for everybody.”
  • “The way this material is often interrupted is because someone like you or me encounters it,” Roberts said. “This means a whole bunch of people saw it and flagged it, contributing their own labour and non-consensual exposure to something horrendous. How are we going to deal with community members who may have seen that and are traumatized today?”


My opinion:

I think there are much bigger implications to this issue than just censorship and control. As Roberts says, it's the ugly side of human nature. Social media provides audiences with a service and tools, putting increasing power into users' hands over how they use that service, and some will definitely exploit this. This exposes the problem of giving audiences so much power: not all users will be 'friendly' users.

I do, however, believe that a response needs to come a lot faster; the first flag should pass the photo/video to a human moderator so they can make an ethical judgment about the content. The current response is so slow that it suggests individual audience members aren't valued by the institution, despite the fact that it is these users who produce and consume the content, making them far more important than Facebook treats them. They need to wake up.
