As the social network grows and adds new features such as live streaming and Snapchat-inspired filters, Facebook has found itself dealing with a few issues that it isn’t really able to control.
Imagine having to moderate two billion people from across the globe, all with different ideas about the world. Sounds exhausting.
Various incidents, such as the live streaming of violence and reported hate speech, have led to countries putting pressure on the network to improve its moderation.
Now leaked documents show what contingency plans Facebook has implemented for such incidents, with the most controversial “guideline” being how Facebook will deal with live-streamed suicides.
The Guardian explains:
Facebook will allow users to live stream attempts to self-harm because it “doesn’t want to censor or punish people in distress who are attempting suicide”, according to leaked documents.
However, the footage will be removed “once there’s no longer an opportunity to help the person” – unless the incident is particularly newsworthy.
The policy was formulated on the advice of experts, the files say, and it reflects how the social media company is trying to deal with some of the most disturbing content on the site.
The documents show how Facebook will try to contact agencies to trigger a “welfare check” when it seems someone is attempting, or about to attempt, suicide.
Looking at the stats, it’s clear why this has become a major concern for the site – especially over the last six months:
Figures circulated to Facebook moderators appear to show that reports of potential self-harm on the site are rising. One document drafted last summer says moderators escalated 4 531 reports of self-harm in two weeks.
Sixty-three of these had to be dealt with by Facebook’s law enforcement response team – which liaises with police and other relevant authorities.
Figures for this year show 5 016 reports in one two-week period and 5 431 in another.
This information comes after The Guardian was privy to more than “100 internal training manuals, spreadsheets and flowcharts that give unprecedented insight into the blueprints Facebook has used to moderate issues such as violence, hate speech, terrorism, pornography, racism and self-harm”.
The guidelines include how to deal with match-fixing and cannibalism, too.
Of course, these are mere guidelines to keep all moderators on the same page – you can read The Guardian’s full briefing on what they found here.
[source: theguardian]