Despite the decline in Facebook’s popularity, 300 million photos are still uploaded to the social networking site every day.
Then there are the 714 million comments and 410 million status updates. That’s a lot of humblebragging, baby updates and ignorance about politics and global warming.
Most of it is boring – think Aunt Susan’s all-caps status about her “NICE DAY”.
Unfortunately, a small percentage of the posts showcase the very worst that humanity has to offer, and those are the ones that end up in front of content moderators.
The Telegraph reports that Facebook has a 7,500-strong army of moderators who sift through the digital trash every day, sorting the bad from the downright gruesome.
Except in Myanmar, where the company had only four moderators – which is why Facebook is indirectly responsible for a genocide there.
Unsurprisingly, one of the occupational hazards of content moderation is post-traumatic stress disorder (PTSD).
Doctors have long warned of the psychological dangers of this line of work, and this week one Facebook employee decided to take action. Selena Scola, who worked as a moderator for the social media giant in California, is suing the company, claiming to have developed PTSD after being “exposed to thousands of images, videos, and live-streaming broadcasts of graphic violence” during her time there in 2017 and 2018.
So can watching violent content trigger a psychological reaction such as PTSD?
Absolutely yes, says Emma Carrington, a mental health counsellor at Rethink Mental Illness. PTSD is by no means unique to soldiers, she says, naming car accidents, fires, and childhood abuse as common causes. She can easily see how spending each day sifting through extreme content could traumatise a Facebook employee.
The irony here is that moderators are responsible for making the internet a safe place for us (or at least a slightly safer place – it’s still terrible) while becoming steadily more traumatised themselves.
This raises questions about whether companies like Facebook should be held responsible for their employees’ mental health, and what could be done to improve their working conditions.
Inevitably, certain jobs will require staff to face upsetting situations, she says, and in those cases employers have a duty to provide “reasonable adjustments” to help, such as access to professional counsellors. The UK fire and ambulance services, she points out, routinely organise debriefs for their staff after a potentially traumatising call-out. Why can’t Facebook do the same?
For some moderators, it’s the sheer boredom that comes with the repetition of the job that drives them up the wall. In 2015, journalist Gareth Rubin spent six months as a moderator for OKCupid.
While he didn’t have to look at much disturbing content, he was very, very bored.
“It’s pretty boring after a while, because you see the same mistakes and problems and stupidity and moronic behaviour over and over again. It’s just the sheer repetition that destroys your confidence in humanity.”
Now you know how women feel about the men who message them on dating apps.
Facebook already destroys my confidence in humanity – and that’s the approved stuff. I can only begin to imagine what’s being moderated out.
[source:telegraph]