Chloe is training for, arguably, the internet’s most difficult job: full-time content moderator — aka “process executive” — for Facebook (though actually employed by a subcontractor named Cognizant). A key part of the training has involved trying to harden herself against a daily barrage of the internet’s most disturbing posts — all manner of hate speech, torture, violence, and pornography. Now, in a culmination of sorts, Chloe will moderate a Facebook post while standing in front of her fellow trainees.
When it’s her turn, Chloe takes her place at the front of the room. She presses “Play” to begin a clip that neither she nor her classmates have seen before.
The video depicts a man being murdered. Someone is stabbing him, dozens of times, while he screams and begs for his life. Chloe’s job is to tell the room whether this post should be removed. She knows that section 13 of the Facebook community standards prohibits videos that depict the murder of one or more people. When Chloe explains this to the class, she hears her voice shaking.
Returning to her seat, Chloe feels an overpowering urge to sob. Another trainee has gone up to review the next post, but Chloe cannot concentrate. She leaves the room, and begins to cry so hard that she has trouble breathing.
No one tries to comfort her. This is the job she was hired to do. And for the 1,000 people like Chloe moderating content for Facebook at the Phoenix site, and for 15,000 content reviewers around the world, today is just another day at the office.
Chloe’s story opens “The Trauma Floor,” Casey Newton’s disturbing depiction in The Verge of “the secret lives of Facebook moderators in America.” That story, and a follow-up piece Newton published in June, titled “Bodies in Seats,” portray the human toll involved in trying to tamp down the internet’s most offensive content.
The Attention Economy
A great deal has been written about the downsides of our new ‘attention economy.’ This piece focuses on just one of those downsides: how social media companies’ choice to make “engagement” their holy grail intersects with a dark aspect of human nature — our deep, sometimes desperate, need for attention.
Our attention hunger goes all the way back to infancy, when it was, quite literally, hunger that made us cry. Childhood continued to reinforce the lesson: over and over, we learned that getting what we wanted first required getting noticed.
Fortunately, most of us also learned early on that getting noticed was easy, and that our needs were well met. We were blessed with moms and/or dads who were generally happy to care for us well, without us having to make a huge fuss. Which meant we could relax, confident that attention and assistance would be there when needed.
But this happy scenario doesn’t cover everyone. For some, attention proves hard to get, and assistance even harder. The assurance that we matter, that we are worth the care we crave, stays elusive. Instead, the seeds of a disordered personality take root. Often, a bully is born — someone whose anger at unmet needs for attention is (temporarily) salved by hurting others.
Fortunately, bullies tend to be a self-correcting phenomenon. In the moment, bullying brings both attention and an object for one’s anger. But the bullying behavior also drives people away. Eventually the bully finds himself (or herself, but it’s usually a male) isolated and alone. At that point, he might wise up and leave the bullying behind. Or not — he might instead become one of the ‘crazies’ everyone learns to avoid.
Oh, right, that was the world we used to know. Now we live in a different world, one dominated by Facebook, and YouTube, and Twitter — a world where the algorithms of engagement mean the loudest, most outrageous voices rule the roost; a world where trolls are applauded, not avoided; a world where hate speech, rather than being shunned, can propel someone all the way to the White House.
The Babel Firebreak
All of which got me thinking about the Tower of Babel story from the Bible. Early in God’s dealings with humans, He caused different human groups to speak different languages as a way to limit the harm they could accomplish, and specifically to limit their capacity to act as if they were gods rather than creatures.
But God always has more than one thing on His mind, and more than one motivation behind His actions. God knows that the larger the audience, the more attention-seeking, bullying behavior gets amplified. So dividing humanity into many discrete tribes, speaking disparate languages, served as a firebreak against some of our darker impulses.
Until now. Until the internet. Until, specifically, internet platforms recklessly put the world’s most powerful megaphone into the hands of those who could and would use it to cause the most harm.
Front Line Casualties
Which is why, of course, we now need ‘process executives’ in Phoenix, and Tampa, and Indonesia, and the Philippines, to struggle daily to keep the internet from being overwhelmed by the filth pouring in. “We’re the front-liners, like the 9/11 first responders,” says Jerome, a former content moderator who until November 2018 reviewed videos on Periscope, Twitter’s video-streaming app.
Jerome’s assessment might seem a tad melodramatic, until you realize that every day these moderators see things no one should ever see: a child forced into sex with an animal, an Islamic State beheading, people filming their own suicides, an animal being beaten to a bloody pulp, people playing with fetuses, someone chopping off a cat’s face with a hatchet — much of which takes place precisely because the perpetrators know the social media platforms will provide them large and ready audiences. It may be hateful, and false, and violent — but the trolls know, if it engages, it plays. The business model(s) have spoken.
Lester, a Filipino with an engineering degree, was another of these front-line warriors. He was part of the never-ending ‘whack-a-mole’ war against the trolls and other haters who find a welcome haven in Facebook, Twitter, YouTube, Google, and Instagram. But now the war goes on without him. Eventually, as with many others, PTSD symptoms forced Lester off the battlefield.
When asked to estimate the number of violent images and videos he saw over the course of eight months reviewing Twitter and YouTube content, he answered, “too many to count.” One unforgettable video still haunts him. It showed a group of men dragging another man into a forest and repeatedly slashing his throat with a large knife until blood covered the lens. Still, this was just one of several murders he reviewed every month.
Lester now works in a call center selling life insurance. He encourages his former colleagues to quit content moderation work. “People think, because we’re Filipinos, we are happy people, we can adapt. But this stays in our heads forever.”
Chloe, Jerome, Lester — they’re not just first responders in the war for the soul of the internet, they’re the very real casualties of that war. And their wounds come not just from the trolls who so viciously pollute our public forums, but also from the companies that architect their algorithms to prize engagement above all else.