
The Worst Job in Tech: Facebook Content Moderators


Let's try a quick thought experiment.

Imagine a job where you're bearing witness to the worst kinds of violence, abusive behavior, conspiracy theories, and fake news.

You're working under pressure with managers tracking every minute of your work.

You're forced to keep your mouth shut, unable to tell your family and friends about what you have to deal with.

Done?

Welcome to the world of Facebook content moderators — one of the worst tech jobs on the planet.

The criticism and the reaction

The world we live in is far from ideal, and social media has made that painfully clear.

It took some time, but the major social media sites like Facebook, YouTube, Instagram, and Twitter acknowledged the need to moderate their content.

Following Mark Zuckerberg's 2017 promise to expand Facebook’s community operations team, the tech company rushed to have as many as 30,000 people working on safety and security around the globe by the end of 2018.

Full-time employees? Just a fraction. The vast majority are contractors hired through large professional services companies like Cognizant, Genpact, or Accenture.

And thanks to The Verge journalist Casey Newton, we now know much more about the working conditions at the Phoenix, Austin, and Tampa content moderation sites.

Image courtesy of TIME

A job that needs to be done

Deciding what does and doesn’t belong online has become one of the fastest-growing jobs in the tech world.

But perhaps also the most appalling. 

Forget the debates about censorship.


Trying to cope with more than a million reports of potentially objectionable content a day, moderators hunt for pornography, racism, and violence in a torrent of posts and videos flagged by other users or by the Silicon Valley giant's artificial intelligence tools.

They have the power to quickly decide what stays up and what comes down, protecting Facebook's 2.3 billion users from exposure to humanity's darkest impulses. 

Reviewing posts to help other people have a comfortable experience on Facebook may sound heroic, but it comes at a horrifying cost.

Image courtesy of The Verge

Workplace conditions

At first sight, Cognizant's offices look perfectly fine.

But unlike employees at Facebook's Menlo Park headquarters, who work in an ultra-modern, airy building, most content moderators don't even have a permanent desk.

All they get to step away from their screens are two 15-minute breaks and a 30-minute lunch. Contractors also get nine extra minutes of so-called wellness time per day that they can use when they feel overwhelmed by the emotional toll of the job.

The workplace is a hellish environment of nonstop stress and inadequate support for employees. It's not unusual for workers to find pubic hair, boogers, fingernails and other bodily waste at their workstations.

It's a place where team leaders micromanage workers' every bathroom and prayer break. Where managers laugh off or ignore sexual harassment and threats of violence.

Having to watch an unbearable load of conspiracy videos and memes leads some workers to adopt fringe views. After just a few months on the job, some come to embrace the beliefs of Flat Earthers and Holocaust deniers, or stop believing that 9/11 was a terrorist attack.


The hideous content

The posts and videos that need to be moderated record atrocious acts of brutality, and viewing them can be extremely detrimental to physical and mental health.

Content moderators reported having to watch people throwing puppies into a raging river. Putting lit fireworks in dogs’ mouths. Chopping off a cat's face with a hatchet. Mutilating the genitals of a live mouse. Or even throwing animals into microwaves. 

Others described scenes where people were playing with human fetuses, encouraging toddlers to smoke marijuana, committing suicide, or performing beheadings.


Facing the consequences

There are 15,000 content reviewers working around the clock, evaluating posts in more than 50 languages at more than 20 sites around the world.

What they have in common is that they all have to sign a non-disclosure agreement in which they pledge not to share the details of their work.

And that's really hard, because the job takes an immense emotional toll. The inability to discuss the stresses and strains with their loved ones leads to increased feelings of isolation and anxiety.

Perpetually teetering on the brink of chaos and plagued by insane turnover rates, it's a workplace where you can get fired for making just a few mistakes a week. Where you can't help but fear former colleagues who may come back seeking vengeance.

In stark contrast to the perks and benefits lavished on Facebook employees, your time is managed down to the second and you're never allowed to forget how easily replaceable you are.

What's more, the regular exposure to disturbing content leads to high levels of burnout and attrition on the job.

Moderators are crying. Throwing up. Breaking down. Making dark jokes about committing suicide. Smoking weed during breaks. Having sex in the lactation room. Doing just about anything to numb their emotions and satiate that desperate need for a dopamine rush.

And this doesn't stop when they quit. Trauma symptoms haunt them long after they leave the job. They can't sleep for more than two or three hours a night and are often diagnosed with post-traumatic stress disorder and related conditions.


Worth the money?

Compared to the average Facebook employee, whose total compensation comes to $240,000, content moderators who work for Cognizant make just $28,800 per year, which barely beats the federal poverty level for a family of four.

At $12 to $15 per hour, however, the job still looks more attractive than most retail work to people laboring for minimum wage or to recent college graduates with no other immediate prospects.

But do they really know what's at stake? Facebook content moderation has the power to threaten workers' sanity and may, ultimately, cost them their lives.

And that's the very sad price the rest of the world pays for a relatively safe social media space.

With a competitive resume, you won't have to settle for a job you don't love. Before crafting one, check out our extensive collection of resume samples covering a wide range of positions.

  • Jakub Kaprál
    Career & Resume Writer
    Jakub Kapral is a former professional linguist and a career writer at Kickresume. He has written almost 100 diligently researched resume advice articles and his texts are visited by thousands of people every month. Jakub is a natural teacher who looks to help those who want to enhance their career prospects. He's also an avid drummer and a proud father of two.

