The Gatekeepers of Facebook

Discussions about social media censorship were raging long before Mark Zuckerberg appeared before Congress to testify about Facebook. Figures like President Trump have brought to the national stage claims that social media giants are disproportionately censoring conservative voices. What we should all be asking, though, is: who exactly are the people censoring us?

Last month, employees contracted by Facebook to perform content moderation broke their non-disclosure agreements (NDAs) to speak with Casey Newton, who then published a 7,000-word pseudo-exposé on The Verge.

While the purpose of Newton’s article was to portray Facebook as running a de facto labor camp through contractors like Cognizant, using language like “sweatshop” and claiming that moderators “fear for their lives,” he missed the most important aspect. Facebook moderators have the biggest impact on each user’s Facebook experience, arguably more than designers, developers, and IT security. Yet they are hired, trained, and paid as though their job function were inconsequential.

For perhaps the first time, a profile of the average Facebook moderator has emerged, and it is disappointing. The employees who broke their NDAs to speak with Newton complained about problems found in most low-skilled work environments. Newton’s elite life didn’t prepare him for the reality of dirty bathrooms, demanding bosses, conflicts over office relationships, and productivity quotas that the employees found difficult to reach.

Yet pictures from the article show bright rooms with large windows, clean workspaces, and ergonomic computer chairs. Newton even acknowledges this, describing what he sees as “neither dark nor dingy.”

Though Newton tried to paint a picture of a sweatshop and employee abuse, he seemed to dance around the biggest point of all: Cognizant employees are not qualified or prepared to have this much control over our collective social media experience -- and Facebook knows it.

When Facebook gave Cognizant a two-year, $200,000,000 contract to assist with North American content moderation, it set a goal of a 95% accuracy score. Simply put, when a moderator takes an action on a post, whether censoring it, banning the user, or leaving it on Facebook, roughly one in ten of those decisions is audited by another employee. The accuracy score is the percentage of times the auditor’s decision matches the moderator’s. Cognizant has consistently failed to meet this goal; genuine accuracy scores at its largest site are in the eighty-percent range.
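For readers who want the arithmetic spelled out, here is a minimal sketch, in Python, of how an audit-based accuracy score of this kind could be computed. The function, labels, and sample data are illustrative assumptions made for this article, not Facebook’s or Cognizant’s actual tooling.

```python
import random

def accuracy_score(decisions, audit_fn, sample_rate=0.10, seed=42):
    """Percentage of randomly audited decisions on which the auditor agrees."""
    rng = random.Random(seed)
    audited = [d for d in decisions if rng.random() < sample_rate]
    if not audited:
        return None  # nothing was sampled, so there is no score
    agreements = sum(1 for d in audited if audit_fn(d) == d["action"])
    return 100.0 * agreements / len(audited)

# Hypothetical data: 1,000 moderation decisions, each labeled "remove".
decisions = [{"post_id": i, "action": "remove"} for i in range(1000)]

def auditor(d):
    # The hypothetical second reviewer disagrees on roughly one post in five.
    return "keep" if d["post_id"] % 5 == 0 else "remove"

# Under these assumptions the score lands near 80%, well short of a 95% goal.
print(f"accuracy score: {accuracy_score(decisions, auditor):.0f}%")
```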

In an era when deplatforming and censorship are part of a broad national discourse, an eighty-percent accuracy rate is simply not acceptable, and Facebook’s users should demand better.

I did an audit of my own support inbox on Facebook, which lists every time moderators have made decisions on my posts (see below). The accuracy score on my posts is 30%. The vast majority of my reported posts are overturned when challenged, although some stay in review indefinitely.

I’m no internet troll. I use Facebook for personal reasons: to share my articles and the occasional meme. Yet I find myself constantly being banned, and for ridiculous reasons, like this selfie, which wasn’t overturned and resulted in a 30-day ban:

Employees explain this poor overall performance by saying that Cognizant and Facebook seem more concerned with having a pulse at the desk than with ensuring that the person is right for the job. Some claimed the tasks they would perform were downplayed during the application process, and others were brought onboard believing they would be performing a different job entirely.

Only when they realize that they will be exposed to graphic images, pornography, the thoughts of possibly deranged individuals, and other offensive content for 40 hours a week do they discover that they may not have the resiliency to separate the work from their daily lives. A few even claim to have developed PTSD from the content they have seen.

Other professions require resiliency as well: first responders confront genuine violence in the flesh, a multi-billion-dollar porn industry exists, and thousands of psychiatrists deal daily with unstable people who say offensive things.

However, these other professionals are screened better than the contractors at Cognizant, which is marketing entry-level positions to low-skilled workers who will jump at the opportunity to make almost twice their state’s minimum wage. New hires eagerly fill these seats because the pay is competitive in their local labor market, without considering whether they’re right for the work. And Cognizant, under pressure to fill its available jobs, clearly doesn’t want that responsibility, either.

Cognizant and Facebook seem aware that the wrong people are filling these jobs, as evidenced by the 24/7 mental health hotline, on-site therapists, yoga rooms, armed security, and buckets of Lego blocks for the “extra-hard days.” An internal email showed that managers are focused on unprofessional behaviors like showing up to work in “club attire” and telling inappropriate jokes.

For some strange reason, we collectively don’t consider who the people are that hold this power to control what content we are allowed to see on Facebook. Nor is there a call for a safeguard ensuring that individuals’ beliefs, whether religious, racial, or political, don’t play a part in what is deemed a violation of Community Standards. The number-one reason for employee turnover is a low accuracy rating.

When someone like Paul Joseph Watson or Laura Loomer gets deplatformed, there is a measurable monetary loss, and it is assumed that an executive somewhere in corporate Facebook has weighed many factors before reaching that decision. But when ordinary people get 30-day bans or have their accounts deleted altogether, often for posts that don’t actually violate Facebook’s Community Standards, they lose the connections and events that most of us use Facebook for in the first place. Under a ban, users cannot comment, react, send personal messages, or use most of the application’s other features, and that can genuinely interfere with their lives as we grow more dependent on social media for news, personal relationships, networking, and events.

Facebook could easily address part of this by automatically blocking a reporting user from the person whose post he reported. If someone is offended by another person’s content to the point of reporting it, he shouldn’t retain access to that poster, which would prevent both being triggered again and possible reporting abuse. Cognizant, meanwhile, should be keeping track of possible harassment by its own employees.

Moderators may pass the normal background checks run by other employers, but Cognizant must be able to pre-screen applicants’ ability to handle reported content before putting them to work. Its contract may not pay enough to bring in the people who would be best for the job, but there must be some way to gauge whether a candidate can maintain an acceptable accuracy rating without developing PTSD.

Connect with Taylor Day on Facebook and Twitter @TABYTCHI