Future Tense

Want a Terrible Job? Facebook and Google May Be Hiring.

There’s nothing fun about scrubbing filth from Facebook. Paul Marotta/Getty Images

Some of the world’s biggest companies spent much of 2017 cleaning up their own messes. Facebook tried and failed to fix its fake news problem while it simultaneously reckoned with a flood of Russian government–backed agents masquerading as advocacy organizations in order to push socially divisive political messages. Facebook and Google had to apologize after it was revealed that advertisers could use their platforms to target people based on racist and bigoted interests—like “threesome rape” and “Jews ruin the world.” Google, Facebook, and Twitter all sat through multiple congressional hearings last year to discuss how Kremlin agents used and abused their platforms to manipulate Americans.

One response to these problems has been to hire more humans to vet content. On Tuesday, YouTube said it would be doing just that in response to advertiser complaints that their ads were being placed alongside videos promoting hate speech, violence, or terrorism. YouTube says humans will screen every video on channels included in Google Preferred, a program meant to give advertisers a selection of YouTube’s most popular content to run ads against.

It’s unclear whether those YouTube workers will be among the 10,000 hires Google said in December 2017 it would make to help weed out the troubling content being monetized on its video service. In November 2017, for example, the company removed more than 150,000 videos of children in distressing, violent, or sexually suggestive situations, and YouTube says that since June it has removed ads from millions of videos of kid-friendly characters engaged in violence.

Facebook announced in October 2017 that it was hiring 1,000 more people to work on vetting and taking down problematic ads, like ones bought by a foreign government to sway an election with politically polarizing content. Those 1,000 hires are on top of the 3,000 additional people Facebook CEO Mark Zuckerberg said in May 2017 the company would bring on after a string of harmful videos was posted to Facebook, including livestreamed teen suicides and videos of murders.

And at the end of October, Facebook said that by the end of 2018, it would hire 10,000 more people, including contractors, to double the staff working on improving its community safety. (It’s unclear whether the 4,000 additional hires for ad vetting and community moderation count toward that 10,000-person community-safety push.) Facebook currently has more than 23,000 employees, so these numbers are no drop in the bucket.

Overall, it’s probably a good thing that major tech companies are increasingly aware that algorithms alone can’t solve these very real human-created problems. It was algorithms, after all, that led Facebook’s and Google’s ad platforms to suggest targeting options that appeal to racists. If a person had been somewhere in that process, those ads probably wouldn’t have sailed through.

The problem, though, is that these content moderators are typically given confusing instructions and a tremendous amount of work to do in a short amount of time, often with only a few seconds to determine whether a post violates the company’s community guidelines. One woman who worked for a staffing company contracted by Facebook told the Wall Street Journal in December 2017 that she had to vet as many as 8,000 posts a day, rife with hate speech, videos of possible sexual exploitation, and violence. It’s not an easy job, and what’s worse, the positions are commonly filled by contractors who don’t get the same famously lavish perks as the company’s full-time employees, like unlimited sick days, gym classes, stock options, and laundry service.

If Facebook and Google are going to expand their content-moderation and community-safety teams as much as they claim, bringing in tens of thousands of people to put out their trash fires, the companies should make sure the people they hire know what they’re doing—and that they are well taken care of. Unless they want a replay of 2017, that is.