As a content moderator for Facebook, 27-year-old Sarah Katz saw anti-Semitic speech, bestiality photos and even a video of what “seemed to be a girl and boy told by an adult off-screen to have sexual contact with each other.” She reviewed as many as 8,000 posts a day and had to sign a waiver warning her about what she would encounter. She received little training in how to handle what she was seeing, and her main coping mechanism was turning around to commiserate with coworkers at Facebook’s headquarters campus in Menlo Park, California.

But Katz wasn’t a full-time staff member of the social media platform. Instead, she was “hired by a staffing company that works for another company that in turn provides thousands of outside workers to the social network,” writes The Wall Street Journal. Katz earned $24 an hour.

Facebook receives more than a million user reports of potentially objectionable content a day. That volume has made Katz’s job one of the fastest-growing in the technology world, but also perhaps the most grueling. Humans are still the first line of defense in deciding what does and doesn’t belong on the internet. Companies like Facebook and YouTube are trying to develop algorithms to do the work, but they are nowhere near replacing people. Facebook will have 7,500 content reviewers by the end of this year, and in 2018 it wants to double the number of employees and contractors who handle safety and security issues to 20,000.