Facebook content monitors have their work cut out for them



If you’ve ever spent time on Facebook, you know that you’re likely to see everything from cat videos to police pursuits. But recent events — including a live-streamed murder and a suicide broadcast over the social media website — have prompted the Menlo Park-based company to hire 3,000 more reviewers. Their job is to monitor and quickly remove content that’s deemed too sexually graphic, too violent, too racist or too just about anything else.

• RELATED STORY: Southern California actor streams his suicide live on Facebook

So who, exactly, are these people and how does this work? Here are some answers:

• The company has yet to indicate whether reviewers will be full-time employees or contractors. Nor has it revealed how Facebook reviewers will be screened.

• A reviewer’s job is to work quickly through the bottleneck that often creates a gap between the time questionable content is reported and the time it is taken down.

• Facebook founder Mark Zuckerberg plans to make it simpler to report problems to the company, faster for reviewers to determine which posts violate Facebook standards and easier for reviewers to contact law enforcement if someone needs help.

• The work is increasingly being done in the Philippines because workers there are familiar with American culture but can be hired at a fraction of U.S. wages.

• While a large amount of content review takes place overseas, much is still done in the U.S., often by young college graduates.

• Most Facebook content isn’t questionable. But with more than 1.8 billion active users (as of the fourth quarter of 2016), there is plenty to wade through.
