
Facebook Reveals New Playbook For Fighting Offensive Content

May 15, 2018 - 6:55 pm

MENLO PARK, Calif. (WCBS 880/CBS News/AP) -- Facebook is pulling back the curtain on just how it tries to keep offensive content from popping up on users' timelines.

For years, Facebook has relied on users to report offensive and threatening content. Now, it's implementing a new playbook and releasing the findings of its internal audits twice a year, CNET's Jason Parker reports.

Facebook released its Community Standards Enforcement Preliminary Report on Tuesday, providing a look at the social network's methods for tracking content that violates its standards, how it responds to those violations, and how much content the company has recently removed.

The report details Facebook's enforcement efforts from October 2017 to March 2018 and covers hate speech, fake accounts and spam, terrorist propaganda, graphic violence, and adult nudity and sexual activity.

Here are a few key takeaways:

  • Facebook disabled about 583 million fake accounts and took down 837 million "pieces of spam" in the first quarter of 2018.
  • Facebook says its technology "still doesn't work that well" when it comes to hate speech.
  • 21 million "pieces of adult nudity and sexual activity" were taken down in Q1 2018.
  • In Q1 2018, Facebook removed 3.5 million pieces of violent content, 86 percent of which were identified by the company's technology.

In a blog post Tuesday about the newly released report, Facebook's vice president of product management, Guy Rosen, said that almost all of the 837 million spam posts Facebook took down in the first quarter of 2018 were found by Facebook before anyone had reported them. He said removing fake accounts is the key to combating that type of content.

Most of the 583 million fake accounts Facebook disabled in Q1 were caught "within minutes of registration."

"This is in addition to the millions of fake account attempts we prevent daily from ever registering with Facebook," Rosen said in the post, noting that "most of the action we take to remove bad content is around spam and the fake accounts they use to distribute it." 

Facebook is taking a number of different preventive approaches, Wall Street Journal tech reporter Deepa Seetharaman told WCBS 880's Steve Scott. Among them is hiring thousands of content moderators.

“By the end of the year, they want 20,000 people working on security issues, broadly. That includes content reviewers. So they’re definitely throwing a lot of bodies at it. They’re also working on the engineering side,” Seetharaman said. “Facebook is a massive platform. You don’t have enough people in the world who can look at every single piece of content that appears on Facebook, so you have to rely on engineering, and so they have – there are a lot of resources applied to figuring out how to see this content automatically, through artificial intelligence or other types of detection tools.”

The report comes amid increasing criticism of how Facebook controls the content it shows to users, though the company was careful to note that its new methods are evolving and aren't set in stone, CNET's Parker reports.

The information from Facebook comes a few weeks after the company unveiled internal guidelines about what is -- and isn't -- allowed on the social network. Last week, Alex Schultz, the company's vice president of growth, and Rosen walked reporters through exactly how the company measures violations and how it intends to deal with them. 

Facebook's response to extreme content is particularly important given that the company has come under intense scrutiny amid reports of governments and private organizations using the platform for disinformation campaigns and propaganda. Most recently, the scandal involving digital consultancy Cambridge Analytica, which allegedly improperly accessed the data of up to 87 million Facebook users, put the company's content moderation into the spotlight.

(© 2018 WCBS 880. CBS News and The Associated Press contributed to this report.)