Facebook pulled down more than 3 billion fake accounts from October to March, according to a report released Thursday by the social network.
That’s a record number of fake-account takedowns by the world’s largest social network, illustrating the challenges Facebook faces as it tries to police hate speech, nudity and other offensive content that flows through its site.
The company estimates about 5% of its monthly active users are bogus. About 2.38 billion people worldwide log in to Facebook every month.
“For fake accounts, the amount of accounts we took action on increased due to automated attacks by bad actors who attempt to create large volumes of accounts at one time,” Guy Rosen, Facebook’s vice president for integrity, said in a blog post.
Rosen said during a conference call that a large number of the fake accounts were created by spammers trying to evade the social network’s detection.
In the six months prior to October, Facebook pulled down about 1.5 billion fake accounts. The company said it remains “confident” that most of the activity and people on Facebook are real. It removes the fake accounts before users are exposed to them, the company said in a separate blog post. The social network caught most of these fake accounts quickly enough that they never became “active” and weren’t counted as part of Facebook’s overall user numbers, the company said.
Though fake accounts might be abusive, they also include “user-misclassified accounts,” such as when someone sets up a profile instead of a Facebook Page for a pet. A Facebook Page is similar to a profile but is used for, among other things, businesses, public figures, organizations and pets.
For the first time, the company released data about how much of the removed content inspired appeals from users and how much was restored as a result. The report also included new information about the volume of posts the company took action against for attempting to sell products that aren’t allowed on the platform, such as drugs and firearms.
The company took action on 1.4 million pieces of content that tried to sell guns and 1.5 million pieces of content that tried to sell drugs.
Facebook’s report comes as it’s trying to set up an independent oversight board that’ll decide what content gets removed or stays up on the social network after a user appeals. The social network has rules against hate speech, nudity, violence and other offensive content. But conservatives allege that Facebook is censoring conservative voices, which the social network has repeatedly denied doing.
Meanwhile, Facebook could face a record fine of up to $5 billion from the Federal Trade Commission, the US agency that’s investigating the social network’s alleged privacy mishaps. Lawmakers and even some of the company’s own co-founders are asking US regulators to break up the company. CEO Mark Zuckerberg has rejected the idea, but he’s said he’s open to regulation, including around content moderation.
At the same time, Facebook is doubling down on messaging, groups and ephemeral content as users share more privately. That shift could make it harder for the company to detect harmful content as it tries to balance safety with privacy, Zuckerberg said during the conference call.
“It’s not clear on a lot of these fronts that we’re going to be able to do as good of a job on identifying harmful content as we can today,” Zuckerberg said.
On Thursday the Facebook Data Transparency Advisory Group, or DTAG, an independent group of experts established last year, also released its review of how Facebook enforces and reports on its community standards. Overall, the advisory group found that Facebook’s system for enforcing its community standards and its review process — which includes a combination of automated and human review — is well designed.
The group still made 15 recommendations for the site, which Facebook said fell into three categories. DTAG asked for more metrics showing the social media site’s efforts to enforce its policies, including how accurate that enforcement is and how often people disagree with Facebook’s decisions. Facebook should also better explain its current metrics — what type of violation is most common, how much content is removed and more. The group also wants Facebook to make it easier for users to stay up to date on policy changes and to let them “have a greater voice” in deciding what content isn’t OK on the site.
“It’s important that we aren’t grading our own homework here,” Zuckerberg said.
Originally published May 23 at 9:19 a.m. PT
Updates, 9:43 a.m.: Includes more background about Facebook’s community standards enforcement report; 9:58 a.m.: Adds more background about fake accounts; 12:13 p.m.: Includes remarks from conference call and more background.