Facebook could restrict content to stop violence around presidential election

Facebook says it could aggressively restrict content if the US presidential election sparks violent unrest, according to the Financial Times. Global affairs head Nick Clegg told the FT that Facebook was looking at “some break-glass options available to us if there really is an extremely chaotic and, worse still, violent set of circumstances.”

Clegg didn’t say what those options were. But he mentioned Facebook’s past use of “pretty exceptional measures to significantly restrict the circulation of content on our platform,” deployed in countries experiencing “real civic instability.” An unnamed source said the company had modeled 70 possible election outcomes and how to respond to each, drawing on staff who include “world-class military scenario planners.”

Facebook (among other social networks) has tried to preempt concerns about misinformation, election meddling, and potential calls to violence around the presidential election. It announced in early September that it would stop accepting new political ads in the week before Election Day, and it’s promoting its own Voter Information Center with authoritative information about how to vote.

Facebook will also place an informational label on posts that cast doubt on the election’s outcome or prematurely declare victory — an issue that could crop up if large numbers of people vote by mail due to the COVID-19 pandemic, particularly because President Donald Trump has baselessly claimed that mail-in votes are fraudulent.

The company’s efforts also extend beyond American politics, including a push to detect and remove hate speech before Myanmar’s elections this fall.

However, Facebook has repeatedly failed to restrict content that promotes violence or misinformation. A recent New York Times report found that QAnon conspiracy theorists flourished on the platform despite attempts at a crackdown. Its focus on Myanmar comes after military officials used Facebook to foment genocidal violence against the country’s Rohingya minority. And it did not remove an event page created by a self-proclaimed militia group, despite users accurately warning that it might lead to violence, a failure CEO Mark Zuckerberg later called an “operational mistake.”
