Sarah Friar, CEO of the neighborhood-focused social network Nextdoor, says the company is to blame following widespread reports that moderators were deleting posts that discussed racial injustice or voiced support for the Black Lives Matter movement. The app will change its policies to explicitly allow discussion of the movement, and will offer new unconscious bias training to its unpaid moderators.
In an interview with NPR, Friar said it “was really our fault” these posts were removed, and blamed the actions taken by the company’s unpaid moderators (known as “leads”) on a moderation policy that prohibited discussion of national issues in the app’s local groups.
“We did not move quickly enough to tell our leads that topics like Black Lives Matter were local in terms of their relevance,” said Friar. “A lot of our leads viewed Black Lives Matter as a national issue that was happening. And so, they removed that content, thinking it was consistent with our guidelines.”
According to NPR, a new rule has been added to Nextdoor’s moderation policy to ensure such discussions are not deleted in future: “Black Lives Matter is a local topic.”
Nextdoor has long been the subject of jokes and criticism for its so-called “Karen problem” – shorthand for an abundance of white users who take to the app to complain about trivial issues, from children giggling too much to neighbors who won’t stop brushing their cat.
Behind the memes, though, has always been the more unsettling truth that Nextdoor allows racism to thrive on its platform by taking a hands-off approach to moderation. The company has grown so fast in part because it relies on its own users to remove contentious posts. But Black users say this has created an environment that tolerates racism.
The same “Karens” who get annoyed about noisy children can also be the users who racially profile Black people in their neighborhood and call the police on any “suspicious teens” they see (who are invariably people of color). Nextdoor has arguably exacerbated these problems by offering features like “Forward to the Police,” which let users quickly send an “urgent alert” to law enforcement. That tool was removed last month.
Racism fostered on Nextdoor’s platform has attracted new attention after the police killing of George Floyd and subsequent protests against racial injustice swept across the US. Black users who tried to discuss these issues on Nextdoor found they were silenced and their posts deleted. As one user told The Verge last month: “As a black person, I don’t feel safe at all using [the app] for anything … I’m always terrified, thinking ‘Oh my god. I already know what so-and-so thinks of us.’ This is a very horrible situation to be in.”
In addition to changing its moderation policy, Nextdoor says it’s starting a campaign to recruit more Black moderators, and will offer unconscious bias training to all current leads (though it’s not clear whether this training is mandatory). The company says it will also improve the app’s AI systems to more accurately identify explicit racism and “coded racist content.”
However, while AI has been presented as a solution by many social networks criticized for allowing racist or bigoted content on their platforms, experts generally agree that automated systems lack the contextual understanding needed to moderate this content. As Facebook’s repeated struggles with automated moderation under Mark Zuckerberg have shown, AI is no solution to the human problems of moderation.
“We’re really working hard to make sure racist statements don’t end up in the main news feed, making sure that users that don’t act out the guidelines aren’t on the platform anymore,” Friar told NPR. “It is our No. 1 priority at the company to make sure Nextdoor is not a platform where racism survives.”
Groups who have called on Nextdoor to take responsibility for the actions of its moderators welcomed the changes, but expressed caution about the impact they might have.
“This is a positive step towards creating a true community forum where all people in our neighborhoods feel safe to participate,” activist Andrea Cervone of the Minneapolis-based organization Neighbors for More Neighbors, which petitioned the company to introduce anti-racism training for moderators, told NPR. “We will be keeping an eye on the company to make sure they continue forward and fulfill these public commitments.”