Facebook CEO Mark Zuckerberg tried to explain on Friday why his company waited a day to remove the account of a self-proclaimed militia group that called for violence against protesters in Kenosha, Wis.
The company is facing intense backlash from both users and employees after 17-year-old Kyle Rittenhouse shot and killed two people and wounded a third at a protest in Kenosha on Tuesday. Protests began in the city after police shot Jacob Blake, a 29-year-old unarmed Black man, seven times in the back.
But before Rittenhouse took his rifle to the streets of Kenosha, a self-proclaimed militia group had already formed on Facebook. The group, called Kenosha Guard, created an event on Facebook, which it used to make a call to arms before the protests.
Facebook users then flagged the group and its event to Facebook so that it could be removed, but the reports were ignored until Wednesday, after three people had already been shot. Facebook employees reportedly slammed the company’s handling of the issue at an internal Q&A with Zuckerberg.
Here are three things Zuckerberg said in a video posted on Facebook on Friday about how the company handled the problem and what it intends to do in the future.
Zuckerberg said that the company had designated the shooting as a “mass murder,” a distinction the company used to remove Rittenhouse’s account from Facebook and Instagram.
Following a review of that account, Zuckerberg said Facebook couldn’t see any signs of premeditation or discussion that revealed Rittenhouse’s plans to open fire. Zuckerberg also said Facebook had found no evidence the shooter was following the Kenosha Guard page or that he was connected to the event the group had created.
Why it failed
Still, Zuckerberg said the Kenosha Guard page and the event the group had posted violated a policy Facebook had put in place a couple of weeks earlier banning militia and QAnon groups that may intend to organize violence. But Facebook failed to remove the Kenosha Guard page, even though it had been reported by several users, owing to an “operational mistake,” Zuckerberg said.
“The contractors and reviewers who the initial complaints were funneled to basically didn’t pick this up,” he said. “On second review, doing it more sensitively, the team that is responsible for dangerous organizations recognized that this violated the policies, and we took it down.”
Facebook is looking for and removing a deluge of posts that praise the shooting or the shooter, Zuckerberg said. The company is also obscuring “disturbing imagery” related to the shooting, as is its policy for all graphic images.
Zuckerberg said the company is trying to better enforce its current policies, as well as adjust them to cover a wider range of issues, so that it can identify more dangerous organizations—something it failed to do this time around.
“This shows that there is a…continued increased risk through the election during this very sensitive, polarized, and just highly charged time,” Zuckerberg said.