With more than 2 billion users stretching across every continent and country, Facebook supports a larger community than any nation-state on earth, but it isn’t built atop any of the political principles that make nation-states work. It’s become the world’s dominant media organization, but it refuses to call itself a media company or take on the responsibilities of a traditional publisher; it frames itself around a social mission, but it’s a for-profit organization that is relentless in its pursuit of growth.
The ambiguity over what Facebook is, and thus how it should be governed, is at the core of the latest Facebook controversy. On Wednesday, CEO Mark Zuckerberg gave an interview to Kara Swisher of Recode. In it, he offered the example of Holocaust denialism as an idea that may be wrong but should be permitted to exist on Facebook:
I’m Jewish, and there’s a set of people who deny that the Holocaust happened. I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong.
The outrage was swift and overwhelming. Zuckerberg quickly, and somewhat ridiculously, apologized. “I personally find Holocaust denial deeply offensive, and I absolutely didn’t intend to defend the intent of people who deny that,” he said.
But Zuckerberg was always clear about his loathing of Holocaust denial. The issue isn’t what Zuckerberg thinks; it’s what Facebook does — or doesn’t do.
Zuckerberg’s quote has been pulled out of context and sent ricocheting around the world, but how he got to discussing Holocaust denial is important, so let’s retrace the argument.
Swisher asked Zuckerberg about Facebook’s permissive attitude toward Infowars, Alex Jones’s conspiracy-oriented publication. “Make the case for keeping them, and make the case for not allowing them to be distributed by you,” she challenged.
Zuckerberg responded by trying to draw a distinction between two of Facebook’s many, many, many priorities. There’s “giving people a voice” and then there’s “keeping the community safe.” Facebook, Zuckerberg said, takes a hard line on keeping its community safe. “The principles that we have on what we remove from the service are: If it’s going to result in real harm, real physical harm, or if you’re attacking individuals, then that content shouldn’t be on the platform.”
But when it comes to giving people a voice, Facebook intends to be permissive. “The approach that we’ve taken to false news is not to say: You can’t say something wrong on the internet,” Zuckerberg continued. “I think that that would be too extreme. Everyone gets things wrong, and if we were taking down people’s accounts when they got a few things wrong, then that would be a hard world for giving people a voice and saying that you care about that.”
The distinction Zuckerberg is drawing here is not so clean as he suggests. Indeed, it may not exist at all. He’s defining threats to safety as direct incitement to violence or panic — ordering a hit, organizing a riot, yelling “fire” in a crowded theater. In contrast, he’s defining false or hateful rhetoric as mere self-expression — upsetting, offensive, but not dangerous.
Publishing the anti-Semitic text The Protocols of the Elders of Zion would, in this construction, fall under the category of “being wrong on the internet,” but popularizing The Protocols of the Elders of Zion has led to violence again and again and again. The Pizzagate theory claiming a child sex ring was being run by Democratic politicians out of a DC pizza shop was an absurd conspiracy, but it led to someone showing up at the pizza shop with a rifle.
Violence requires context, and ideas and information create that context. Fake news and hateful rhetoric may stop just short of direct incitement to violence, but they’re the dry tinder that makes someone else’s call to violence catch fire.
Among the many conspiracies Infowars has promoted is the idea that the Sandy Hook shooting didn’t happen — and that idea has led to the harassment of the bereaved parents of murdered children, as Infowars’ viewers try to expose them as actors. Swisher pressed Zuckerberg on exactly that in her follow-up.
“‘Sandy Hook didn’t happen’ is not a debate,” said Swisher. “It is false. You can’t just take that down?”
This is the point where Zuckerberg tried to take the spotlight off the specific question of Infowars and its sins and move the discussion onto stronger ground by giving the hypothetical, but more personal, example of the Holocaust. It is a moment that will give his PR team nightmares for years. But it reveals quite a bit about how Zuckerberg is thinking about the organization he controls.
What is Facebook being when it lets Infowars, or Holocaust deniers, peddle their conspiracies on the site? There are many options here, but let’s consider three:
1) Facebook is an “open platform”: This is the oldest and most common theory of social networks — that they’re neutral spaces where anyone can speak. To some degree, that’s even true: Almost anyone can sign up for a Facebook account and blast their missives to friends around the world. In this theory, Facebook can’t start making decisions about which content to permit, because then it would be implicitly endorsing all the content it permits.
But at this point in Facebook’s evolution, the “open platform” excuse has long since lost its power. Facebook is making critical choices all the time: the visibility of posts is driven by Facebook’s News Feed algorithms, content is governed by Facebook’s code of conduct, and a publisher like Infowars uses a different kind of Facebook page altogether.
2) Facebook is a publisher: If Facebook is a publisher, the way Vox Media or MTV or Condé Nast is, then it bears responsibility for what it publishes. And sometimes Facebook clearly is a publisher: in its new Facebook Watch program, for instance, Facebook is paying other companies to produce video content that Facebook will publish on its new video platform. I don’t expect Facebook to pay Infowars, or any Holocaust deniers, to make a show for Watch.
What’s trickier is whether Facebook is acting as a publisher when it gives corporate entities access to special pages and tools designed to let Facebook host, promote, and advertise against their work. In that case, I think Facebook is acting as a publisher and has a responsibility to see those tools confined to responsible actors, though how to define a responsible actor is difficult, dangerous territory. No one said being a publisher is easy! But so far, Facebook hasn’t taken on this responsibility, at least not to any serious degree.
3) Facebook is a government: With more than 2 billion people using its service, there’s an argument that the thing Facebook is most like is a government. And governments routinely make trade-offs like prizing free speech, knowing that much of that speech will be abhorrent and even dangerous, recognizing that the gains of open expression are ultimately worth it. A government — at least the US government — would make the decision Zuckerberg is struggling with crisply: Both Infowars and Holocaust denial are permitted because the consequences of their prohibition are more dangerous than their presence.
This is, I think, the closest thing to the model Zuckerberg is mentally operating under — he’s said, in the past, that Facebook is much like a government now, and in the April interview he did with me, Zuckerberg talked in detail about the government-like structures he wanted to build within Facebook:
With a community of more than 2 billion people all around the world, in every different country, where there are wildly different social and cultural norms, it’s just not clear to me that us sitting in an office here in California are best placed to always determine what the policies should be for people all around the world. And I’ve been working on and thinking through: How can you set up a more democratic or community-oriented process that reflects the values of people around the world?
But the problem with treating Facebook like a government is that it’s not a government, and it’s certainly not a democratic government.
Governments don’t fund themselves by advertising against user attention and information. And democratic governments, at least, derive legitimacy and accountability through regular elections that decide the top ranks of decision-makers.
Facebook has no similar mechanism, so it can’t claim that the decisions it’s making and the trade-offs it’s permitting are actually the ones the community chose. Indeed, Zuckerberg’s unusual ownership structure means that if Facebook is to be seen as a government, it’s best understood as a monarchy.
But even if Facebook were a democracy, which democracy would it be? In a particularly interesting riff, Zuckerberg notes that America’s emphasis on freedom of speech is not shared even by other free, democratic countries:
The US has a very rich tradition of free speech; it is written into the Constitution, free speech, so here, we have a very strong allergic reaction to trying to regulate that. But in almost every other country in the world, while people generally want as much expression as possible, there’s some notion that something else might be more important than speech; so preventing hate or terrorism or just different things.
So you’re already starting to see this; I mean, there was the hate speech law in Germany. I think that there will be additional laws creating responsibility for social networking, and social companies, and internet companies overall to be more proactive in policing terrorism, or bullying, or hate speech, or different kinds of content. And overall, I think that there are good and bad ways to do that, but my general take is that a lot of that stuff can be pretty reasonable.
So even if we buy the idea that Facebook is government-like, given its global scope, and Zuckerberg’s sympathy for different trade-offs around speech, it’s not clear that these issues become any easier to solve. Governing a polity is hard. Governing an international polity, and doing so while making money off the results, poses unique problems of both legitimacy and operation.
In an exchange that’s gotten much less attention, Swisher asked Zuckerberg how he felt about Facebook’s role in the Rohingya massacres. Zuckerberg’s response is, I think, the key to his thinking about a lot of these issues:
It’s not that every single thing that happens on Facebook is gonna be good. This is humanity. People use tools for good and bad, but I think that we have a clear responsibility to make sure that the good is amplified and to do everything we can to mitigate the bad.
“This is humanity.” Zuckerberg’s view is that any platform that supports the interactions of 2 billion people will have, at any given moment, some small percentage of those people doing horrible things on it. That’s not a tech problem; it is, as he says, a human problem. You cannot achieve the scale and centrality Facebook wants without becoming a platform for some of humanity’s darker impulses.
The tension is that while Zuckerberg is certain he wants Facebook to have that kind of scale, reach, and openness, the rest of the world really isn’t. That’s not to say they know where the line should be drawn, or who should be empowered to draw it, but Facebook has become too big for it to continue to exist in a state of conceptual ambiguity, where no one, not even its founder, knows quite what it is or how it should be governed.