|Who would have thought that things would get so bad we’d forget about Covid? But Covid hasn’t forgotten about us. Keep washing your hands.|
|The Plain View
Mark Zuckerberg has long said he doesn’t want to be the arbiter of truth. You don’t want me to be a censor for billions of people, he argues. He has tried to mitigate his own dominant role in determining whether a questionable piece of content stays up or gets taken down, even setting up an Oversight Board that could potentially overrule him.
But despite all his protestations, Zuckerberg is not only the arbiter in chief of the world’s dominant social media platform, he’s an active one. That was never more clear than in the nearly two-hour remote session he had with thousands of concerned employees on Tuesday, when he defended his decision not to take down, mitigate, or fact-check several posts by Donald Trump that seemed, in the eyes of employees, to violate Facebook’s policies. In a transcript of the session—the leak of an internal meeting was once an unthinkable act of disloyalty at Facebook, but now it’s an inevitability—Zuckerberg talks in detail about how he consulted with key aides and painstakingly analyzed his community standards, all to make the final call himself. In this case, he decided that Trump’s use of the phrase “When the looting starts, the shooting starts” was not a call to violence or a racist “dog whistle,” despite arguments to the contrary.
The drama was heightened by two factors. First, the internal opposition to Zuckerberg’s choices was unprecedented, as employees publicly tweeted their displeasure and staged a “virtual walkout” on Monday. Some even quit the company. Also, a group of the company’s earliest employees published a letter lamenting Facebook’s departure from its original ideals. As I wrote earlier this week, what bothered them was not just the two tweets Trump had cross-posted to Facebook. The frustration came from the fact that, for years now, the “free expression” Zuckerberg celebrates has meant hosting misinformation, hate, and divisiveness.
The second factor is an external threat: a movement to tamper with or repeal the legislation that gives Zuckerberg the power to make those decisions without taking legal responsibility for everything that his almost 3 billion users post. That law is known as Section 230 of the Communications Decency Act, part of the 1996 Telecommunications Act. It frees platforms like Facebook and Twitter of liability for what people share, distinguishing them from publishers like The New York Times or WIRED. But it also gives platforms the editorial discretion to police content to keep their services safe and civil. In reaction to the power of big tech companies, some politicians are arguing that platforms should be treated more like publications than, say, phone lines. One is Donald Trump, who last week issued an executive order dictating that the government should strip platforms of that sanctuary status if they're deemed politically biased. Another declared foe of Section 230 is Joe Biden, though he hasn't called for a government truth squad like Trump has.
Zuckerberg’s decision on the president’s posts wasn’t affected by Trump’s threatened executive order, but it certainly favored Trump and the conservative cause. More significantly, it was well in keeping with Facebook’s tendency to allow and even promote content that divides and inflames. Zuckerberg tried to contextualize this for his employees, saying that while his free-expression tilt might allow toxic content to thrive, it also gives voice to the powerless, allowing them to post things like video evidence of police brutality. “I would urge people not to look at the moral impact of what we do just through the lens of harm and mitigation,” he told employees.
At Twitter, though, CEO Jack Dorsey did look at Donald Trump's tweets through that lens. After too long a period of keeping his hands off Trump's discordant content, he ordered that Twitter tag two disputed tweets. And Snap's CEO Evan Spiegel went even further, removing Trump's posts from the Discover section of the platform, on the grounds that the president's words are divisive and racist. In a letter to employees Spiegel explained:
As for Snapchat, we simply cannot promote accounts in America that are linked to people who incite racial violence, whether they do so on or off our platform. Our Discover content platform is a curated platform, where we decide what we promote … This does not mean that we will remove content that people disagree with, or accounts that are insensitive to some people … But there is simply no room for debate in our country about the value of human life and the importance of a constant struggle for freedom, equality, and justice. We are standing with all those who stand for peace, love, and justice and we will use our platform to promote good rather than evil.
Trump supporters—and certainly Trump himself—might complain about what Twitter and Snap did. But the companies are exercising their rights under Section 230 exactly as the law permits.
Zuckerberg should take note. Yes, it's crazy for one person to have such massive control over what people say online. But like it or not, our system gives the leaders of huge corporations massive power. In his total control of Facebook, he must be the arbiter not of truth but of harm. We must demand that he perform that role in the best possible way, minimizing the toxic speech posted by his customers, whether they are peons or presidents. His employees are speaking out. His billions of users should let him know as well. And the government should back off.
Thirteen years ago, I wrote about Facebook for a Newsweek cover story. Only a few months earlier, the company had welcomed all users, not just students, and introduced the News Feed. Its CEO, not yet a billionaire, explained to me why adults would take to the service, and that he wasn’t in it for the money:
Zuckerberg himself, whose baby-faced looks at 23 would lead any bartender in America to scrutinize his driver’s license carefully before serving a mojito, eschews talk about money. It’s all about building the company. Speaking with Newsweek between bites of a tofu snack, he is much more interested in explaining why Facebook is (1) not a social-networking site but a “utility,” a tool to facilitate the information flow between users and their compatriots, family members and professional connections, (2) not just for college students, and (3) a world-changing idea of unlimited potential. Every so often he drifts back to no. 2 again, just for good measure. But the nub of his vision revolves around a concept he calls the “social graph.”
|Ask Me One Thing
Rob of Durango, Colorado, asks, “Is there any social media platform that has reasonable speech standards which it enforces? If so, I’ll go there.”
Rob, the key word in your question is "reasonable." One person's reasonable is another's outrage. In the "Ask Levy" inbox this week, some people were calling me names for liberal bias while others asked why journalists aren't screaming loudly enough about Donald Trump's unfitness for office. For instance, if Facebook takes down content charging that vaccines cause autism—a claim science disputes—some people are going to consider it unreasonable. As this week's Plain View essay notes, speech standards definitely require judgment. If users don't think the standards are reasonable, they have to decide whether ditching the platform is worth losing contact with the friends and family who still use it. The accountability comes from the marketplace, the workforce, and the face that the CEO sees when peering into the mirror.
You can submit questions to [email protected]. Write ASK LEVY in the subject line.
|End Times Chronicle
… This time, in a good way! Forgive my hometown bias, but no one who grew up in Philadelphia thought they’d live to see the day when the statue of the late racist, anti-gay, anti-protest police chief and mayor Frank Rizzo came down.
|Last but Not Least
Besides all else, Zuckerberg is a stubborn fellow. That’s why, in my story about disenchanted employees, I correctly predicted he wouldn’t budge on the Trump posts.
A great guide to attending protests safely.
Here’s some background on Section 230.
Adam Rogers deftly decodes the latest study on hydroxychloroquine. Verdict: not miraculous.