What can we do about society’s ‘information disorder’?

Jay Owen

“Ethical Markets welcomes this new Commission on Information Disorder, which focuses on the same key issues we cover in ‘Steering Social Media Toward Sanity’ and our forthcoming ‘Enslaving Ourselves to Infotech Speed and Virtual Reality: Progress or Collective Insanity?’ (Dec. 4, 2021).

Hazel Henderson, Editor”

By Mathew Ingram

In January, the Aspen Institute set up a Commission on Information Disorder and announced a star-studded group of participants—including Katie Couric, former global news anchor for Yahoo; Jameel Jaffer, executive director of the Knight First Amendment Institute; Yasmin Green, director of research at Google’s Jigsaw project (who took part in CJR’s symposium on disinformation in 2019); Alex Stamos, founder of the Stanford Internet Observatory; Dr. Safiya Noble, co-founder of UCLA’s Center for Critical Internet Inquiry; and Prince Harry, the Duke of Sussex—to look at solutions to the problem of rampant disinformation. The commission was funded by Craig Newmark, the founder of Craigslist (who is a member of CJR’s Board of Overseers). On Sunday, the group released its final report, with 15 recommended steps that it says governments, technology companies, and others could take to help address the societal problems driven by mis- and disinformation.

In their introduction to the report, the commission’s three co-chairs—Couric, along with Chris Krebs, former director of the Cybersecurity and Infrastructure Security Agency, and Rashad Robinson, president of Color of Change—say information disorder slows down our response time on issues such as climate change, and also “undermines democracy [and] creates a culture in which racist, ethnic, and gender attacks are seen as solutions, not problems.” In the past, they write, there was a belief that the way to fight bad information was simply more good information; however, “in reality, merely elevating truthful content is not nearly enough to change our current course.” In some cases, promoting corrective information about hoaxes or conspiracy theories can actually make the problem worse, as Data & Society researcher Whitney Phillips (now a professor of media studies at Syracuse University) pointed out in her 2018 report, “The Oxygen of Amplification.”

The Aspen report notes that “there is an incentive system in place that manufactures information disorder, and we will not address the problem if we do not take on that system.” Some of the major players in that incentive system, according to the group, are large tech platforms such as Facebook, which it says have “abused customers’ trust, obfuscated important data, and blocked research.” The commission mentions one example CJR has also highlighted: Facebook’s decision to shut down a research project run by New York University scientists by cutting off their access to the platform. “Critical research on disinformation—whether it be the efficacy of digital ads or the various online content moderation policies—is undercut by a lack of access to data and processes,” the report states. Several of its recommendations are aimed at solving this problem, including one that asks the government to require platforms to “disclose certain categories of private data to qualified academic researchers, so long as that research respects user privacy, does not endanger platform integrity, and remains in the public interest.”

The report recommends government support for local journalism, including the Local Journalism Sustainability Act, which proposes federal tax credits to subsidize local news subscriptions. The commissioners also argue that the industry needs to “adjust journalistic norms to avoid false equivalencies between lies and empirical fact in the pursuit of ‘both sides’ and ‘objectivity,’” a topic CJR has also covered in depth, both in the magazine and through our Galley discussion platform. In addition, the report notes that cable news, podcasts, YouTube, and talk radio “all play a unique role in inflaming disinformation and too often fail to hold accountable those who spread false statements on-air,” and that there continues to be a tension in the media between “the drive to maximize profit and the imperative to serve the public good.”

The report acknowledges that disinformation is “a complex problem that didn’t begin with the Communications Decency Act of 1996 nor with Facebook’s founding in 2004, and will not be solved with mere cosmetic tweaks to certain algorithms.” Still, the group does take a crack at fixing Section 230, a clause in the Communications Decency Act that gives digital platforms immunity from liability for the content they carry—and, theoretically, for the decisions they make about what to highlight with their algorithms. (CJR has hosted a number of discussions on Galley about the challenges of Section 230.) The commission recommends that Section 230 be amended to “withdraw platform immunity for content that is promoted through paid advertising and post promotion,” and also to remove immunity protection from “product features [and] recommendation engines.” However, Daphne Keller of Stanford’s Center for Internet and Society, who is quoted in the report, has raised concerns about the clash between these kinds of attempts to regulate algorithms and the First Amendment.

Perhaps the most ambitious, and potentially controversial, of the report’s recommendations comes at the very end: a proposal that Congress create and fund an independent non-profit organization mandated to “invest in systemic misinformation counter-measures,” funneling money from something called a Public Restoration Fund into research, education, and institutions such as libraries, hospitals, schools, and local news outlets “with an emphasis on community-level protections against misinformation.” This effort could be funded, the commission says, by general taxes, voluntary investment from tech companies, taxes on social-media ads, and FTC fines. The report also recommends that Congress look into ways to “compensate individuals and communities who have been harmed by mis- and/or disinformation.” What exactly that compensation might look like isn’t clear.