Journal series reveals concerns, inaction at Facebook

Jay Owen | Global Citizen, Wealth of Networks, Information Technology Issues, Latest Headlines

“Ethical Markets highly recommends this issue of the Columbia Journalism Review and its coverage of the unacceptable impacts on societies of social media monopolies and their profit-focused, advertising-driven business models. Our coverage of these issues includes “Steering Our Powers of Persuasion Toward Human Goals”; “Let’s Train Humans Before We Train Machines”; and “Steering Social Media Toward Sanity,” with our five steps to reform.”

Hazel Henderson, Editor

 

By Mathew Ingram

 

In 2018, Mark Zuckerberg, co-founder and chief executive of Facebook, said that the company was rolling out a significant change to the algorithm that governs its News Feed, in an attempt to encourage users to interact more with content posted by their friends and family rather than content from “businesses, brands, and media”—including news publishers. One of the reasons for doing this, Zuckerberg said, was a growing body of research suggesting that consuming content from brands and publishers was less beneficial for the well-being of users. However, according to a new report from the Wall Street Journal, the algorithm change did nothing to improve the well-being of users. Rather, according to internal memos reviewed by the Journal, Facebook’s own researchers said the changes were making the News Feed “an angrier place.” According to the Journal, the researchers “discovered that publishers and political parties were reorienting their posts toward outrage and sensationalism,” which boosted comments and reactions that were, in turn, amplified by Facebook’s algorithm.

“Our approach has had unhealthy side effects on important slices of public content, such as politics and news,” researchers at the company said in memos quoted by the Journal. “This is an increasing liability. Misinformation, toxicity, and violent content are inordinately prevalent among reshares.” The researchers worked on a number of potential changes to try to ameliorate the algorithm’s tendency to reward outrage; according to the memos, however, Zuckerberg resisted many of their proposals because he was worried they might decrease engagement on the platform.

The memos are part of what the Journal calls “an extensive array of internal company communications” that it gained access to. (It doesn’t say how.) That array has so far prompted three investigative pieces on the company’s practices, of which the News Feed story is the third published this week. The first, from reporter Jeff Horwitz, described the impact of a little-known system within the company that allowed VIPs to avoid repercussions for breaching the platform’s terms of service. The program, known as XCheck (pronounced “cross check”), allows celebrities, politicians, athletes, and other “influencers” to post whatever they want, with little or no consequence. Although an internal Facebook report seen by the Journal referred to “a select few members” as having this ability, Horwitz reported that, as of last year, close to six million people were covered by the XCheck program.

The second article in the Journal series, by Horwitz, Georgia Wells, and Deepa Seetharaman, focused on internal research showing that Instagram, the image-heavy app owned by Facebook, contributed to mental-health and body-image problems among young women. The Journal cited several years’ worth of Facebook studies that were shared in presentations posted to the company’s internal message board; according to one, among teens who reported suicidal thoughts, thirteen percent of British users and six percent of American users traced the issue to Instagram. “Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse,” the Journal wrote. The company also found that fourteen percent of boys in the US said Instagram made them feel worse. More than forty percent of the app’s users are twenty-two years old or younger, according to the Journal, and the company is reportedly building a version of Instagram specifically for kids under the age of thirteen. The Instagram research, according to the Journal, “represents one of the clearest gaps revealed in the documents between Facebook’s understanding of itself and its public position.”

In a blog post responding to the Journal’s Instagram story, Karina Newton, Instagram’s head of public policy, said that the company stands by the research—despite what she called the “negative light” the Journal story cast on it—and that the app is committed to “understanding the complex and difficult issues young people may struggle with.” On Twitter, Alex Stamos, the former head of security for Facebook, said senior executives at the company “built big quantitative social science teams on the belief that knowing what was wrong would lead to positive change [but] those teams have run into the power of the Growth and unified Policy teams.”