Facebook’s ethical failures are not accidental; they are part of the business model

“Ethical Markets highly recommends this critique of Facebook’s business model, much in line with our own “Steering Social Media Toward Sanity” and unpacking the “polarization” Facebook causes, compared with Frances Moore Lappé’s ten basic issues on which 70-80% of Americans agree! Must reading!

~Hazel Henderson, Editor”

By Dave Lauer

Facebook’s stated mission is “to give people the power to build community and bring the world closer together.” But a deeper look at its business model suggests that it is far more profitable to drive us apart. By creating “filter bubbles”—the product of engagement-maximizing algorithms that foster echo chambers in which the most inflammatory content achieves the greatest visibility—Facebook profits from the proliferation of extremism, bullying, hate speech, disinformation, conspiracy theory, and rhetorical violence. Facebook’s problem is not a technology problem. It is a business model problem. This is why technology-based solutions have failed to stem the tide of problematic content. If Facebook employed a business model focused on efficiently providing accurate information and diverse views, rather than addicting users to highly engaging content within an echo chamber, the algorithmic outcomes would be very different.
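
To make the business-model argument concrete, here is a minimal, purely hypothetical sketch of how a feed ranker’s objective shapes what users see. Nothing below reflects Facebook’s actual systems: the Post fields, the example posts, and the scoring weights are all invented for illustration. The point is only that the same pool of content gets ordered very differently when the objective rewards accuracy and viewpoint diversity rather than predicted engagement alone.

```python
# Toy illustration only -- not Facebook's ranking system. All fields,
# posts, and weights below are invented to show how the objective
# function, not the underlying technology, determines what rises to the top.
from dataclasses import dataclass


@dataclass
class Post:
    title: str
    predicted_engagement: float  # 0..1: likelihood of clicks/reactions/shares
    accuracy_score: float        # 0..1: e.g., from independent fact-checking signals
    viewpoint_novelty: float     # 0..1: how far the post sits from the user's usual feed


POSTS = [
    Post("Outrage-bait conspiracy thread", 0.95, 0.10, 0.05),
    Post("Balanced explainer from a rival viewpoint", 0.40, 0.90, 0.85),
    Post("Well-sourced local news report", 0.55, 0.95, 0.50),
]


def engagement_rank(posts):
    """Caricature of an engagement-maximizing feed: sort by predicted engagement alone."""
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)


def accuracy_diversity_rank(posts, w_acc=0.5, w_nov=0.3, w_eng=0.2):
    """Alternative objective that also rewards accuracy and exposure to diverse views."""
    def score(p):
        return w_acc * p.accuracy_score + w_nov * p.viewpoint_novelty + w_eng * p.predicted_engagement
    return sorted(posts, key=score, reverse=True)


if __name__ == "__main__":
    print("Engagement-only ranking:")
    for post in engagement_rank(POSTS):
        print("  ", post.title)
    print("Accuracy/diversity-weighted ranking:")
    for post in accuracy_diversity_rank(POSTS):
        print("  ", post.title)
```

Under the first objective the conspiracy thread tops the feed; under the second it drops to the bottom, even though nothing about the “technology” has changed.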

Facebook’s failure to check political extremism [15], willful disinformation [39], and conspiracy theory [43] has been well-publicized, especially as these unseemly elements have penetrated mainstream politics and manifested as deadly, real-world violence. So it naturally raised more than a few eyebrows when Facebook’s Chief AI Scientist Yann LeCun tweeted his concern [32] over the role of right-wing personalities in downplaying the severity of the COVID-19 pandemic. Critics were quick to point out [29] that Facebook has profited handsomely from exactly this brand of disinformation. Consistent with Facebook’s recent history on such matters, LeCun was both defiant and unconvincing.

In response to a frenzy of hostile tweets, LeCun made the following four claims:

[READ MORE]