A #SocietalArchitecture use case?

Platform incentives are one of the key drivers of the rise of online untruths.

Social media platforms rely on engagement-driven algorithms.

Content that provokes strong reactions — whether outrage, fear, or curiosity — tends to spread rapidly, regardless of its accuracy.

Misinformation often outperforms verified news because it appeals to human emotions and biases.


Platform incentives significantly contribute to the rapid spread of online untruths because the business models of social media and content-sharing platforms prioritize engagement over accuracy.

Here's a breakdown of how this works:

  1. Engagement-Driven Algorithms
    • Prioritizing Clicks and Views: Platforms like Facebook, X, and YouTube use algorithms designed to maximize user engagement. Content that generates strong emotional reactions, such as outrage, fear, or surprise, tends to perform well regardless of its accuracy (a toy ranking sketch follows this list).
    • Amplification of Sensational Content: Untruths often rely on sensationalism or shock value, making them more likely to be prioritized and displayed in users’ feeds compared to nuanced or verified information.
  2. Revenue Through Advertising
    • Ad-Based Business Models: Platforms earn revenue from advertisements that appear alongside content. The more time users spend on the platform, the more ads they see, and the more revenue the platform generates.
    • Incentive to Keep Users Hooked: False or misleading content often keeps users engaged longer, as it sparks curiosity or emotional investment. This creates a financial incentive for platforms to tolerate or fail to act swiftly against such content.
  3. Virality Mechanisms
    • Sharing and Reposting Features: Platforms are designed to make it easy for users to share content widely. This accelerates the spread of misinformation, especially when users share without verifying the source or content.
    • Echo Chambers: Algorithms curate content based on user preferences and behavior, reinforcing pre-existing beliefs. In these echo chambers, misinformation can spread unchecked, as users are less likely to encounter opposing views or fact-checks.
  4. Lack of Immediate Accountability
    • Scale and Speed: Platforms handle massive volumes of content, making it challenging to monitor and flag all instances of misinformation in real time.
    • Delayed Interventions: Even when platforms act to label or remove misinformation, they often do so after the content has already reached a large audience, which blunts the impact of corrective measures.
  5. Profit Motives of Bad Actors Exploiting Platforms
    • Monetized Falsehoods: Some individuals and groups exploit platform incentives by creating and sharing false content designed to attract clicks, followers, or donations. Platforms, in turn, may benefit indirectly from the increased traffic these bad actors generate.
    • Bots and Troll Farms: Coordinated campaigns by malicious actors leverage the same platform incentives to amplify disinformation at scale, often for financial or political gain.
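
To make the incentive in points 1 and 2 concrete, here is a minimal, purely illustrative sketch of an engagement-driven feed ranker. All of it is assumed for the example: the Post fields, the weights, and the function names are invented and do not describe any real platform's ranking model. The point is structural: nothing in the objective rewards accuracy.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_clicks: float         # hypothetical model output, 0..1
    predicted_shares: float         # hypothetical model output, 0..1
    predicted_dwell_seconds: float  # expected time a user spends on the post
    accuracy_score: float           # e.g. a fact-check signal, 0..1 (never used below)

def engagement_score(post: Post) -> float:
    """Toy objective: reward engagement signals only.

    accuracy_score never enters the score, so a sensational but false
    post can outrank careful, verified reporting.
    """
    return (
        0.5 * post.predicted_clicks
        + 0.3 * post.predicted_shares
        + 0.2 * min(post.predicted_dwell_seconds / 60.0, 1.0)
    )

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest engagement first; accuracy is ignored entirely.
    return sorted(posts, key=engagement_score, reverse=True)
```

Because accuracy never appears in the objective, the "best" post is simply whatever people click, share, and linger on; that is the structural incentive the breakdown above describes.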

Policy and platform solutions that address the role of platform incentives in spreading untruths require collaboration among platforms, policymakers, and users, as outlined in the UN Global Principles for Information Integrity - Next Steps:

  • Algorithm Transparency: Require platforms to disclose how algorithms prioritize content and evaluate the impact of these choices on the spread of misinformation.
  • Incentivize Accuracy: Platforms could downrank or demote sensational or false content and reward credible sources with higher visibility (see the reranking sketch after this list).
  • Strengthen Moderation: Improve content moderation systems to detect and remove harmful misinformation more effectively and promptly.
  • User Empowerment: Provide users with tools to fact-check content, flag untruths, and better understand how their engagement influences what they see.
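
As a counterpoint to the engagement-only ranker sketched earlier, the "Incentivize Accuracy" idea can be pictured as blending a credibility signal into the score and demoting posts that fall below a threshold. Everything here is illustrative: the Post fields, the blending weight, and the threshold are placeholders under assumed values, not a description of any real platform's policy.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    engagement_score: float   # output of an engagement model, 0..1
    credibility_score: float  # hypothetical source/fact-check signal, 0..1

CREDIBILITY_WEIGHT = 0.4   # arbitrary placeholder, not a recommended value
DEMOTION_THRESHOLD = 0.2   # posts below this credibility lose most visibility

def blended_score(post: Post) -> float:
    """Mix engagement with credibility instead of ranking on engagement alone."""
    score = ((1 - CREDIBILITY_WEIGHT) * post.engagement_score
             + CREDIBILITY_WEIGHT * post.credibility_score)
    if post.credibility_score < DEMOTION_THRESHOLD:
        # Demotion, not removal: the post stays up but is rarely surfaced.
        score *= 0.1
    return score

def rank_feed(posts: list[Post]) -> list[Post]:
    return sorted(posts, key=blended_score, reverse=True)
```

The design choice worth noting is that demotion preserves the content while reducing its reach, which is closer to how "downranking" is usually framed than outright deletion.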