
Attention Economy: How Algorithmic Manipulation Drives Addiction and Radicalizes Users

The science behind how platforms optimize for engagement over truth, creating psychological dependency and information silos.

Author: Tristan Harris, Center for Humane Technology
Source: https://www.humanetech.com/research
Published: August 20, 2025

Overview

Tech platforms employ thousands of engineers specifically to make apps addictive. The business model is simple: maximize time on platform, maximize data collection, maximize ad revenue. Users pay the cost in lost mental health and attention span, and in greater vulnerability to radicalization through algorithmically optimized feeds.

Techniques Used

  • Variable rewards (notifications at unpredictable times; see the sketch after this list)
  • Infinite scroll (no natural stopping points)
  • Social feedback loops (likes, retweets, comments)
  • Fear of missing out (FOMO) algorithms
  • Dopamine hijacking (designed to trigger reward cycles)
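
To make the first technique concrete, here is a minimal, hypothetical sketch in Python of a variable-reward schedule. It is an illustration only, not any platform's actual code; the function name and mean delay are invented for the example. Instead of delivering notifications at fixed intervals, delivery times are drawn at random, so the user can never predict when the next reward will arrive.

    import random

    def schedule_notifications(num_events: int, mean_delay_minutes: float = 45.0) -> list[float]:
        """Return unpredictable delivery offsets (in minutes) for queued notifications.

        Gaps are drawn from an exponential distribution, so they vary widely
        around the mean -- the variable-reward schedule that makes checking
        the app feel like pulling a slot-machine lever.
        """
        offsets, elapsed = [], 0.0
        for _ in range(num_events):
            elapsed += random.expovariate(1.0 / mean_delay_minutes)
            offsets.append(round(elapsed, 1))
        return offsets

    # Ten notifications spread over several hours at unpredictable intervals.
    print(schedule_notifications(10))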

The Results

  • The average teen spends 4+ hours a day on social media
  • Anxiety and depression rates up 65% since platforms adopted engagement optimization
  • Sleep deprivation affecting cognitive development
  • Social skills declining
  • Teen suicide rates track the timeline of platform adoption

Algorithmic Radicalization

Platforms optimize for engagement, not truth. Extreme content engages more than moderate content. The result is an algorithmic radicalization pipeline; a toy illustration of the ranking incentive follows the list below.

  • Users gradually exposed to more extreme content
  • Echo chambers amplify beliefs
  • Misinformation spreads 5x faster than facts
  • No human oversight, just optimization algorithms
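
The list above comes down to a single design decision: the ranking objective contains no term for accuracy or harm. The deliberately simplified Python sketch below (the post titles, scores, and field names are invented for illustration) shows how a sort keyed only on predicted engagement pushes the most provocative item to the top of the feed regardless of truth.

    from dataclasses import dataclass

    @dataclass
    class Post:
        title: str
        predicted_clicks: float  # engagement model's estimate
        accuracy_score: float    # fact-check signal, 0..1 (never consulted below)

    def rank_for_engagement(posts: list[Post]) -> list[Post]:
        """Rank purely by predicted engagement; accuracy never enters the objective."""
        return sorted(posts, key=lambda p: p.predicted_clicks, reverse=True)

    feed = rank_for_engagement([
        Post("Measured policy analysis", predicted_clicks=0.8, accuracy_score=0.95),
        Post("Outrage-bait conspiracy thread", predicted_clicks=4.2, accuracy_score=0.10),
        Post("Local news update", predicted_clicks=1.1, accuracy_score=0.90),
    ])
    for p in feed:
        print(f"{p.predicted_clicks:>4}  {p.title}")  # the most extreme item tops the feed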

Why This Matters

Who benefits? Tech companies, which profit from engagement and attention. Users pay with their mental health, time, attention, and social well-being.

Discussion

Community-owned networks don’t monetize attention. There’s no profit motive to maximize engagement or radicalize users. No algorithms optimizing for addiction. Users control their own experience, set their own boundaries, and choose their own feed logic.
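
As a contrast with the engagement-ranked example above, here is a minimal, hypothetical sketch of what user-chosen feed logic can look like. The muted topics and daily cap below are illustrative assumptions set by the reader, not parameters of any specific community-owned platform.

    from datetime import datetime

    # Boundaries chosen by the user, not inferred by an engagement model.
    MUTED_TOPICS = {"outrage", "ragebait"}
    DAILY_POST_LIMIT = 50

    def my_feed(posts: list[dict]) -> list[dict]:
        """Strictly chronological feed that honors the user's own mutes and daily cap."""
        visible = [p for p in posts if not (MUTED_TOPICS & set(p.get("tags", [])))]
        visible.sort(key=lambda p: p["posted_at"], reverse=True)
        return visible[:DAILY_POST_LIMIT]

    example = [
        {"title": "Garden update", "tags": ["hobby"], "posted_at": datetime(2025, 8, 20, 9, 0)},
        {"title": "You won't BELIEVE this", "tags": ["outrage"], "posted_at": datetime(2025, 8, 20, 10, 0)},
    ]
    print([p["title"] for p in my_feed(example)])  # the muted post never appears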

This is the fundamental difference: when a platform is owned by users rather than advertisers, the incentives completely change. You’re not optimizing for engagement and data extraction—you’re optimizing for community health and user well-being.