Mindscape 330 | Petter Törnberg on the Dynamics of (Mis)Information

Sean Carroll | Sep 29, 2025

Audio Brief

This episode explores the complex dynamics of social media, questioning whether negative outcomes like polarization and echo chambers are inherent to platform design or solely caused by algorithms. Three key takeaways emerge. First, social media's core problems are deeply rooted in its fundamental architecture, not just its algorithms. Second, extremist online communities thrive by transforming individual anxiety into collective anger and a new group identity. Third, Large Language Model-based social simulations represent a powerful new frontier for understanding complex social dynamics.

Research suggests that core features of social media, such as following and sharing, naturally lead to echo chambers, attention inequality, and the amplification of extreme voices. Even without engagement-maximizing algorithms, problematic dynamics emerge. This implies that simply removing or adjusting existing algorithms will not fully resolve the underlying issues.

Extremist online communities offer a significant psychological service to their members. They reframe passive anxiety and confusion into active emotions like anger and outrage. This fosters a shared identity, evidenced by members' linguistic shift from individual pronouns to collective ones, cementing a sense of belonging and collective purpose.

Computational social science is advancing with novel methods, specifically using Large Language Models as realistic social agents. This approach allows researchers to run controlled experiments on social media platforms in a simulated environment, providing new insight into how platform architecture influences collective behavior and emergent social phenomena. The challenges within our online information ecosystems appear structural and deeply intertwined with human social behavior, indicating that straightforward solutions are unlikely to succeed.

Episode Overview

  • This episode explores the complex dynamics of social media, questioning whether negative outcomes like polarization and echo chambers are caused by algorithms or are inherent to the platforms' fundamental design.
  • Guest Petter Törnberg discusses his research on how extremist online communities like Stormfront reshape individual identity, channeling members' anxiety into collective anger.
  • The conversation highlights a groundbreaking study using Large Language Models (LLMs) to simulate a "bare-bones" social media platform, revealing that problematic dynamics emerge even without engagement-maximizing algorithms.
  • Ultimately, the discussion suggests that the core problems of our online information ecosystems may be structural and deeply rooted in human social behavior, making simple fixes unlikely to succeed.

Key Concepts

  • Physics Concepts in Social Science: While "physics envy" (inappropriately simplifying social science) is a risk, physics-inspired concepts like emergence and collective behavior are powerful tools for modeling complex social systems.
  • Identity Formation in Online Communities: Extremist groups can systematically alter a person's sense of self, evidenced by linguistic shifts from individual pronouns ("I") to collective ones ("we"), fostering a new group identity.
  • Emotional Drivers of Polarization: Online communities, particularly extremist ones, often function as "emotional talk therapy," providing narratives that reframe members' passive anxiety and confusion into active emotions like anger and outrage.
  • LLM-Based Social Simulations: A novel research method using Large Language Models as realistic agents allows for controlled experiments on social media platforms, testing how platform architecture influences collective behavior (a minimal code sketch of this kind of setup follows this list).
  • Inherent Problems in Social Media Structures: Research suggests that core features of social media (following, sharing) naturally lead to echo chambers, attention inequality, and the amplification of extreme voices, even without manipulative engagement algorithms.
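
To make the simulation idea concrete, below is a minimal, illustrative sketch of an LLM-agent social-media experiment, not Törnberg's actual code or parameters. Agents with assigned personas see a simple feed, choose what to reshare, and follow the accounts they reshare; the `llm_choose` function, the persona list, and the agent/round counts are all hypothetical stand-ins, with `llm_choose` playing the role of a real language-model call.

```python
# Minimal sketch of an LLM-agent social-media simulation (illustrative only).
# Agents have personas, see a feed, reshare posts, and follow whoever they
# reshare. llm_choose is a stand-in for prompting a real language model.

import random
from collections import Counter

PERSONAS = ["left-leaning urbanite", "right-leaning rural voter",
            "centrist pragmatist", "single-issue climate activist"]

class Agent:
    def __init__(self, name, persona):
        self.name = name
        self.persona = persona
        self.following = set()

def llm_choose(persona, feed):
    """Stand-in for an LLM decision: pick the post this persona would reshare.
    A real implementation would prompt a language model with persona + feed."""
    scored = [(sum(word in post["text"] for word in persona.split()), post)
              for post in feed]
    scored.sort(key=lambda s: (s[0], random.random()), reverse=True)
    return scored[0][1] if scored else None

def simulate(n_agents=20, n_rounds=30):
    agents = [Agent(f"user{i}", random.choice(PERSONAS)) for i in range(n_agents)]
    posts = [{"author": a.name, "text": f"opinion from a {a.persona}"} for a in agents]
    reshares = Counter()
    for _ in range(n_rounds):
        for agent in agents:
            # Feed = posts from followed accounts; fall back to all posts at the start.
            feed = [p for p in posts if p["author"] in agent.following] or posts
            choice = llm_choose(agent.persona, random.sample(feed, min(5, len(feed))))
            if choice and choice["author"] != agent.name:
                reshares[choice["author"]] += 1
                agent.following.add(choice["author"])  # follow whoever you reshare
    return reshares

if __name__ == "__main__":
    counts = simulate()
    top = counts.most_common(3)
    total = sum(counts.values()) or 1
    print(f"Top 3 accounts capture {sum(c for _, c in top) / total:.0%} of reshares")
```

Even this toy version of the follow-and-reshare architecture tends to concentrate attention on a few accounts and cluster agents with like-minded personas, which is the kind of structural dynamic the episode argues emerges without any engagement-maximizing algorithm.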

Quotes

  • At 0:08 - "There's a idea in social science circles called physics envy... It's not supposed to be a good thing. You're not actually supposed to be envious of physics." - Carroll explains the term "physics envy" and its generally negative connotation.
  • At 3:31 - "I don't want to give away too much about Petter's results, but they're not great. They're not very encouraging for those of us who want social media to work." - Carroll foreshadows that the research reveals inherent, often negative, dynamics within social media ecosystems.
  • At 30:37 - "when they first come in, they use I and my and speak of themselves, uh but then over time they start saying we or SF, you know, for Stormfront..." - Törnberg on the linguistic evidence for identity shift within the Stormfront community.
  • At 32:00 - "...the community functions as a kind of, you know, emotional talk therapy that allows them to find new narratives... and resolve this emotional anxiety and turn it into... something active like anger or outrage." - Explaining the psychological function that extremist online communities serve for their members.
  • At 46:23 - "We got these three features...that are widely considered the problematic aspects of social media... it does imply that removing them will not completely solve the problem." - Summarizing the crucial finding from his LLM simulation: the core problems of social media may be structural, not just algorithmic.

Takeaways

  • The problems with social media are deeper than just algorithms; polarization and echo chambers are emergent properties of the basic platform architecture, and simply removing engagement algorithms is not a complete solution.
  • Extremist online communities are effective because they provide a powerful psychological service, transforming individual anxiety into a shared identity built around collective anger and outrage.
  • Computational social science using LLM agents is a powerful new frontier for understanding complex social dynamics, allowing researchers to run controlled experiments on systems that were previously impossible to model accurately.