In our last newsletter, we unpacked why technology is never neutral. Social media is no exception. Social media doesn’t simply reflect society; it shapes society.
The world we see through social media is distorted, like looking into a funhouse mirror. These distortions are negative externalities of an advertising-driven, engagement-maximizing business model, which affects people and relationships in myriad ways.
8 WAYS SOCIAL MEDIA DISTORTS REALITY
- The Extreme Emotion Distortion occurs because users have access to virtually unlimited amounts of personalized, emotional content, so any user can find overwhelming evidence for their deeply held beliefs. This creates contradicting “evidence-based” views, resulting in animosity and the fracturing of our collective sensemaking.
- The Information Flooding Distortion happens as algorithms and bots flood or curate the information users see based on their likelihood to engage with it, resulting in users believing that what is popular (e.g., hashtags, comments, trends) is public consensus, when in fact it can be manipulated.
- The Micro-Targeting Distortion happens as advertisers send personalized, emotionally resonant (and sometimes opposing) messages to distinct groups of people, resulting in individualized micro-realities that can generate social conflict.
- The Moral Outrage Distortion occurs when engagement-maximizing algorithms amplify emotionally charged, moralizing content. This results in polarization, mischaracterizations of “the other side,” and the perception of more moral outrage around us than there really is.
- The Engaging Content Distortion happens when social media platforms incentivize competition to create more viral content. This results in more frequent posting, more hyperbolic language, and more posting of extreme views, including conspiracy theories and out-of-context information.
- The Anti-Journalism Distortion is created as social media platforms force reputable news organizations to compete in an environment that rewards clickbait headlines and polarizing rhetoric, resulting in less thoughtful, less nuanced reporting.
- The Disloyalty Distortion happens when users on public social media feeds try to understand or express compassion for the “other” side and are attacked by their “own” side for doing so.
- The Othering Distortion occurs as algorithms amplify divisive, negative, out-of-context content about particular groups. This incentivizes “othering” content, causing us to dehumanize others and view them as unworthy of our understanding.
THE IMPACT
These distortions don’t just affect individuals. Over time, they warp society’s perception of reality, breaking down our ability to find shared understanding.
Shared understanding is needed for democratic functioning. It enables nuanced discussion, debate, and problem solving across party lines. Yet, today's dominant social media platforms are breaking down these critical capabilities at an alarming pace. This is why social media as it operates today is a threat to open societies worldwide.

ACTIONS YOU CAN TAKE
We can uphold open society values by enabling an information ecosystem that stewards our capacity for shared understanding rather than optimizing for engagement:
- Curtail the causes through platform design changes that incentivize trust and understanding. For example, introducing friction to limit virality prevents ideas that trigger powerful emotions from spreading quickly and dominating public discourse. For a deep dive, we recommend reading Renee DiResta and Tobias Rose’s piece on the topic.
- Address the crises caused by the breakdown of shared understanding. For technology teams, identify crises among both users and non-users, maintain cross-team collaboration, and plan ahead for challenges. For instance, consider implementing blackouts for features that may cause harm during certain periods (e.g., elections); a minimal sketch of such a blackout flag follows this list.
- Heal the toxic state of our minds from years of being conditioned to see divisiveness as safe and compassion for the “other side” as risky.
- Approach mutual understanding as a skill to be developed. Public education, in particular, can cultivate intellectual humility and establish understanding.
- Rehumanize each other by connecting with values that we share and sharing experiences in order to depolarize our communities.
- Illustrate distortions in order to reveal perception gaps and “alternate” realities. For example, participate in a “reality swap,” where you swap feeds with another person to see how the reality presented to them differs from the reality you see.
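To make the blackout idea above concrete, here is a minimal sketch, in Python, of a feature flag that switches a potentially harmful surface off during a pre-planned sensitive window. The feature names, dates, and helper function are hypothetical illustrations, not any platform’s actual API; the point is simply that the switch-off is decided ahead of time rather than improvised mid-crisis.

```python
from datetime import datetime, timezone

# Hypothetical blackout schedule: feature name -> (start, end) windows during
# which the feature is switched off (e.g., a trending module around an election).
BLACKOUT_WINDOWS = {
    "trending_topics": [
        (datetime(2024, 10, 21, tzinfo=timezone.utc),
         datetime(2024, 11, 12, tzinfo=timezone.utc)),
    ],
}

def feature_enabled(feature: str, now: datetime | None = None) -> bool:
    """Return False while `feature` sits inside any configured blackout window."""
    now = now or datetime.now(timezone.utc)
    return not any(start <= now <= end
                   for start, end in BLACKOUT_WINDOWS.get(feature, []))

# Usage: gate the risky code path behind the check and plan the fallback in advance.
if feature_enabled("trending_topics"):
    pass  # render the trending module as usual
else:
    pass  # fall back to a neutral, chronological view
```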
---
This piece has been adapted from Module 5 in our recently launched course, Foundations of Humane Technology. If you are involved in shaping tomorrow’s technology, we welcome you to register for the course.
WHAT WE'RE READING, LISTENING TO, AND WATCHING
- Frances Haugen calls for granting civil society organizations access to platform data. 87% of Facebook’s operational budget goes to protect just 10% of its users, so civil society groups worldwide have filled the gap. Granting them access to platform data would help uphold platform transparency and accountability.
- Stewarding technology responsibly, and humanely, requires crisis planning. Yet, as Will Oremus writes, platforms’ rapid, ad hoc responses are setting a dangerous precedent for crises to come.
- Extractive technology depletes finite resources faster than they can be replenished. CCO Maria Bridge unpacks the types of extractive technologies, how to prevent them, and what we can do to become more aware of their impact.