Information environments shaped by engagement-optimized algorithms create a paradoxical outcome: politically engaged users who consume substantial political content may end up more misinformed and polarized than less engaged users who encounter politics only occasionally.
The research measured effects among over 1,000 X users during the 2024 presidential election—a period when politically engaged citizens paid close attention to campaign developments. These engaged users presumably consumed substantial political content, yet when that content was algorithmically manipulated to include slightly more divisive material, they became more polarized rather than better informed.
This paradox emerges because engagement optimization rewards emotional provocation over informational accuracy. Users who engage heavily with political content signal to algorithms that they’re interested in politics, prompting systems to show them more political content. But the content algorithms select emphasizes whatever generates engagement—divisive claims, inflammatory rhetoric, and emotionally provocative framing—rather than accurate, nuanced information.
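The dynamic described above can be illustrated with a toy ranking model. Everything here is hypothetical (the `Post` fields, the scoring formula, and the interest parameter are invented for illustration, not drawn from any real platform): a ranker that scores posts by predicted engagement, where engagement rises with divisive framing and accuracy contributes nothing to the score, will fill a politically interested user's feed with the most divisive political posts.

```python
from dataclasses import dataclass

@dataclass
class Post:
    topic: str        # "politics" or "other"
    divisive: float   # 0.0-1.0, how emotionally provocative the framing is
    accurate: float   # 0.0-1.0, informational accuracy (ignored by the ranker)

def predicted_engagement(post: Post, political_interest: float) -> float:
    """Toy engagement score: topic-interest match times a bonus for
    divisive framing. Note that `accurate` never enters the score."""
    topic_match = political_interest if post.topic == "politics" else 1 - political_interest
    return topic_match * (1 + post.divisive)

def rank_feed(posts: list[Post], political_interest: float, k: int = 3) -> list[Post]:
    """Return the top-k posts by predicted engagement."""
    return sorted(posts, key=lambda p: predicted_engagement(p, political_interest),
                  reverse=True)[:k]

posts = [
    Post("politics", divisive=0.9, accurate=0.2),
    Post("politics", divisive=0.1, accurate=0.9),
    Post("other",    divisive=0.0, accurate=0.8),
    Post("politics", divisive=0.7, accurate=0.3),
]

# A heavily engaged user (high political interest) gets a feed dominated
# by the most divisive political posts, regardless of their accuracy.
feed = rank_feed(posts, political_interest=0.9)
```

In this sketch the accurate, low-divisiveness political post is outranked by two inaccurate but provocative ones, which is the selection pressure the paragraph above describes.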
Highly engaged users thus get fed concentrated doses of polarizing content customized to maximize their emotional reactions. They may feel informed because they consume substantial political information, yet the information has been filtered through engagement optimization that systematically emphasizes division and de-emphasizes accuracy or nuance.
This creates concerning dynamics for democratic citizenship. Traditional models assume that informed participation improves democracy, encouraging citizens to engage with political information. But if algorithmic environments mean that engagement leads to misinformation and polarization rather than genuine knowledge, then traditional participation norms may need rethinking for digital contexts where information exposure is algorithmically mediated.
When Knowing More About Politics Makes You More Wrong