A recent study published in Nature finds that the algorithmic feed on X (formerly Twitter) can measurably shift users’ political views to the right in a matter of weeks. This finding underscores the powerful, often unseen, influence of social media platforms on public opinion. The research, conducted by scientists at Bocconi University, highlights how exposure to X’s “For You” algorithm can alter attitudes even when users do not consciously change their political affiliations.
Algorithmic Exposure Drives Rightward Shift
Researchers randomly assigned nearly 5,000 U.S.-based X users to either an algorithmic or chronological feed for seven weeks in 2023. The results were stark: those exposed to the algorithm were 4.7% more likely to prioritize Republican policy issues like immigration, inflation, and crime. They also showed increased skepticism toward investigations into Donald Trump and a notable shift in attitudes toward the war in Ukraine, becoming 7.4% less likely to view Ukrainian President Zelenskyy positively.
Notably, the study found that even after switching back to a chronological feed, users retained the new follow patterns established during algorithmic exposure. This suggests that the algorithm’s influence isn’t easily reversed—users continue to be steered toward right-wing content even after opting out. The researchers found that conservative-leaning posts were 20% more likely to appear in algorithmic feeds, while liberal posts saw only a 3.1% boost.
Why This Matters: Amplification and Echo Chambers
This study is significant because it provides concrete, causal evidence that a feed algorithm can shift political attitudes. While such effects had long been suspected, this is one of the first randomized experiments to demonstrate them on a major platform. The findings raise questions about the long-term effects of prolonged exposure to these algorithms, as well as the role of X’s owner, Elon Musk, in intentionally amplifying right-wing content. Musk has publicly endorsed far-right politicians and amplified conspiracy theories, and his influence over the platform’s algorithm is undeniable.
The study also points to the broader issue of algorithmic echo chambers. By demoting traditional news sources while promoting political activists, X’s algorithm creates an environment where users are increasingly exposed to biased or unverified information.
Accountability and Future Research
The researchers emphasize that algorithmic influence extends beyond partisan identity, shaping opinions on current political events. This raises critical questions about accountability and transparency in social media algorithms. As researcher Gauthier concluded, “It’s time people realize that these algorithms shape our societies, and for them to reflect on what kind of influence they are comfortable with, and how we should think about accountability.” Further research is needed to understand the cumulative impact of algorithmic exposure over longer periods and whether similar effects occur on other platforms.