Researchers: X Algorithm Pushes Users Conservative, Downranks News

I scrolled past three similar posts in under a minute and felt my attention tilt. You only notice the tilt when your feed has already rearranged what matters. I kept scrolling — and then the change stopped feeling accidental.

I’ve followed social platforms for years, and I’m telling you: feeds are not neutral. I want to show you how a team of economists and computer scientists from Bocconi University, the University of St. Gallen and the Paris School of Economics used an experiment to map what X’s algorithm actually promotes — and why that nudge matters for what you believe.

I opened a feed and it behaved like a different product

The platform that was once Twitter added an algorithmic timeline in 2016, and after Elon Musk bought the company in 2022 that feed, now branded "For You," became the default, according to reporting from TechCrunch. Musk later called the feed "purely" AI-led, pointing at Grok as the engine behind recommendations. What you see now is not just what the people you follow post; it is what an AI decides to amplify.

That amplification feels subtle at first, like a magnet pulling iron filings, until your attention has been rearranged without you realizing it.

Does X’s algorithm favor conservative content?

The short answer from a new Nature study is yes. Researchers tracked the "For You" feed and found a measurable tilt: conservative posts were roughly 20% more likely to appear in the algorithmic feed than in a chronological one, while liberal posts were only about 3.1% more likely. The algorithm's choices aren't symmetric.

Researchers flipped timelines and watched behavior change

A research team ran a seven-week randomized trial with U.S.-based X users, assigning each person either the algorithmic feed or a chronological feed and then switching some participants. The experiment combined survey questions about political beliefs with direct observation of what appeared in users’ feeds.

The results were clear: the algorithmic feed was more engaging and reshaped attention. Traditional news items showed up roughly 58% less often in algorithmic feeds, while posts from political activists appeared 27.4% more frequently and entertainment accounts 21.5% more often. Those shifts are not abstract numbers — they alter the building blocks of what people consume.
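To make those percentages concrete, here is a minimal sketch of the relative-exposure arithmetic behind a figure like "58% less often." The function and the counts below are hypothetical, invented for illustration; they are not taken from the study's data:

```python
# Hypothetical illustration: how a relative exposure change such as
# "news appears 58% less often" can be computed from feed observations.
# All numbers below are made up, not the study's actual data.

def relative_change(treatment_rate: float, control_rate: float) -> float:
    """Percent change in exposure under the algorithmic feed,
    relative to the chronological baseline."""
    return (treatment_rate - control_rate) / control_rate * 100

# Share of impressions that were traditional news, per condition (invented)
news_algorithmic = 0.042    # 4.2% of posts seen in the algorithmic feed
news_chronological = 0.100  # 10.0% of posts seen in the chronological feed

print(f"News exposure change: "
      f"{relative_change(news_algorithmic, news_chronological):.1f}%")
# -> News exposure change: -58.0%
```

The study's actual estimates come from comparing randomized groups, but the underlying comparison is this simple: the exposure rate under the algorithm measured against the exposure rate in the chronological baseline.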

How does this affect news visibility?

If you work in news, this should feel alarming. The study found that the algorithm dramatically reduced the presence of traditional media in feeds, which helps explain why newsroom links and reporting often fail to reach the same eyeballs that activist or entertainment posts do.

People began following different accounts after the switch

When users were moved from a chronological timeline to the algorithmic feed for seven weeks, many began following more conservative political-activist accounts. The change wasn’t mirrored in follows for liberal accounts or news outlets, which suggests the algorithm nudged people toward a particular slice of the political ecosystem.

On survey measures, those switched to the algorithm reported greater concern for conservative policy priorities such as immigration, expressed more negative views about the criminal investigations into Donald Trump opened in 2023, and held more pro-Kremlin stances on the Russia-Ukraine war. The feed produced a feedback loop, reinforcing certain messages while muting others.

Can algorithmic timelines change political views?

Yes. In this controlled setting, exposure to the algorithmic feed shifted attitudes in measurable ways. Importantly, researchers report these shifts can outlast the experiment because the algorithm changed who users followed — and who you follow often determines what you keep seeing.

Investigations and reporting had already raised alarms

Journalists and investigators had noticed patterns before the paper landed. A Sky News probe and other reporting suggested right-wing and extreme content gained visibility, and French authorities recently raided X’s local offices as part of a year-long investigation into potential algorithm manipulation to “serve a political agenda.” Musk has called that probe “politically motivated,” and Bloomberg covered the company’s response.

Where earlier stories offered pattern and suspicion, the Nature article supplies a structured, experimental look. Together, anecdote and evidence point the same way.

I’m not saying the algorithm is a conspiracy — but you should treat your feed like an editorial product, not a neutral mirror. You deserve to know that the choices inside an app can tilt what you care about and who you listen to. So what will you do the next time your timeline starts nudging your politics in one direction or another?