
Twitter’s decline into right-leaning hellsite

I quit Twitter at the start of December. I was an early adopter, joining the same year my son was born, but 15 years on it has gone from a force for good to a rage machine. I don’t want anything more to do with it.

The study looked at a sample of 4% of all Twitter users who had been exposed to the algorithm (46,470,596 unique users). It also included a control group of 11,617,373 users who had never received any automatically recommended tweets in their feeds.

[…]

The authors analysed the “algorithmic amplification” effect on tweets from 3,634 elected politicians from major political parties in seven countries with a large user base on Twitter: the US, Japan, the UK, France, Spain, Canada and Germany.

Algorithmic amplification refers to the extent to which a tweet is more likely to be seen on a regular Twitter feed (where the algorithm is operating) compared to a feed without automated recommendations.

[…]

The researchers found that in six out of the seven countries (Germany was the exception), the algorithm significantly favoured the amplification of tweets from politically right-leaning sources.

Overall, the amplification trend wasn’t significant among individual politicians from specific parties, but was when they were taken together as a group. The starkest contrasts were seen in Canada (the Liberals’ tweets were amplified 43%, versus those of the Conservatives at 167%) and the UK (Labour’s tweets were amplified 112%, while the Conservatives’ were amplified at 176%).

Source: Twitter’s algorithm favours the political right, a recent study finds | The Conversation
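To make the quoted percentages concrete: a minimal sketch of how an amplification figure can be read, assuming (as the definition above suggests) that amplification is the percentage by which reach on the algorithmic timeline exceeds reach on the reverse-chronological one. The function and the illustrative reach values are my own invention, not from the study.

```python
# Hypothetical sketch: "algorithmic amplification" read as the percentage
# increase in reach on the algorithmic timeline relative to the
# reverse-chronological baseline. 0% would mean identical reach.

def amplification_pct(algo_reach: float, chrono_reach: float) -> float:
    """Percentage by which algorithmic reach exceeds chronological reach."""
    return (algo_reach / chrono_reach - 1) * 100

# On this reading, the UK Conservatives' quoted 176% amplification means
# their tweets were seen roughly 2.76x as often as on a chronological feed,
# and Labour's 112% means roughly 2.12x (illustrative numbers only):
print(amplification_pct(2.76, 1.0))  # ~176
print(amplification_pct(2.12, 1.0))  # ~112
```

Note the study compares groups, not individuals: these figures describe parties' tweets in aggregate, which is why a gap between two parties can be significant even when no single politician's amplification is.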

Twitter acknowledges right-wing bias in its algorithmic feed

I mentioned on Twitter last week that I keep getting recommended stories about Nigel Farage and from right-wing outlets like The Telegraph.

Lo and behold, Twitter has published findings from its own investigation which found that its algorithms actively promote right-wing accounts and news sources. Now I hope it does something about it.


What did we find?

— Tweets about political content from elected officials, regardless of party or whether the party is in power, do see algorithmic amplification when compared to political content on the reverse chronological timeline.

— Group effects did not translate to individual effects. In other words, since party affiliation or ideology is not a factor our systems consider when recommending content, two individuals in the same political party would not necessarily see the same amplification.

— In six out of seven countries — all but Germany — Tweets posted by accounts from the political right receive more algorithmic amplification than the political left when studied as a group.

— Right-leaning news outlets, as defined by the independent organizations listed above, see greater algorithmic amplification on Twitter compared to left-leaning news outlets. However, as highlighted in the paper, these third-party ratings make their own, independent classifications and as such the results of analysis may vary depending on which source is used.

Source: Examining algorithmic amplification of political content on Twitter | Twitter blog

Consensus, legitimate controversy, and deviance

My go-to explanation of acceptable political opinions is usually the Overton Window, but this week I came across Hallin’s spheres:

Hallin’s spheres is a theory of media objectivity posited by journalism historian Daniel C. Hallin in his book The Uncensored War to explain the coverage of the Vietnam War. Hallin divides the world of political discourse into three concentric spheres: consensus, legitimate controversy, and deviance. In the sphere of consensus, journalists assume everyone agrees. The sphere of legitimate controversy includes the standard political debates, and journalists are expected to remain neutral. The sphere of deviance falls outside the bounds of legitimate debate, and journalists can ignore it. These boundaries shift as public opinion shifts.

Source: Hallin’s spheres | Wikipedia

I think the interesting thing right now for either theory is that most people have their news filtered by social networks. As a result, it’s not (just) journalists doing the filtering, but people in affinity groups.