Science & Technology

Living inside social media echo chambers: Ideological conditioning through information bias

Amid the rampant misinformation that circulates through the internet, most users fall into an ideological pitfall: They end up building strong biases and prejudices

By Abhijit Rajkhowa
Published: Friday 04 September 2020

Throughout history, ideology has been used to justify wars and socio-political sanctions — be it puritanical opposition to orthodox Christianity or Marxist opposition to capitalist ideologues. Ahmad Shah declared jihad against the Marathas and Napoleon went on an expansionist rage in the name of spreading the ideals of the French Revolution.

World Wars I and II were no less clashes between ideologies and their anti-ideologies than they were wars between the Allied and Axis powers.

Even though ideological developments pre-dated his period, it was Destutt de Tracy who first coined the term ‘ideology’ to mean ‘science of ideas’, drawing on the works of John Locke and Condillac. Ideology is now understood as a set of beliefs or ideas, mostly political in nature.

The medium of ideological propagation today is the virtual world of the internet. The internet’s enormous size, speed and reach make it all happen. Niall Ferguson, a historian associated with Stanford University, claimed there has been no more disruptive technology than the internet since the advent of the printing press.

The printing press enabled Martin Luther’s sermons to rapidly spread throughout Europe, but it also enabled the printing of Malleus Maleficarum (1486), which endorsed the extermination of witches as the only cure for witchcraft. This raises the question: What present or future does the internet hold?

Virtual engagement and its many challenges

Facebook, Twitter, YouTube and all such social media platforms design their algorithms to maximise user engagement and in turn, maximise the amount of time users spend on their platform.

According to publicly available information, Twitter uses a ‘relevance score’ based on the tweet, its author and the user to develop a personalised feed for each user.

This score may draw on multiple sub-factors, such as the number of interactions on that tweet (likes and shares), how often the user interacts with that particular author and how well the content matches the user’s previous interactions.

Similarly, Facebook uses ‘ranking signals’ based on multiple inputs such as the type of media on the post, its popularity and how often the user interacts with similar posts. Somewhat similar criteria are used by YouTube to rank videos that a user may like based on its popularity and the user’s viewing history.
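The public descriptions above can be illustrated with a minimal scoring sketch. The signal names, weights and formula here are invented for illustration; the platforms’ actual ranking systems are proprietary and far more complex:

```python
# Hypothetical illustration of engagement-based feed ranking.
# All signal names and weights are assumptions for this sketch,
# not any platform's real formula.

def relevance_score(post, user):
    """Combine global popularity with personal-affinity signals."""
    popularity = post["likes"] + 2 * post["shares"]  # global engagement
    affinity = user["interactions_with_author"].get(post["author"], 0)
    topic_match = user["topic_interest"].get(post["topic"], 0.0)
    # Personal signals are weighted far above raw popularity,
    # which is what personalises the feed.
    return popularity * 0.01 + affinity * 1.0 + topic_match * 5.0

def rank_feed(posts, user):
    """Return posts ordered by descending relevance for this user."""
    return sorted(posts, key=lambda p: relevance_score(p, user), reverse=True)

user = {
    "interactions_with_author": {"alice": 12, "bob": 1},
    "topic_interest": {"politics": 0.9, "sports": 0.1},
}
posts = [
    {"author": "bob", "topic": "sports", "likes": 900, "shares": 100},
    {"author": "alice", "topic": "politics", "likes": 30, "shares": 5},
]
feed = rank_feed(posts, user)
```

In this toy example, the far less popular post from a frequently engaged author on a favoured topic outranks the globally popular one — the essence of a personalised feed.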

The algorithms are well-intentioned, attempting only to maximise the time a user spends on a particular platform, but in the end, these are algorithms, and they come with inherent vulnerabilities.

Regular policing of all posts is not possible due to the sheer size of these platforms. According to the most recent available statistics, there were more than 500 million tweets per day in 2014, and YouTube received over 500 hours of video every minute in 2019.

Although these companies have strict policies on violations, the final call on any infringement takes hours and, at times, comes only after the damage is done.

In late July 2020, a Facebook video that claimed that hydroxychloroquine is “a cure for the novel coronavirus disease (COVID-19)” and “you don’t need a mask” to slow the spread of the disease got over 20 million views before it was taken down.

The recent Bengaluru riots started from a Facebook post; they could have been avoided had the post been taken down in time.

Another systemic issue is the use of fake accounts and bots that rapidly spread information across a platform. Facebook took down nearly 2.8 billion fake accounts between September 2018 and September 2019.

In May and June 2018, Twitter removed over 70 million fake accounts. There are loopholes in the algorithms which, even after continual updates, allow the creation of fake and bot accounts whose intentions are more often than not malicious.

Certain human biases are also projected onto the algorithms, creating what is commonly known as ‘algorithmic bias’, but that is a story for another time.

Ideological reinforcement

Most users have some form of ideological inclination, albeit of varying intensity, and the internet has invariably turned into an enabling agent that helps them actively seek out other users who subscribe to the same ideology. Thereafter, information that conforms to their ideological standards circulates within these groups of the like-minded.

Posts and tweets from authors within these groups surge with active engagement. Bots and fake accounts are created to propagate these ideas even further. Without human interference and regulatory policing, the algorithms perceive these as popular posts and flood the feed of the user group.

The more users view ideologically biased content on social media, the more the platform serves such content back to them. Thus, the inherent vulnerabilities and loopholes create a positive feedback loop of information flow that reinforces users’ ideological inclinations.
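The feedback loop described above can be sketched as a toy simulation. Every parameter here — the initial share of aligned content, the gain factor, the engagement model — is an illustrative assumption, not a measured value:

```python
# Toy model of the reinforcement loop: each round, users engage
# more with ideologically aligned content, and the recommender
# responds by increasing its share in the next feed.
# All parameters are illustrative assumptions.

def simulate_feedback(rounds=10, initial_share=0.2, gain=0.3):
    """Return the fraction of aligned content in the feed per round."""
    share = initial_share
    history = [share]
    for _ in range(rounds):
        engagement = share  # assume engagement scales with aligned share
        # The recommender nudges the share upward in proportion to
        # engagement, capped at 100% of the feed.
        share = min(1.0, share + gain * engagement * (1 - share))
        history.append(share)
    return history

trajectory = simulate_feedback()
```

Because each round’s increase is proportional to the previous round’s engagement, the aligned share only ever grows — a simple picture of how a feed drifts toward an echo chamber without any single dramatic intervention.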

Combined with the rampant misinformation that circulates through the internet, this drives most users deep into an ideological pitfall. They end up building strong biases and prejudices, and once their alignments and beliefs are strong enough, fact-checks and material contrary to those biases are themselves dismissed as the misinformation. Such is the irony.

What lies ahead?

Media platforms and social media apps are proliferating rampantly in our times. Apart from ideological conditioning, there are legitimate concerns that these platforms are used to interfere in electoral politics and to share anti-social, vulgar and offensive content. Certain giants such as Facebook are also being investigated for antitrust violations and the use of monopolistic trade practices to intimidate rivals.

At this stage, it must be understood that the internet and social media are a double-edged sword. Black Lives Matter and the pro-democracy protests in Hong Kong got their due share of recognition only through social media.

The difficulty arises when narratives are twisted and manipulated to suit the needs of a few, and pumping in money enables even more manipulation. One such example is Soul Publishing, based out of Cyprus, which at one point ran over 35 channels on YouTube.

Its most popular channel, “5-Minute Crafts”, has the third-highest following on YouTube. It posts highly questionable videos on ‘everyday tricks’ that have been described as “bizarre” and “cringey”. Soul Publishing exploited algorithmic loopholes to earn tens of millions and, supposedly, to spread politically curated content.

There is also the chance that human error or wilful misconduct may creep into the system, as has been alleged by the Wall Street Journal. The Parliamentary Standing Committee on Information Technology, led by Member of Parliament Shashi Tharoor, has issued summons to Facebook over these allegations.

Now is the right time to fundamentally change our engagement with the internet, and with social media in particular.

The IT Panel would be well within its rights to expand this investigation to consider all facets of the issue, including algorithmic loopholes and biases, in order to build an internet that guarantees individual freedom, liberty and privacy without compromising on the ethics of social media and its rampant use to spread ideological prejudices.

With inputs from Tameem Salman, PhD candidate at IIT Hyderabad

Abhijit Rajkhowa is a mechanical engineer with interest in public policy 

Views expressed are the author’s own and don’t necessarily reflect those of Down To Earth

