
The James Bond-esque election of India has already concluded. Err…I mean the Indian Lok Sabha elections of 2024. While the result is dubbed a huge setback for the incumbent PM Narendra Modi, the arithmetic is far more nuanced than a steep fall in vote share. In fact, the BJP saw only a marginal decline: from 37.3% of the national vote in 2019, it slipped to 36.6%. That is a fall of just 0.7 percentage points, or 007 (aka James Bond), if you will.
However, this minor 0.7-point fall has disproportionately bruised the BJP’s seat tally. Unlike Bond, who lands on his feet every time he jumps from a great height, the BJP seems to have fallen on its face. By losing just 0.7% of the vote, the party crash-landed below the majority mark, dropping to 240 seats from its previous 303, a loss of 63.
In a parliamentary system like India’s, where winners are determined by first-past-the-post, minor swings around the winning threshold can create a major shakeup in political outcomes. Even in the US presidential system, we saw George W. Bush become president in 2000 although Al Gore secured more votes nationwide. The wider spread of Bush’s voters across states delivered his victory, while the concentration of Al Gore’s support in a few states contributed to his defeat. The distribution of votes matters as much as the existence of support. In modern elections, where campaigning is increasingly digital, algorithms have become crucial. In closely fought contests, even a minor tweak to a Big-Tech algorithm can alter the course of an election.
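To see why the distribution of votes matters as much as raw support, here is a minimal sketch in Python using made-up constituency numbers (nothing below is actual election data): a uniform 0.7-point swing barely dents the overall vote share but flips every marginal seat.

```python
# Toy first-past-the-post arithmetic (hypothetical constituencies, not real data):
# party A's share of the two-party vote in ten seats.
shares_a = [70, 65, 60, 58, 50.4, 50.3, 50.2, 50.1, 40, 35]

def seats_won(shares, swing=0.0):
    """Seats where party A's share, after a uniform swing, still exceeds 50%."""
    return sum(1 for s in shares if s + swing > 50)

avg_before = sum(shares_a) / len(shares_a)
print(f"vote share: {avg_before:.1f}% -> {avg_before - 0.7:.1f}%")              # 52.9% -> 52.2%
print(f"seats won:  {seats_won(shares_a)} -> {seats_won(shares_a, swing=-0.7)}")  # 8 -> 4
```

The safe seats absorb the swing; only the marginals change hands, which is how a fractional shift in vote share can translate into dozens of lost seats.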
Big Tech Algorithms — Are they the silent voters and influencers in democracies worldwide?
The EVM has been at the centre of the opposition’s accusations of vote tampering. Algorithms, which are far more powerful and subtle, have remained a distant second in India’s list of election controversies because they work behind the scenes while you are browsing or scrolling. After the 2016 US election and the Brexit referendum, the Cambridge Analytica controversy triggered a row when it was discovered that right-wing hawks and data firms had acquired Facebook data to build voter profiles.
While the incident raised privacy concerns that prompted CEO Mark Zuckerberg to issue an apology in his testimony, it also tightened Facebook’s approach to third-party apps. Algorithms, however, remained outside the realm of scrutiny, even though they can be as nefarious as data leaks.
Algorithms are the mathematical models that determine what appears on your feed. On social media, they decide what shows up in your timeline. On a search engine, they rank results based on your past searches and many other signals. On video platforms and online stores, they choose the recommended video or product that appears when you engage with another one.
In general, algorithms are optimised to capture and maximise your attention so that Big-Tech companies can increase engagement and, with it, their ad revenues. A video platform’s algorithm will therefore do its best to hook you by showing videos that appeal to your tastes, interests and biases, and social media platforms do the same with their users. No wonder every Big-Tech user sits in a sort of echo chamber, where past actions determine one’s “digital bubble” or “digital world”. It is this situation that is easy to exploit.
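As a rough illustration of that feedback loop, here is a minimal, hypothetical sketch in Python (it is not any real platform’s recommendation system): topics are recommended in proportion to past engagement, and every view feeds back into the profile.

```python
# A minimal, hypothetical sketch of an engagement-optimised feed (not any real
# platform's algorithm): topics are recommended in proportion to past engagement,
# and every view feeds back into the profile, reinforcing the user's bubble.
import random

random.seed(7)  # fixed seed so the toy run is reproducible

# A loyal pro-ABC viewer's engagement profile, with one stray anti-ABC click.
profile = {"pro-ABC": 20, "neutral": 3, "anti-ABC": 1}

def next_recommendation(profile: dict) -> str:
    """Sample the next topic with probability proportional to past engagement."""
    topics, weights = zip(*profile.items())
    return random.choices(topics, weights=weights)[0]

for _ in range(30):
    topic = next_recommendation(profile)
    profile[topic] += 1  # watching the recommendation reinforces that topic

print(profile)  # the dominant topic usually keeps growing; a stray click can snowball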
Imagine a political party named “ABC” with a loyal vote base in a particular country. What happens when a typical ABC voter, constantly browsing within a “pro-ABC bubble”, clicks on a recommended video critical of the party? While the vast majority of loyal supporters may disregard it or treat it with scepticism, a small chunk may shift their bubble without even realising it.
This shift in political allegiance is, in effect, algorithmic meddling: no political party made any effort to win over that voter; an algorithm of a borderless digital world effected the change. We have already seen how a 0.7 percentage-point shift in political mood changed the contours of the Indian government for the next five years. What if algorithms are designed to work against a political party, and this meddling changes the course of history?
It is a scary question to ask in a year when the largest number of democracies on record is going to the polls. In 2024, 64 countries (plus the European Union), representing about 49% of the world’s population, will hold elections.
The question to ponder is this: can Big-Tech algorithms influence elections and end up deciding the destiny of half the world? The simple answer is yes, they can. But nobody knows when it happens, whether it has already happened, to what extent, or in which democracy. Algorithms are opaque. They travel seamlessly across borders and are difficult to control and regulate. Yet they can be agenda-driven and can create biases.
Algorithms are revised regularly to stay ahead of digital marketers who try to exploit their knowledge of the system to the advantage of their brand. They are also so entangled with user actions that cause and effect are hard to separate: it is difficult to say whether the users’ own anti-ABC thoughts and actions led the algorithm to show those videos, or whether the recommended videos initiated the change in the first place.
Worse, judging what constitutes an “anti-ABC” video is hard and subjective. A sober debate by a political analyst examining both the good and the bad aspects of ABC fits neither mould, even though it may nudge viewers’ thinking in a particular direction. These subjective aspects are hard to factor in when investigating whether the algorithm did it or whether a well-argued political opinion swung the voter into the other camp.
Once a voter swings even slightly to the other side, we know that algorithms will show more of the same type of videos to increase engagement. Such a phenomenon is in line with a report by the Pew Research Center, which observed that YouTube recommendations point to progressively longer videos. In its random-walks study, starting from videos averaging 9 minutes 31 seconds, successive recommendations averaged 12:18, 13:32, 14:17 and 14:51. It suggests that even accidentally clicking on a two-minute anti-ABC video will likely lead the algorithm to serve longer videos of similar content. A die-hard ABC supporter can thus slowly be nudged into the anti-ABC camp.
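To make that escalation concrete, the short Python snippet below walks through those step-wise averages (read here as minutes and seconds) and computes the overall increase.

```python
# Average recommended-video lengths at each step of the Pew random-walk study,
# as cited above, interpreted as minutes:seconds.
lengths = ["9:31", "12:18", "13:32", "14:17", "14:51"]

def to_minutes(mmss: str) -> float:
    minutes, seconds = mmss.split(":")
    return int(minutes) + int(seconds) / 60

for step, raw in enumerate(lengths):
    print(f"step {step}: {raw} ({to_minutes(raw):.1f} min)")

increase = to_minutes(lengths[-1]) / to_minutes(lengths[0]) - 1
print(f"overall increase across four recommendations: {increase:.0%}")  # ~56%
```

Each successive recommendation runs longer than the last, roughly a 56% rise in watch length over four hops, which is exactly the kind of escalating pull on attention described above.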
OpenAI, the creator of ChatGPT, recently revealed that it acted within 24 hours to disrupt “deceptive” use of AI aimed at influencing the Indian elections. Its language models were being used to generate comments, articles and social media profiles targeting the BJP. While such outright deception is relatively easy to detect, algorithmic influence is far subtler to uncover.
Moreover, India is still at the mercy of the tech giants to prevent such deception or even to detect it. Algorithms need to be transparent so that Big-Tech companies, and the nations where they are headquartered, do not wield outsized influence on the world. Policies and legal frameworks must ensure that humans cannot tamper with or tilt algorithms to suit a political agenda. Democracies across the world should enforce algorithmic transparency on Big-Tech players so that the sanctity of elections is preserved.
—The author, Ankush Tiwari, is Founder and CEO of pi-labs, a technology leader in AI and cybersecurity. The views expressed are personal.