Columns
Algorithms, politics and democracy
The algorithms that shape politics must be tamed to preserve the integrity of elections.
Bimal Pratap Shah
Algorithms played a significant role in shaping the outcome of the 2024 US election. On platforms like X (formerly Twitter), Facebook and TikTok, these invisible systems have evolved from tools meant to connect people into powerful forces that determine what people see, believe and, ultimately, how they vote. If we do nothing to curtail their impact, we risk losing the democratic principles that underpin our society.
The recommendation algorithms that control our social media feeds are designed to keep people engaged, but they have become extremely sophisticated and now play a central role in shaping public discourse. If a post goes viral, the algorithms decide that more people should see it. Most people are unaware that an invisible force is shaping political conversations and decisions.
As the election approached, these algorithms didn’t treat all content equally. Platforms like X and Facebook seemed to reward posts that stir anger, fear or outrage. Unlike considered perspectives, such posts spread like wildfire, inflaming online public discourse. The algorithms widened the gap in an already divided political climate in the US and made it harder for people to find accurate, truthful content.
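To make this mechanism concrete, here is a minimal, purely illustrative sketch of an engagement-weighted ranking score in Python. The weights, the field names and the assumption that emotionally charged reactions count for more than ordinary likes are invented for illustration; they do not describe any platform’s actual formula.

```python
from dataclasses import dataclass

@dataclass
class Post:
    likes: int
    shares: int
    angry_reactions: int
    comments: int

def engagement_score(post: Post) -> float:
    """Illustrative ranking score with hypothetical weights.

    Interactions that signal strong emotion (shares, angry reactions) are
    weighted more heavily than ordinary likes, so outrage-heavy posts climb
    the feed even when total interactions are the same."""
    return (
        1.0 * post.likes
        + 3.0 * post.shares
        + 4.0 * post.angry_reactions
        + 2.0 * post.comments
    )

# Two posts with the same total number of interactions (1,000 each):
measured = Post(likes=900, shares=50, angry_reactions=10, comments=40)
outrage = Post(likes=300, shares=300, angry_reactions=300, comments=100)

print(engagement_score(measured))  # 1170.0
print(engagement_score(outrage))   # 2600.0, ranked far higher in the feed
```

Under these assumed weights, the outrage-heavy post outscores the measured one by more than two to one despite identical engagement volume, which is the dynamic the column describes.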
The influence of AI, however, went beyond ranking algorithms. During the campaign, generative AI tools capable of creating hyper-realistic deepfake content designed to mislead voters flooded social media. Even fact-checkers could not keep up with the speed of these techniques. To make matters worse, the recommendation algorithms amplified the viral posts, further confusing voters.
Traditionally, social media platforms were seen as favouring Democrats; after Elon Musk acquired X, the platform’s algorithm became more transparent but also more unpredictable. Right-leaning posts began to receive greater visibility, sparking concerns about social media platforms’ role in shaping political discourse. Worse, Musk’s own tweets and retweets, themselves often right-leaning, were amplified by the platform in a country with an extremely polarised political atmosphere.
This algorithm-driven amplification of partisan content in pursuit of engagement influenced how people voted. Political advertisements, viral posts and fake news stories were tailored to specific groups based on psychological profiles, reinforcing biases and fears. As a result, it became extremely hard for ordinary people to have rational, cross-ideological dialogue. Simply put, people were flooded with information and could not make rational decisions.
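As a rough illustration of the kind of profile-based targeting described above, the sketch below routes hypothetical voter profiles to different message variants. The profile fields, thresholds and ad labels are all assumptions made up for this example; they are not drawn from any real campaign or platform.

```python
# Purely illustrative sketch of profile-based message targeting.
# The profile fields, thresholds and ad variants below are invented
# for illustration and do not describe any real system.

def pick_ad_variant(profile: dict) -> str:
    """Route a voter profile to the message variant judged most likely to engage it."""
    if profile.get("anxiety_about_economy", 0.0) > 0.7:
        return "fear-framed economic ad"
    if profile.get("distrust_of_institutions", 0.0) > 0.7:
        return "anti-establishment ad"
    return "generic positive ad"

voters = [
    {"id": 1, "anxiety_about_economy": 0.9, "distrust_of_institutions": 0.2},
    {"id": 2, "anxiety_about_economy": 0.3, "distrust_of_institutions": 0.8},
    {"id": 3, "anxiety_about_economy": 0.1, "distrust_of_institutions": 0.1},
]

for voter in voters:
    print(voter["id"], pick_ad_variant(voter))
# 1 fear-framed economic ad
# 2 anti-establishment ad
# 3 generic positive ad
```

Even this toy version shows the effect the column warns about: each group sees only the message engineered for its fears, so no two audiences are reasoning from the same information.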
The global reach of social media platforms further complicated matters. These platforms don’t serve only domestic audiences in the United States; foreign actors also use them to influence voters. Before the election, US intelligence officials reported that adversaries from Russia and elsewhere had deployed AI-driven tools to manipulate voters in key swing states. Deepfake videos flooded the internet, exploiting the same algorithms designed to amplify viral content and making disinformation campaigns more effective.
There have been efforts to curb misinformation, but their impact has been limited. Social media platforms began using AI tools to detect and limit misleading content, but these tools failed to recognise new forms of deception, especially as bad actors quickly refined their tactics. As AI-generated content becomes more sophisticated, distinguishing fact from fiction will only get harder.
The most troubling part of this new reality is the growing sense of powerlessness it creates. Already overwhelmed by the constant flow of information, voters find themselves at the mercy of algorithms that shape their worldview. The foundation of democratic decision-making erodes when misinformation spreads unchecked. In this environment, the line between truth and falsehood blurs, leaving voters to navigate a political landscape that has become more a maze than a space for rational debate.
Fixing this problem of algorithmic manipulation in politics is challenging. Advocates are calling for platforms to disclose how their recommendation systems work and to take responsibility for the negative consequences. However, transparency alone might not be enough: AI is evolving and becoming more complex, which means even full transparency could leave many questions unanswered.
Another suggestion is for governments to regulate social media algorithms. But this, too, comes with significant challenges, as it may stifle innovation or inadvertently create political biases, making it harder for people to express their views freely. Striking the right balance between protecting the integrity of public discourse and preserving freedom of speech will be a major challenge in the future. Nepal should start thinking along these lines.
The issue ultimately comes down to voter responsibility. To combat the effects of algorithmic manipulation, people must exercise good judgment about the information they consume. Digital literacy, the ability to understand how algorithms work and how they shape what we see, will be crucial in future elections. Voters must recognise how algorithms influence their perceptions and critically evaluate the information that floods their feeds. As Dr Srinivas Mukkamala, a cybersecurity expert, notes, “The first step to regaining control is understanding the game that is being played.”
The 2024 US election was a stark reminder of the dangers of the growing influence of algorithms on the political system. Moving forward, the role of social media and AI in shaping democracy will only continue to grow. If we want to preserve the integrity of our elections and ensure that public debate remains grounded in truth, the algorithms that shape our political landscape must be tamed. The question, however, isn’t just who controls these systems, but how we, as a society, choose to manage and mitigate their influence for the good of democracy.