

The Rise Of AI Deception In Elections: Deepfakes, Voice Cloning, And Robocalls Pose New Challenges

Amid the Lok Sabha elections, the use of AI-powered deepfake videos, voice cloning, and personalised robocalls has emerged as a potent tool for political parties, raising concerns about voter deception and the need for ethical guidelines.


Ahead of the Delhi Assembly elections in 2020, the Bharatiya Janata Party shared videos of party leader Manoj Tiwari speaking in English and Haryanvi, picking holes in the promises made by the rival Aam Aadmi Party and appealing to supporters to cast their votes for the BJP. The clips, shared in around 5,800 WhatsApp groups, quickly became popular forwards, with few realising that Tiwari is not fluent in the languages he speaks in the videos.

It was the first time the BJP had used artificial intelligence to imitate Tiwari’s voice and generate deepfake videos. “We were merely at the beginning stage of AI seeping into elections, and it was going to change how campaigns work,” Sagar Vishnoi, a political consultant who debunked Tiwari’s deepfake video, told Outlook.

Now, with the Lok Sabha elections gripping the entire nation, digital companies and political consultancies are advertising “personalised synthetic media,” “voice cloning for election campaigns,” and bespoke services to win “political battles.”

Divyendra Singh Jadoun, founder of Polymath Synthetic Media Solutions, told Outlook that deepfaked calls, in which a person’s voice is imitated using AI tools, are more dangerous than digitally altered videos, as no guardrails exist for synthetically generated robocalls that imitate another person’s voice.

“The only thing we can do is play a message at the start, telling recipients that the call is not real, but AI-generated,” Jadoun told Outlook. However, such disclosures may not work for average voters in rural or semi-urban areas who are unaware of concepts like deepfakes, synthetic imitation, and generative AI.

Sumit Savara, a political consultant and marketing chief at Earn up Consultants, agreed there are ethical concerns about disclosures on automated robocalls, but said personalised messages from local politicians during election season were making a huge impact on the ground, as they are economical and have a wider reach.

“If a local leader calls and addresses a voter by their name, people are more likely to pay attention,” he said. Savara, who has offered an automated call service to a political party in Rajasthan, added that it costs about two rupees per personalised call for a batch of 10,000 voice calls.

Sasidharan Ayyavu, CEO of Madurai-based IndiaSpeaks Research Lab, which recently resurrected deceased politician J Jayalalithaa as a deepfaked video avatar, agreed that personalised calls in a political leader’s voice add an element of deep emotional connection with voters. Although rules and regulations on voice cloning remain unclear, he said the use of automated calls had yielded impressive results in Tamil Nadu as well as northern states.


In the absence of policy guidelines and mandatory disclosures, AI-powered voice calls during elections could be used to deceive digitally illiterate voters. “While personalised communication can enhance voter engagement, there's a very fine line between informing voters and deceiving them,” Elle Farrell-Kingsley, an AI ethics expert who has worked with the European Commission's Scientific Advice Mechanism (SAM), told Outlook.

Selmer Bringsjord, Director of the AI & Reasoning Lab at the Rensselaer Polytechnic Institute, believes that in future AI-driven political calls will get “much deeper, devious, and resistant to probing.” Currently, there is no scope for a two-way conversation between the target voter and the AI imitation of a leader on a phone call. But that could change soon.

Jadoun is experimenting with systems that would allow back-and-forth natural language conversations. It is a complex pipeline: the voter’s voice responses on the call are transcribed into text, the text is fed to an AI model to generate a reply, the same way you would talk to ChatGPT, and that reply is then turned into audio in the cloned voice of a political leader.
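What such a pipeline looks like can be sketched in a few lines of code. The Python below is illustrative only, not Jadoun’s actual system: transcribe, generate_reply, and synthesize_cloned_voice are hypothetical placeholders standing in for whichever speech-recognition service, language model, and voice-cloning text-to-speech engine a vendor wires together.

    # Illustrative sketch, not the deployed system. Each function body is a
    # placeholder; a real pipeline would call an ASR service, an LLM API, and a
    # voice-cloning TTS engine at the marked steps.

    def transcribe(voter_audio: bytes, language: str) -> str:
        """Step 1: convert the voter's spoken reply into text (placeholder ASR)."""
        return "What will you do about the water supply in our village?"

    def generate_reply(voter_text: str, language: str) -> str:
        """Step 2: draft a reply in the leader's persona, the way one would
        prompt ChatGPT (placeholder language-model call)."""
        return "Thank you for raising this; our manifesto addresses it directly."

    def synthesize_cloned_voice(reply_text: str, voice_profile: str) -> bytes:
        """Step 3: render the reply as audio in the leader's cloned voice
        (placeholder text-to-speech)."""
        return reply_text.encode("utf-8")  # stands in for synthesized audio bytes

    def handle_voter_turn(voter_audio: bytes, language: str, voice_profile: str) -> bytes:
        """One conversational turn: voter audio in, cloned-voice audio out."""
        voter_text = transcribe(voter_audio, language)
        reply_text = generate_reply(voter_text, language)
        return synthesize_cloned_voice(reply_text, voice_profile)

The practical hurdle, as the next paragraph notes, is keeping the combined delay of these three hops short enough to feel like a natural conversation.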


Right now, the whole system has only been deployed as a test in a small block in Rajasthan. Jadoun says they are trying to bake in support for over 100 Indian languages and have already reduced the delay between human and machine response to around 300 milliseconds. A robocalling system capable of human-like conversation in the cloned voice of a leader could be extremely potent, making it difficult for an average person not to be influenced.

Experts said political parties are likely to increasingly use AI-generated deepfake videos and robocalls as an effective mechanism to reach voters, considering that elections are held every few months in one corner of the country or another. For contesting candidates, scoring electoral victories is a matter of prestige, and each poll is fought as a do-or-die battle at every level, be it the local panchayat, the state assembly, or the general elections held every five years.


Against this background, Irina Tsukerman, a US national security lawyer and member of the American Bar Association's Science and Technology Section, said there was an urgent need to educate end users on the signs of robocalls. “Pushing technology that facilitates identification of such calls will probably limit the impact of the use of voice cloning at least temporarily.”

As demand for AI ramps up in political circles to reach the maximum number of potential voters quickly and cheaply, experts stressed that the onus of ethical discretion would rest with the consultancies offering these services. But if the history of conflict between Big Tech regulation and profits is any indication, they also lamented, Indian voters are likely staring at ominous times ahead.

  • In April, Microsoft warned about deepfaked audio calls in the voice of Taiwanese presidential candidate and billionaire founder of Foxconn, Terry Gou, in which he was heard endorsing his election rival.

  • Moldova — a European nation with a population smaller than most Indian metro cities — was hit by a state-sponsored deepfake video targeting President Maia Sandu. She debunked the video on Facebook, but warned that more such attacks would keep coming.

  • In May, an AI-altered video of American rapper Eminem surfaced online in which he was heard bashing South Africa’s ruling African National Congress (ANC) and endorsing a rival party.

  • In Bangladesh, deepfaked media of political leaders, including Rumeen Farhana, Nipun Roy, and Rashed Iqbal Khan, went viral last year with the intent of character assassination. In Indonesia, deepfaked media of presidential candidates has circulated in the past few months.

  • The most high-profile misuse of AI to influence an election came in January this year, when a deceptive robocall imitating the voice of US President Joe Biden urged people to refrain from voting. Steve Kramer, the political consultant behind the stunt, was indicted in May and slapped with a $6 million fine for intent to defraud voters. Notably, the FCC banned the use of generative AI in robocalls merely a month after the Biden deepfake call reached voters in New Hampshire.


(Nadeem Sarwar is an independent tech journalist based in Delhi)
