AI tools help campaigns become more effective for the candidates you like — and those you don’t
Originally published on Global Voices

Illustration by Tactical Tech, with visual elements from Yiorgos Bagakis and Alessandro Cripsta, used with permission.
This post is part of Global Voices’ April 2026 Spotlight series, “Human perspectives on AI.” This series will offer insight into how AI is being used in global majority countries, how its use and implementation are affecting individual communities, what this AI experiment might mean for future generations, and more.
This article was written by Safa Ghnaim in collaboration with Goethe-Institut Brazil and originally published on DataDetoxKit.org. An edited version is republished by Global Voices under a partnership agreement.
During an election, it can seem like political candidates are everywhere, all the time, speaking to every topic you care about. But how do politicians and other influencers seem to know what messages you want to see and hear — and even when and where you want to see them?
The smartphone in the palm of your hand might seem like a crystal ball sometimes. But just like a tricky psychic uses clues about you to tell you what you want to hear, political parties can use your personal data to target you with the messages that are most likely to persuade you. Personal data has been used in elections around the world, including in Chile, Georgia, India, Italy, Kenya, Malaysia, Nigeria, and the United States.
Today, companies are experimenting with artificial intelligence (AI) tools to help campaigns make their messaging and targeting even more effective, convincing, and persistent — for the candidates you like — as well as those you don’t.
Who has the power to sway?
We’re constantly getting direct and indirect messages, nudges, and hints, both online and off, that can shift our opinions, behaviors, and actions. Influence can be exercised in many ways; it’s not necessarily a bad thing, but it’s important to be aware of it.
While the title of “influencer” commonly refers to social media personalities who have large followings (like on Instagram, TikTok, and YouTube), many others have the power to influence. Influencers can also be public figures, such as celebrities and newscasters. Their influence can be seen and experienced through their choice of words, clothing, and imagery, as well as which stories they prioritize and give attention to.
“Influence” happens not only on social media apps like Instagram and TikTok, but also through the display of top results on search websites like Google, recommendation systems promoting the next video or post, and the focus of certain news headlines in the media. Influential messages can also be shared on popular chat apps like WhatsApp.
Politicians are another type of “influencer” with major sway, especially when they’re running for office. Political campaigns invest large sums of money to reach potential voters, so much so that an entire industry exists to help them identify and target specific groups. In fact, there are over 500 documented companies working in the field of technology-driven political persuasion. They sell their services to politicians and political campaigns, claiming they can help influence your opinions, and your vote.
Companies that work in political persuasion are currently experimenting with AI at scale to see how they can make traditional influencing methods even cheaper, faster, more convincing, and more automated. This makes their work even more opaque and harder to regulate. Business is booming, and the financial rewards are massive.
Using AI to catch people’s attention
It may be no surprise to you that the same techniques used to sell you products are also used to get your vote. But you might be surprised how many digital methods are involved, since they’re so hard to recognize. Information can be gathered from the terms you type into a search engine like Google or Bing. With tools like ChatGPT, the longer and more detailed prompts people write can reveal even more about who you are and what you care about.
Businesses that work in political influence have been developing new techniques to sway voters, raise money, and get out the vote. Now, with AI, there is a lot more to optimize. The “influence industry” commands a growing set of AI-powered persuasion tools that become more effective and more numerous over time, reaching millions of people. By familiarizing yourself with a few of these methods, you can better understand how the industry does this work on a mass scale.
The methods of political persuasion change and grow as technologies advance. Here are just a few of the methods for you to explore.
AI may be used to create content targeted at you: Rather than reaching large groups of people based on general commonalities, AI-powered tools make it possible to automatically micro-target individuals with messages tailored to what each person finds most relatable.
AI may be used to do the work of political parties: Some companies are advertising AI tools to political parties to write speeches for candidates, messages to voters, and even entire campaign strategies. While this can make producing messages more “efficient,” it also raises concerns, such as whether there is human oversight of these new tools and whether they open the door to interference.
AI may be used to make a political candidate look more attractive, young, and fun: “Digital avatars” of a candidate can be generated with just a small sample of their voice and image, and then can be used to create tailored messages that address a potential voter by name, or in their native language or dialect. These AI-generated messages can be so convincing that they blur the line between the “real” politician and their avatar.
What do you think about politicians using AI to do their work and reach voters? What kind of information or labels would you want to help you know what’s real and what is the work of AI?
Persu(ai)sive performances
Political candidates or their teams using social media platforms and AI tools in their campaigns is not just a futuristic vision; it is a current practice. 2024 was a big year for politics, with many elections around the world, and the use of AI in political campaigns made headlines. Here are several instances.
A US congressional candidate posted a video on his social media that sounded like the late Dr. Martin Luther King Jr. was endorsing him, but it was an AI-generated voice clone, and it drew backlash. AI-generated images of the popular US singer Taylor Swift supporting Donald Trump’s bid for the presidency were shared by Trump himself. An AI-generated robocall impersonating US President Joe Biden advised thousands of voters against voting in the primary elections, causing a great deal of confusion until it was revealed to be a hoax.
Although Pakistani politician Imran Khan was in prison and barred from running for office, his campaign team used old footage and voice-cloned audio to make it seem as though he was giving speeches. The videos were labeled “authorized AI voice,” though it isn’t clear whether he had a hand in writing or reviewing the speeches. In Indonesia’s 2024 elections, AI-generated digital avatars took center stage, especially in capturing the attention of young voters. Candidate Prabowo Subianto used a cute digital avatar across social media, including TikTok, completely rebranding his public image and winning the presidency, despite having been accused of committing major human rights abuses.
These are just a few examples of how AI is shaping political influence around the world. Has your local politician used AI in their campaigns?
Look behind the curtain
Just because content created by generative AI looks and sounds realistic doesn’t mean that it is. If you see something online or in your feed that’s shocking, strange, or especially out of the ordinary, generative AI tools may have been used to create or tamper with it, even if that’s hard to tell with the naked eye.
Online images, videos, and texts that make you feel intense emotions like fear, disgust, awe, anger, and anxiety are most likely to go viral. This highly emotional content is also an effective way to get clicks and spread misinformation — and AI tools can help boost that virality. Pay attention to your reactions and take these feelings as a hint that you need more time to verify if what you’re seeing or reading is legitimate. For example, if you see a video of a political candidate doing or saying something that raises your alarm bells, do some extra research to see if it’s authentic or whether it may have already been debunked as AI-generated misinformation.
You can rely on certain global bodies, such as the International Fact-Checking Network, to find out which sources take extra care to verify the information they publish. On the network’s Signatories page, search for your country to see which sources made the list.