Nigerian elections used to be decided by who had more buses, more money, or more muscle. Now, it’s about who can flood social media first. You don’t need thugs to intimidate anymore; you just need a convincing deepfake and a few paid amplifiers.
Across Sub-Saharan Africa, AI is already rewiring the incentive landscape of elections.
I study how politicians rewire power when institutions are weak: why they defect, why they fight, why they risk violence when it seems like the best option. What I'm seeing now is a fundamental change, with AI propaganda influencing how politicians behave come election time. Violence doesn't need guns anymore; it just needs bandwidth.
The New Incentive Landscape
Rational choice theory tells us politicians act when the expected payoff outweighs the cost. Traditionally, violence was a last resort: high-risk, high-visibility. But AI propaganda is beginning to change that calculus.
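The logic can be sketched as a simple expected-utility comparison. This is an illustrative formalization of the argument above, not a model drawn from the literature; the symbols are my own shorthand:

```latex
% Illustrative sketch: a politician adopts strategy s when expected gain exceeds cost.
% p(s): probability that strategy s swings the election
% B:    value of holding office (access to elite networks and resources)
% C(s): cost of the strategy (money, manpower, legal and reputational risk)
\[
  \text{adopt } s \quad \Longleftrightarrow \quad p(s)\,B - C(s) > 0
\]
% Violence: C(s) is high and visible, so it historically stayed a last resort.
% AI disinformation: C(s) collapses toward zero, so even a strategy with a
% modest p(s) now clears the threshold -- and more actors find it rational.
```

The point of the sketch is not precision but direction: when C(s) falls for deceptive strategies while staying high for physical ones, the equilibrium mix of tactics shifts toward disinformation.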
Disinformation used to require money, manpower, and time. Now, it’s as easy as prompting a model. Deepfakes collapse the cost of deception. Synthetic voices and images spread faster than any physical campaign. A well-timed fake video can tank credibility, incite panic, or demobilize a voter base, all without a single bullet being fired or a ballot box being snatched.
One of the major themes of my doctoral studies has been how political defections reshape the geography of violence in Nigeria. When elected politicians abandon old alliances and form new ones, how do these shifts translate on the ground during elections? For example, my research has found that Nigerian governors and senators defect from ruling political parties into smaller, institutionally weaker parties. This allows them to gain leverage and engage in dealings with private sector actors wary of the central government. This process hinges on securing electoral victory, the gateway into elite power structures and resources. Unfortunately, Nigeria and many other African states have also experienced devastating, lethal instances of electoral violence due to these power dynamics.
What AI propaganda introduces into the mix is another cost-effective tool to manipulate electoral outcomes. Politicians no longer have to buy loyalty through patronage or intimidation; they can now manufacture it.
Violence Without Blood
The real danger isn’t that AI will tell lies. It’s that it will tell them plausibly. When everyone can fake everything, credibility becomes just another weapon.
As DUBAWA recently reported in "AI-generated disinformation looms ahead of Nigeria's 2027 elections" (Oct 21 2025), deepfakes and synthetic media have already entered the electoral landscape, as early as the 2023 Nigerian elections. AI-generated imagery portrayed acts of violence against presidential candidates, most likely intended to stir up tension and instability in an already fragile political society.
In systems where institutions are already struggling, this is revolutionary. Disinformation creates a fog of politics, uncertainty so dense that it paralyzes action. People stop trusting journalists, then elections, then each other. It’s not open violence, but it creates the same outcome: fear, withdrawal, and disengagement.
For a political actor, that’s perfect. Chaos becomes strategic. When truth collapses, the one who controls confusion controls the field.
Rational Choice in the Algorithmic Age
The heart of rational choice theory is incentives, and AI is rewriting them.
When propaganda can be automated, the marginal cost of destabilizing a community drops to almost zero. That means more actors, more attempts, more chaos. It’s not that every politician suddenly becomes malicious; it’s that the equilibrium shifts. The rational decision in an AI-saturated environment might now be to destabilize, rather than stabilize.
AI doesn’t replace violence. It makes inciting violence (informational, reputational, psychological) strategically viable. This shift is already warping voter behavior across many African democracies, where disenfranchisement runs deep.
That’s why AI governance can’t just be about regulating data or training sets. It has to be about redefining political rationality, understanding how actors will exploit this new terrain of incentives.
What Comes Next?
While AI has proven itself a powerful tool in nearly every sector, we still need to reckon with its drawbacks and its unknown potential. The next Nigerian elections are set for 2027, and politicians' behavior and their use of AI will be under the magnifying glass. Until then, monitoring the use of AI propaganda in African elections, and around the globe, will be crucial to understanding its effects on governance and stability in vulnerable democracies.