Joe Kawly is the host of MBN’s podcast The Diplomat, covering the global contest for power and influence.
Artificial intelligence will steal your judgment before it takes your job. The world’s next elections won’t be lost at the ballot box but in the invisible war for your attention.
When I recently spoke with Jonathan Katz, a fellow in governance studies at the Brookings Institution in Washington DC, who studies corruption and democratic decline, he said something that stopped me cold: “AI will be the most poisonous tool ever used against democracy.” He wasn’t exaggerating. What’s coming isn’t another wave of fake news or foreign interference. It’s the collapse of trust itself.
A decade ago, the main danger came from trolls and memes. In 2020, the primary threats involved disinformation networks and deepfakes of politicians saying things they never said. But 2025 is different. The lies won’t just come from people trying to manipulate us. They’ll come from machines, generated by algorithms that can imitate anyone, anywhere, instantly.
We’ve already seen glimpses of this future. Last year’s presidential campaign in Ireland was shaken by a deepfake video showing then-candidate Catherine Connolly announcing her withdrawal from the race. Within hours, it spread across social media. Newsrooms paused coverage to authenticate it, and that hesitation itself shaped the narrative. Connolly hadn’t withdrawn, but the damage was done before the truth caught up, a preview of what happens when verification can’t keep pace with velocity.
Democracy doesn’t run on efficiency or even transparency. It runs on faith – the belief that truth exists and can be agreed upon. Once that faith collapses, facts become negotiable and power becomes self-justifying. As Francis Fukuyama, political scientist and international relations scholar, told me in a recent conversation: “Democracy depends on legitimacy, not just elections but belief in the system itself.” AI erodes that belief because it industrializes doubt. It doesn’t just spread falsehoods faster; it manufactures entire realities using our own fears and faces as raw material.
Yet most governments are still fighting the wrong battle. They treat AI as a regulatory issue, a matter of copyright, privacy, and national security. Those concerns matter, but they miss the core problem. What’s at stake is not intellectual property. It’s the boundary between truth and illusion. That line is already blurring. As Ben Buchanan and Andrew Imbrie write in The New Fire: War, Peace, and Democracy in the Age of AI (2022), the real contest isn’t just over data or computing power, but over “the control of perception itself.” They argue that artificial intelligence is redefining what power looks like in democracies and autocracies alike. That warning has now moved from theory to reality.
As Kenneth Pollack, former CIA analyst and Vice President for Policy at the Middle East Institute, recently told me: “Authoritarians have learned that control of information is more durable than control of territory.” In an age where perception can be weaponized, legitimacy itself becomes a battlefield.
Pollack has seen this play out across regions. “You see it in Russia, in China, in parts of the Middle East,” he said. “The leaders who master the story don’t need to master the truth.” AI makes that lesson infinitely scalable.
What worries me most isn’t chaos. It’s fatigue – the quiet, collective shrug that follows too many lies and too much noise. Katz put it best: “The death of democracy won’t look like a coup. It’ll look like a shrug.” People aren’t rejecting democracy because they dislike it. They’re withdrawing because the truth feels impossible to find. And that’s the environment where manipulation thrives. The worst will come not when citizens are angry, but when they’re numb.
Democracies have survived demagogues, disinformation, even war. They can survive AI too, but only if citizens recognize the threat for what it is: technological in form, but moral at its core. Those who would defend democracy need to recognize the magnitude of the challenge.
Journalists must adapt their verification tools as quickly as bad actors evolve their techniques of deception. Governments must treat information integrity as a form of national defense. The National Democratic Institute and the World Economic Forum argue that defending against disinformation in all its forms should be considered a core pillar of national resilience, on par with traditional defense priorities. (The Nordic countries, which were early to this realization, have shown how that approach can be translated into concrete policies.) And citizens, each of us, must reclaim the discipline democracy relies on: to question, to verify, to resist being seduced by what we wish were true.
If we can’t tell what’s real anymore, the winners won’t be the most persuasive or principled, but the best at lying. The next election won’t be stolen in a backroom or on a battlefield. It’ll be stolen in plain sight, one algorithmic illusion at a time.

Joe Kawly
Joe Kawly is a veteran global affairs journalist with over two decades of frontline reporting across Washington, D.C. and the Middle East. A CNN Journalism Fellow and Georgetown University graduate, his work focuses on U.S. foreign policy, Arab world politics, and diplomacy. Drawing on deep regional insight and narrative clarity, Joe makes complex global dynamics clear, human, and relevant.