We live in an era where politics is no longer played only in Congress, television or the street. Today, much of the battle is fought on terrain invisible to many: social networks and the algorithms that govern them.
But let’s not fool ourselves: information manipulation is not a new phenomenon. What has changed is the scale, speed and precision with which it can now be executed.
📜 Disinformation: a weapon as old as politics
History is full of examples where information was used as an instrument of power:
Roman Empire: Julius Caesar and his political rivals used pamphlets and rumors to win the favor of the people. The narrative was as important as military victory.
Middle Ages: the Church and monarchies controlled what could be read and published. The printing press, invented by Gutenberg in the 15th century, multiplied that capacity: it enabled the spread of the Protestant Reformation, but also of the Counter-Reformation, showing that every revolution in information opens paths to both freedom and manipulation.
Yellow press in the 19th century: sensationalist newspapers in the U.S. manipulated facts to push public opinion toward war with Spain in 1898. The phrase “You furnish the pictures and I’ll furnish the war”, attributed to William Randolph Hearst (though likely apocryphal), shows to what extent the press could shape reality.
20th Century:
- In Nazi Germany, Joseph Goebbels, the Minister of Propaganda, showed how mass media could shape the perception of an entire nation.
- During the Cold War, both the United States and the Soviet Union built propaganda machines to install narratives and weaken the adversary.
In each era, information was power, and those who knew how to control it also controlled the masses.
🤖 What changed with technology
Today the difference lies not in the intention (to convince, manipulate, dominate) but in the tools:
- Speed: a rumor once took days to take hold; today, a hashtag can reach millions in minutes.
- Scale: past propaganda depended on printing presses or radios; today, bot farms distributed in the cloud are enough.
- Precision: before messages were general; today, thanks to algorithmic segmentation, each person can receive a political message tailored to their psychological profile.
In other words: politics went from speeches in the public square to algorithms that decide what we see in our feed.
🛠️ How digital manipulation works (simply)
Bots: programs that simulate being real users to give “likes”, share messages or attack rivals. One person can manage thousands of automated accounts.
Fake account farms: physical offices, like the famous “troll farms” in Russia or the Philippines, where dozens of operators control hundreds of profiles at once.
Algorithmic segmentation: with data on age, location, interests and even mood, political messages are personalized. Example: an undecided voter is shown a “moderate” message; a radical voter, a harder line.
Deepfakes and manipulated content: videos created with AI that put words in someone’s mouth who never said them. The problem is not only that they deceive, but that they generate generalized distrust: we no longer know what is real.
Recommendation algorithms: platforms reward controversial content because it generates more interaction, which means that false, emotionally charged stories spread faster than true ones.
The result: noise, confusion and the illusion of massive support, even when it doesn’t exist.
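The engagement-driven ranking described above can be sketched in a few lines of code. This is a deliberately simplified toy model, not any real platform’s algorithm (those are proprietary and vastly more complex); the weights and post data here are hypothetical, chosen only to show how rewarding interaction mechanically pushes outrage-bait above measured content.

```python
# Toy model of an engagement-driven feed ranker (illustrative only;
# real recommendation systems are proprietary and far more complex).

def engagement_score(post):
    # Hypothetical weights: comments and shares (often driven by outrage)
    # count for more than passive likes, so controversial posts rank higher.
    return post["likes"] * 1 + post["comments"] * 3 + post["shares"] * 5

posts = [
    {"title": "Measured policy analysis", "likes": 120, "comments": 10, "shares": 5},
    {"title": "Outrage-bait rumor",       "likes": 80,  "comments": 90, "shares": 60},
]

# Sort the feed by engagement, highest first: the rumor, with fewer likes
# but far more comments and shares, ends up at the top of the feed.
feed = sorted(posts, key=engagement_score, reverse=True)
for p in feed:
    print(p["title"], engagement_score(p))
```

Note that the rumor “wins” (score 650 vs. 175) even though it has fewer likes: optimizing for interaction, rather than accuracy, is all it takes to amplify the most inflammatory content.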
🌎 Recent examples in the world
Chile: José Antonio Kast has been accused of using bot networks to attack his rivals and amplify his image. Although the evidence has been debated, academic reports have detected anomalous patterns in digital campaigns.
U.S.: Donald Trump’s case is paradigmatic. His 2016 campaign benefited from the Cambridge Analytica scandal, where data from millions of Facebook users was used to build psychological profiles and send personalized political messages. It was the first major demonstration that “data is the new oil” in politics.
Brazil: Jair Bolsonaro built part of his movement through WhatsApp and social networks, where chains of false news circulated without control. The phenomenon showed how closed platforms can also be weapons of manipulation.
India: one of the largest and most active countries on social networks. Political campaigns rely on digital armies to drive trends on Twitter (now X) and spread nationalist messages.
Israel and Palestine: in a context of war, information has become another weapon. TikTok, X and Meta have been accused of amplifying propaganda disguised as citizen opinion.
Russia and Ukraine: Russia has used troll farms to influence foreign elections and the war narrative. Disinformation is no longer limited to its territory: it seeks to shape world opinion.
🏛️ What governments do
Each country responds differently:
- Europe: with the AI Act and the Digital Services Act, it seeks to regulate AI and force platforms to be more transparent.
- U.S.: Congress questions tech giants in hearings, but unified regulation is still lacking, and the influence of lobbying is strong.
- Latin America: oversight is weak, which leaves room for uncontrolled digital campaigns. This makes the region fertile ground for manipulation.
- China: the opposite model, with total state control of information, where official propaganda dominates and digital dissent is censored at the root.
⚖️ The challenge ahead
The big question is no longer whether AI will be in politics, but how we ensure it doesn’t destroy trust in democracy.
Some steps are urgent:
- Transparency: platforms must show who finances political ads and how their algorithms work.
- Digital education: citizens need tools to detect manipulation and distinguish reliable sources.
- Global regulation: without international agreements, legal gaps allow digital manipulation to expand without limits.
- Ethics in AI use: parties and governments must commit not to use these tools to deceive, even though the temptation is great.
🗣️ Final reflection
From Julius Caesar to deepfakes, history repeats the same lesson: information is power.
What changed was the battlefield. No longer pamphlets or radios, but invisible algorithms that decide what we see and what we believe.
What’s at stake is not just who wins an election, but the quality of democracy itself. In a future governed by data and algorithms, we must ensure they work in favor of citizens and not a few with more power or money.
📚 Articles in English
“Gauging the AI Threat to Free and Fair Elections” (Brennan Center, March 2025). Explores how AI-driven disinformation, such as deepfakes and automated campaigns, threatens elections, and what safeguards are proposed to protect democracy. Full URL
“Bots, buzzers and AI-driven campaigning distort democracy” (East Asia Forum, July 2025). Analyzes the use of bots and AI tools in political campaigns in Asia, and how governments are implementing regulation to reduce harm and increase transparency. Full URL
“Russia seeds chatbots with lies. Any bad actor could game AI the same way.” (The Washington Post, April 2025). Reveals how Russia seeds chatbots with disinformation, a tactic known as “LLM grooming”, and how this can corrupt the informational foundations of AI systems. Full URL
Dare to imagine, create and transform.