Technology becomes dangerous when trust is its raw material.
Warsaw, January 2026.
A video that appeared to show Poland’s president promoting a Bitcoin investment platform spread rapidly across social networks and messaging apps. Within hours it attracted thousands of views, comments and private messages from users asking how to invest. The problem was simple and severe: the video was false. It was a deepfake, generated through artificial intelligence using fragments of real footage and synthetic voice technology to fabricate a message that the president never delivered.
The original footage used in the manipulation came from a public address focused on constitutional powers and veto authority. In the deepfake version, the words were replaced with an artificial script praising a trading platform and promising automated profits. Visual analysts identified inconsistencies between lip movement and sound, along with digital artifacts typical of AI-generated video. Fact-checking groups in Poland confirmed that neither the president nor his office had any connection to the promoted platform. What looked authentic was, in fact, a carefully engineered illusion.
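The mismatch between lip movement and sound that analysts look for can be illustrated in miniature. The sketch below is a toy, not a real forensic tool: production systems rely on learned audiovisual features, while this simply correlates a hypothetical per-frame mouth-openness signal with audio energy (all values invented for illustration). When the audio track has been swapped, the two signals stop moving together.

```python
# Toy sketch of audio-visual sync checking. Real deepfake detectors use
# learned features from neural networks; this only shows the core idea
# that dubbed-in speech decorrelates mouth motion from audio energy.

def pearson(xs, ys):
    """Pearson correlation between two equal-length numeric sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

# Hypothetical per-frame signals (values invented for this example):
mouth_openness = [0.1, 0.8, 0.7, 0.2, 0.9, 0.1]     # from the video track
audio_energy_real = [0.2, 0.9, 0.8, 0.1, 0.8, 0.2]  # matching speech
audio_energy_fake = [0.9, 0.1, 0.2, 0.8, 0.1, 0.9]  # dubbed-in script

print(round(pearson(mouth_openness, audio_energy_real), 2))  # high, near 1
print(round(pearson(mouth_openness, audio_energy_fake), 2))  # strongly negative
```

A low or negative correlation is exactly the kind of "inconsistency between lip movement and sound" that tipped off the analysts, though real detection also weighs compression artifacts, lighting, and temporal glitches.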
According to Europol, financial fraud linked to digital manipulation has risen sharply in the past two years. Europol has warned that criminals increasingly combine cybercrime with social engineering, borrowing the faces of authority figures to bypass skepticism. In this case, the image of a sitting president was used as a credibility shortcut. Viewers were not asked to trust a stranger. They were asked to trust the state, or at least its most visible symbol.
The financial dimension of the scam follows a known pattern. Victims are directed to a polished website, asked to register quickly, and encouraged to deposit small sums that appear to grow. Once larger deposits are made, access disappears. Consumer protection agencies in several European countries report that losses from fake crypto platforms now reach hundreds of millions of euros annually. The Polish case fits into a continental trend rather than an isolated incident.
From Asia, analysts at the Singapore-based Centre for Strategic Futures have noted that deepfakes are becoming cheaper and faster to produce. What once required expert studios can now be done with commercial software and minimal training. They warn that political figures are especially vulnerable because abundant public footage makes model training easy. The more visible a leader is, the easier it becomes to imitate them.
In North America, the Federal Bureau of Investigation has also warned that criminals are using voice cloning and synthetic video to impersonate executives, officials and even family members. The FBI has linked several large fraud cases to AI-generated impersonation. The technique relies less on perfect realism and more on speed. By the time doubt appears, money is often already gone.
The Polish deepfake also has political implications. When citizens see a leader apparently endorsing risky financial schemes, trust in institutions erodes. Even after debunking, the emotional impact remains. Some viewers remember the image but forget the correction. Disinformation does not need to convince everyone. It only needs to confuse enough people to weaken confidence.
According to the Organisation for Economic Co-operation and Development, trust is a central asset in modern governance. When public communication becomes unreliable, cooperation with policy collapses. Deepfakes therefore do not only steal money. They steal certainty. They turn every video into a question mark and every statement into a possible trick.
Poland is not alone. Similar deepfake scams have used the faces of presidents, prime ministers, central bank heads and celebrities across Europe, Africa and Latin America. In some African countries, manipulated videos of finance ministers were used to promote fake gold and crypto schemes. In parts of Latin America, mayors and governors were impersonated to solicit donations for disasters that never happened. The pattern is global, even if the faces are local.
Regulators are struggling to keep up. The European Commission has proposed rules on artificial intelligence that include transparency and labeling obligations. However, enforcement remains slow and technical detection tools lag behind creative manipulation. Platforms remove content once identified, but the first wave of damage often cannot be reversed.
Technology companies argue that they are investing in detection systems. Some propose digital watermarking for authentic video. Others promote content origin tracking. Yet experts warn that no technical solution will be enough by itself. Criminals adapt faster than rules, and software evolves faster than law.
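Content-origin tracking generally works by binding a cryptographic tag to the exact bytes of a file, so that any later tampering breaks verification. The sketch below is a deliberately simplified stand-in: real provenance standards such as C2PA use public-key signatures and embedded manifests, not the shared secret assumed here.

```python
import hashlib
import hmac

# Toy illustration only. Real provenance schemes (e.g. C2PA) use
# public-key signatures; the shared key below is an assumption
# made purely to keep the sketch self-contained.
PUBLISHER_KEY = b"hypothetical-publisher-secret"

def sign_video(video_bytes: bytes) -> str:
    """Produce a provenance tag the publisher attaches to the file."""
    return hmac.new(PUBLISHER_KEY, video_bytes, hashlib.sha256).hexdigest()

def verify_video(video_bytes: bytes, tag: str) -> bool:
    """Check the tag; any edit to the bytes breaks the match."""
    return hmac.compare_digest(sign_video(video_bytes), tag)

original = b"presidential address, authentic footage"
tag = sign_video(original)
print(verify_video(original, tag))                          # True
print(verify_video(b"deepfake with synthetic audio", tag))  # False
```

The design choice matters: verification proves only that the file is unmodified since signing, which is why experts caution that watermarking helps authenticate real footage but cannot, by itself, flag fakes that were never signed.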
Education therefore becomes critical. Consumer protection agencies now advise citizens to treat any unexpected investment offer with suspicion, especially if it uses famous faces. Banks and financial regulators emphasize that no public official will ever promote a private trading platform. If authority appears to sell profit, it is almost always a lie.
The deeper danger is cultural. When people stop believing what they see, they may also stop believing what is true. Deepfakes do not only create false messages. They create a world where all messages are questionable. That uncertainty benefits criminals, extremists and manipulators.
In Poland, authorities moved quickly to deny the video and warn the public. Yet they also admitted that damage was already done. Some citizens had already deposited money. Others had shared the clip in good faith. The state could correct the record, but it could not rewind emotion.
This episode shows how artificial intelligence has shifted from tool to weapon. Not a weapon of war, but a weapon of trust. It attacks the bond between image and reality, between voice and truth. And once that bond is weakened, every society becomes easier to deceive.
The question is no longer whether deepfakes will be used for crime. That is already settled. The real question is whether institutions, platforms and citizens can adapt faster than those who exploit them. If they cannot, the next target will not just be money. It will be belief itself.
Truth is structure, not noise.