They are even impersonating Andrzej Duda. Fall for it and you can lose your life savings
Andrzej Duda, Robert Lewandowski and other famous figures have appeared in videos circulating online, encouraging financial investments that were supposedly “exceptionally beneficial”. This is deepfake technology at work.
The images and voices of these celebrities were used in fake recordings, with technological manipulation making them appear authentic. Since the beginning of May, many new deepfake scams have emerged, and they are becoming more convincing every day.
What is a deepfake scam?
Frauds based on deepfake technology made headlines again in March 2024. Video recordings appeared on Facebook in which the image of the president of InPost was used to promote fraudulent investments. Despite Rafał Brzoska's swift reaction and his report to Meta's administrators, the platform did not respond to the businessman's protests for many days.
In the videos shared on Facebook, the fraudsters used a lip-sync technique: the spoken content was replaced with a fabricated version of the person's voice, and the lip movements were synchronized with the new words.
– Current technologies allow criminals to manipulate audiovisual material with ease. With “text-to-speech” technology, just a few seconds of a recorded voice are enough to create a new audio track that can be synchronized with video material, for example any speech or political address. “Speech-to-speech” technology, which also reproduces the intonation and emotions contained in the voice, requires a longer fragment of the original material, about a minute – explains Ewelina Bartuzi-Trokielewicz, head of the Deepfake Analysis Team at NASK.
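To illustrate how low the barrier has become, here is a minimal sketch of the kind of “text-to-speech” voice cloning the expert describes. It assumes the open-source Coqui TTS library and its XTTS v2 model; the model name, the reference clip “reference.wav” and the output path are illustrative assumptions, not the tools used in any of the scams described here.

```python
# A minimal sketch of "text-to-speech" voice cloning, assuming the open-source
# Coqui TTS library and its XTTS v2 model. A short reference clip of the
# target voice is enough to condition the synthesized speech.
from TTS.api import TTS

# Load a multilingual voice-cloning model (downloaded on first use).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Generate speech in the cloned voice; "reference.wav" is a hypothetical
# placeholder for a few seconds of the impersonated person's recorded voice.
tts.tts_to_file(
    text="Text to be spoken in the cloned voice.",
    speaker_wav="reference.wav",
    language="en",
    file_path="cloned_output.wav",
)
```

The sketch only underlines the expert's point: conditioning on a few seconds of reference audio is enough to produce a synthetic audio track, which can then be lip-synced to existing video footage.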
Since the beginning of May, many new deepfake scams have appeared on social media platforms. Criminals have illegally used the images and faked the voices of, among others, President Andrzej Duda, Minister of Health Izabela Leszczyna, footballer Robert Lewandowski, influencer Kamil Labudda (Budda) and businessman Rafał Brzoska, president of InPost. Anyone who is well known and commands authority within a particular social group, such as politicians, business people or celebrities, is at risk of image (and voice) theft.
It's hard to believe your own eyes today
Such fake materials are often presented in the format of a television broadcast, most commonly as part of a news program, which is meant to lend the content extra weight and credibility: people tend to trust information they see on TV. The use of this device shows how deepfake technology can be used to create elaborate, convincing, yet false narratives that seem authentic at first glance.
– The ease with which fake audiovisual material can be created increases the risk of manipulation and disinformation. It is therefore important to raise public awareness of how technologies for generating synthetic content are developing and what they can do. Social media users should be wary of video content that appears unverified or suspicious, especially if it has the potential to influence public perceptions of prominent figures and institutions – warns the NASK expert.