Dr. Michał Sutkowski fell victim to fraudsters who used his image to advertise fake cardiac drugs and preparations that supposedly fight parasites. He has reported the crime to the prosecutor’s office.
The Sowisło Topolewski Law Firm of Attorneys and Legal Advisors SKA reported the fraud involving the image of Dr. Michał Sutkowski. Its announcement describes exactly how the fraudsters operated.
Scammers advertise preparations with the image of Dr. Sutkowski
“Dr. Sutkowski fell victim to image theft: his face appeared in advertisements for pseudocardiac drugs called Acardin and for alleged anti-parasite preparations. Using the doctor’s face, fictitious interviews were created with so-called deepfake technology,” the announcement reads.
“Fraudsters whose identity has not yet been established have created websites that are confusingly similar to those run by the editorial offices of TVP Info and TVN. Moreover, these websites contain fictitious interviews allegedly conducted with Dr. Sutkowski,” the doctor’s lawyers further inform.
Dr. Sutkowski reminds: Do not take medications without medical supervision
Despite the report of a suspected crime, the false advertisements are still being spread on social media. The law firm representing Dr. Sutkowski warns that they may pose a threat to the health and life of people deceived in this way.
“It is particularly important to increase public awareness of this type of fraudulent practices. Taking preparations of unknown origin advertised on the Internet, without medical supervision, endangers the health and life of patients,” the statement emphasized.
Dr. Michał Sutkowski has also issued a separate statement. On his YouTube channel, he stressed that he has nothing to do with the products advertised with his face, and reminded viewers that under the law no doctor may advertise such “medicines”. “Ladies and gentlemen, this is image theft. Image theft, of which I am a victim,” he emphasized, warning viewers about the dangers of trusting this type of advertising.
What is deep fake technology?
Deepfake is an image-manipulation technique that uses artificial intelligence and machine learning to superimpose moving or still images onto source photos or videos. As a result, a video can be created “featuring” a person who had nothing to do with the recording, and viewers may be unable to tell the fake from the original.
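To give a feel for the “superimposing” idea at its very simplest: real deepfake systems use neural networks to decide where and how to merge a generated face into a target frame, but the final compositing step boils down to blending pixel values from two images. The following is a toy sketch of that blending step only (it involves no machine learning; images are represented as plain nested lists of grayscale values for illustration):

```python
def alpha_blend(source, overlay, alpha):
    """Composite an overlay image onto a source image.

    `source` and `overlay` are same-sized nested lists of grayscale
    pixel values (0-255); `alpha` is the overlay opacity in [0, 1].
    This is only the pixel-mixing step; actual deepfake pipelines
    generate the overlay itself with trained neural networks.
    """
    return [
        [round((1 - alpha) * s + alpha * o) for s, o in zip(src_row, ov_row)]
        for src_row, ov_row in zip(source, overlay)
    ]


# A 2x2 "source frame" half-covered by a 2x2 "overlay" at 50% opacity.
source = [[0, 100], [200, 255]]
overlay = [[255, 255], [0, 0]]
print(alpha_blend(source, overlay, 0.5))  # → [[128, 178], [100, 128]]
```

A convincing fake requires far more than this, of course: face alignment, lighting and color matching, and frame-by-frame temporal consistency, all of which is what the machine-learning components automate.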
The technology is not inherently harmful. Material created with it can have entertainment or educational value: a museum, for example, could produce a film “featuring” a deceased artist who tells visitors about the secrets of their art.
Above all, however, it is a powerful tool for cybercriminals who want to create fake news and fabricated videos and photos that defame other people. Such recordings can be used to blackmail victims or destroy their reputation.