Over the weekend, Dr. Tony Youn, a plastic surgeon in Troy, Michigan, known for his widely followed social media presence, posted a shocking video exposing how artificial intelligence (AI) was used to create a deepfake of him promoting products he had never heard of. He isn’t the first medical professional to be victimized by AI deepfakes, and it appears he won’t be the last!
In his post, he made it clear: “…AI is crazy nowadays. Please do not buy any of these products from these scammers! I do brand deals occasionally…but they will be from this account. Please report any obvious scams…”
This alarming incident highlights a growing problem in aesthetic medicine and beyond: AI deepfakes are being used to exploit the likenesses of trusted medical professionals, misleading the public and damaging reputations.
How Deepfakes Are Targeting Plastic Surgeons and Medical Professionals
Deepfake technology uses AI to manipulate videos, images, and voices, making it appear as though someone is endorsing a product or service they have never actually supported. In Dr. Youn’s case, scammers used AI to make it seem as if he was promoting a feminine hygiene product—something entirely unrelated to his actual work as a plastic surgeon.
Plastic surgeons and medical doctors are prime targets for these scams because they are trusted authorities in health and aesthetics. Scammers use their likenesses to lend credibility to health, wellness, and skincare products, fooling hundreds or even thousands of people into purchasing them online. This hurts your followers, your patients, and the public at large.
Fraudulent AI-generated endorsements can deceive patients, erode trust, and, if not addressed promptly, damage a doctor’s online reputation.
Your legitimate brand deals can also suffer as social media users grow more suspicious of online videos. A genuine endorsement for a product you love could see a drop in sales simply because your followers no longer trust what they see by default.
This speaks to a larger problem of consumer trust and reliability of internet content in general, which is a tangent we won’t go down today.
The Growing Threat of AI Deepfakes in Aesthetic Marketing
AI deepfakes are becoming more convincing all the time. While fake video is still relatively easy to spot, fake images and audio can be remarkably realistic, and scammers can also use your likeness in other, less sophisticated ways. Here are some of the ways deepfakes hurt you, your fans, and your patients.
- Fake Endorsements – AI-generated videos and voiceovers are being used to falsely associate doctors with beauty and wellness products they have never approved, leading patients and followers to waste money on bogus purchases.
- Misinformation & Scams – Patients may fall for AI-generated ads, taking health advice that a physician would never give to anyone, such as avoiding sunscreen.
- Reputation Damage – Having your name and likeness misused can lead to confusion, distrust, and potential legal battles to clear your name. These scams can also undermine your legitimate brand deals in the future as your audience becomes primed to suspect fraud.
How Plastic Surgeons and Digital Marketers Can Combat Deepfake Scams
There is no way to protect yourself completely from being turned into an AI avatar for a scammer’s use. If your photos and videos exist on the internet, unethical AI image and video tools can use them. If your voice is out there, scammers can run it through that same technology and clone it. However, there are steps you can take.
- Monitor Your Online Presence: Regularly search for your name on social media and Google to catch any fake content early. You can set up a reputation monitor to alert you when you’re mentioned online. PR companies can do this for you.
- Educate Your Audience: Follow Dr. Youn’s lead—warn your followers about AI scams and clarify where they can find legitimate information about your practice and brand. You can do this before you’re ever victimized by AI and get ahead of it, too.
- Report and Take Legal Action: Report deepfake scams to social media platforms, the Federal Trade Commission (FTC), and legal professionals if necessary.
- Use Watermarks: Digital marketers can help by adding watermarks to your content to make it harder for scammers to steal and replicate, but this will only get you so far.
- Stay Informed on AI Regulations: As AI regulations evolve, staying informed will help protect doctors and their brands from the unethical use of this technology.
Final Thoughts on AI, Deepfakes, and Protecting Your Reputation Online
AI deepfakes are more than just an online nuisance; they’re a serious threat to medical professionals and their patients. Dr. Tony Youn’s experience is yet another wake-up call for plastic surgeons, dermatologists, and other healthcare providers to take proactive steps in safeguarding their reputations. Digital marketers must also play a role in identifying, preventing, and reporting fraudulent content to protect both practitioners and consumers.
Have you or someone you know been targeted by AI deepfakes? Have you been able to spot deepfakes in action? Let us know!
Looking for help with digital marketing for your practice? Contact us.