
AI Deepfake Scam Tricks Woman Into Believing She Was Dating General Hospital Star Steve Burton



In a shocking tale that highlights the dangers of artificial intelligence abuse, a Southern California woman lost more than $80,000, and later the proceeds from the sale of her family home, to criminals using AI-generated videos and voice cloning to impersonate General Hospital actor Steve Burton. This unsettling incident has become a cautionary example of how deepfake technology is being weaponized to manipulate and exploit vulnerable individuals online.


AI Scammers Use Steve Burton Deepfake to Launch Elaborate Romance Scam

The scam began in October 2024 when Abigail Ruvalcaba was contacted on Facebook Messenger by someone claiming to be Steve Burton. The scammer used hyper-realistic AI-generated videos and voice technology to create a convincing version of the popular soap star. Abigail, a longtime fan of the General Hospital actor, believed she was in a private romantic relationship with Burton.


The impersonator gradually built trust and affection, eventually claiming to be in financial distress due to the California wildfires. Using emotional manipulation, they convinced Abigail to send money through various methods, including cash, gift cards, and cryptocurrency.



Victim Loses Life Savings in Heartbreaking AI Scam

Over several months, Abigail sent more than $81,000 to the scammer. In a tragic turn, she also sold her family’s Harbor City condo, forwarding an additional $350,000 to the fraudster. By the time her daughter intervened, Abigail was in deep financial trouble, facing potential bankruptcy and homelessness.


The scam was so convincing that even Steve Burton himself acknowledged the voice in the fake videos sounded exactly like his own. He issued a public warning stating he would never ask fans for money and encouraged everyone to stay vigilant.


What This Scam Reveals About the Future of AI Crime

This case underscores the rapidly growing threat of AI-driven deepfake scams. Experts warn that with easily accessible AI tools, creating a realistic impersonation can take as little as 15 minutes. As these tools become more sophisticated, so do the scams.


Red flags include unsolicited celebrity messages, urgent financial requests, and quick transitions to encrypted messaging apps like WhatsApp or Telegram. Cybersecurity experts urge users to verify identities and never send money to individuals met online without thorough vetting.


Stay Safe in the Age of AI Scams

The heartbreaking story of a woman scammed by an AI version of General Hospital actor Steve Burton is a wake-up call. As artificial intelligence becomes more integrated into everyday technology, so does the potential for misuse. Stay informed, verify suspicious claims, and educate loved ones about the risks.


If you or someone you know suspects an AI impersonation scam, report it to local authorities and online platforms immediately. Let this story be a powerful reminder of how important digital literacy and caution are in the AI age.
