Romance scams used to feel like a cliché: a poorly written email from an overseas “prince,” riddled with typos and pleas for cash. That cliché is dead.
Today’s romance scams are industrial-scale operations. Attackers use artificial intelligence to clone voices, create deepfake video calls, and write scripts with large language models (LLMs).
The Federal Trade Commission reported that financial losses to romance scams have skyrocketed: victims lost $1.14 billion in 2024 alone. The real number, hidden by shame and silence, is likely triple that.
Romance scams aren’t just a tragedy for the victims. A successful scam is a massive risk for businesses, too. When an employee with access to sensitive data or funds is compromised, the “heartbreak hack” can harm an entire organization.
What Today’s Romance Scams Look Like
Phase 1: Contact. Romance scams often start on dating apps—but they’re also prevalent on Instagram, Facebook, and LinkedIn—with a seemingly innocent message. These scams aren’t necessarily about love; they’re about establishing trust.
For example: “Is this Alex? We met at the conference last week,” or “Sorry, wrong number, but your profile photo is lovely.”
The goal is to continue the conversation on an encrypted app, such as Telegram or WhatsApp, where traditional security measures can’t monitor conversations. Once contact is established, the manipulation becomes emotional.
Phase 2: Love bomb. Over weeks or months, the scammer builds intimacy. They’ll share mundane details, such as photos of their dog or personal struggles.
With today’s AI upgrade, LLMs can craft empathetic responses that echo the details a victim has shared, deepening the illusion of intimacy. Eventually, the relationship is leveraged for financial gain.
Phase 3: Pivot. Once trust is established, the conversation pivots. The scammer doesn’t ask for a plane ticket or emergency money. They talk about success.
They might say, “My uncle has an exclusive crypto trading algorithm.” They offer to “teach” the victim how to invest, showing massive (yet fake) returns on a legitimate-looking app. Then the victim invests large sums of money.
What makes these scams especially dangerous is that old warning signs no longer apply.
When the Bot Flirts Back
We used to say, “If they won’t video call you, it’s a scam.” That advice is now obsolete.
In deepfake video calls, for example, scammers use real-time face-swapping technology. On your screen, the person moves, blinks, and smiles, wearing the face of the stolen identity. While the tech is good, it’s not perfect. Tip: Look for blurring around the neck and hairline or glitches when they pass a hand in front of their face.
In voice cloning, scammers send voice notes that sound exactly like the person in the photos. Free AI tools now require less than 10 seconds of audio to clone a voice with 85% accuracy, enabling voicemails that reinforce the illusion that the persona is real.
Organizations Need to Pay Attention
You might be thinking, “Why is this a CISO’s problem?”
Take the now-former CEO of Heartland Tri-State Bank, who fell victim to such a scam. Convinced he was investing in a crypto opportunity for his “friend,” he embezzled $47 million of the bank’s funds, leading to the bank’s total collapse and a 24-year prison sentence. Had the bank’s chief information security officer had visibility into what was happening, the fraud might have been caught and stopped before it destroyed the institution.
The corporate blast radius takes three main forms:
- Embezzlement: Employees with access to payroll or wire transfers may “borrow” company funds, believing they’ll pay the money back once their “investment” clears.
- Sextortion and blackmail: Scammers typically encourage victims to share intimate images. Once they have them, those images become leverage.
- BYOD malware: The “trading app” the victim installs is often sophisticated malware that gives the attacker backdoor entry. If that device connects to your corporate network, the attacker is inside.
How to Stop a Romance Scam
Defending against romance scams requires recognizing patterns, both in the attacker’s infrastructure and in the psychology of influence. Here are three defenses.
- Watch for the vibe shift: If a romantic interest mentions cryptocurrency, foreign exchange (forex) trading, or “nodes” within the first few weeks, treat it as a near-certain sign of a scam. If they’ve been patient for months but suddenly an opportunity is closing quickly, that’s manufactured urgency designed to bypass critical thinking.
- The “specific action” test: Get on a video call and ask the person to do two things. First, ask them to turn their head fully to one side so you see a complete profile. Deepfake models often struggle with extreme angles and facial expressions, and the face can glitch. Second, ask them to wave a hand in front of their face or behind their head. The AI often gets occlusion wrong, confusing which object is in front, and the face distorts.
- Move beyond awareness training: Social engineering defense used to be treated as a training problem, measured by click rates and phishing simulations. But modern attacks go beyond inboxes, and they don’t wait for employee mistakes.
Today’s most damaging campaigns use impersonation across email, messaging platforms, and social media, often targeting trusted relationships. Defense requires early detection of impersonation and coordinated disruption, supported by human risk management practices that teach employees how attacks like romance scams begin and escalate.
Trust, But Verify
The line between personal life and corporate risk has all but disappeared. When an employee or executive is emotionally compromised, so is the organization. Human intuition alone can’t win a fight against AI-powered psychological warfare.
The heart will always be a vulnerability, and in the age of AI, it’s also an attack vector. Romance scams prove that attackers don’t need to break a firewall; they just need to break a heart. It’s time to defend with rigor.
