Courier-Enabled “Phantom Hacker” Scams
The FBI has issued a warning about a criminal scheme that preys on the fear of hackers to trick victims out of thousands of dollars. The scam follows a pattern: first, the scammers call the victim pretending to be a tech company, perhaps Amazon, wanting to verify activity on accounts in the victim’s name. This call convinces the victim that their identity has been stolen, but it’s only phase one of the scammers’ plan. Next they call pretending to be a US government official, and ratchet up the urgency and the pressure. The victim is warned that to protect their assets from digital looters, they should pull as much of their wealth as possible out of the digital realm by converting it to gold or other precious metals. Then the scammers send a courier to pick up the metals, promising that the courier will deposit them somewhere nice and secure, and that the victim can have them back on request. Needless to say, the promised vault is like the big farm upstate where goldfish go after they’ve learned to swim upside down: a one-way vacation.
The financial advice columnist over at The Cut fell for this scam and wrote it up in detail. Her account conveys how remarkably slick and convincing the scammers can be, even with what seems, from a distance, to be an outrageous story.
The Bottom Line: No legitimate government agent will ever tell you to drain your bank accounts and hand the money over to them. Be wary of any unsolicited phone call, especially one that presses an extremely urgent matter. In addition to impersonating government agents, this scam depends on an exaggerated fear of hackers and their capabilities, rooted in Hollywood myth. While it is certainly possible for a hacker to open a new line of credit using your stolen identity, credit monitoring services will notice if one does.
Warning: Fraudulent Activity on Your Account
The scam is fairly simple: the scammer calls to tell you that there has been suspicious activity on your bank account and they need to lock your debit card. The call appears to come from your bank’s phone number. They don’t ask for your whole card number (obviously unsafe), nor do they ask for the final four digits (which would be safe to give them). Instead they ask for just enough digits to get you into trouble: most of your debit card number is determined by which type of card it is and which bank issued it, so they only need the rest. Once they’ve got your card number, they race off on an internet spending spree. It looks simple, but bona fide internet legend Cory Doctorow fell for this scam over the holidays. He then turned it into an excellent blog post that’s well worth a read, since it’s both entertaining and educational.
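To see why a handful of digits is enough, consider the structure of a card number: the first several digits (the issuer identification number) are public knowledge once you know the bank and card type, and the final digit is a Luhn checksum computable from all the others. Here is a minimal sketch of the Luhn check in Python (the function name is ours, for illustration):

```python
def luhn_valid(card_number: str) -> bool:
    """Return True if card_number passes the Luhn checksum (ISO/IEC 7812)."""
    digits = [int(d) for d in card_number]
    # Starting from the second-to-last digit, double every second digit,
    # subtracting 9 whenever the doubled value exceeds 9.
    for i in range(len(digits) - 2, -1, -2):
        digits[i] = digits[i] * 2
        if digits[i] > 9:
            digits[i] -= 9
    # The number is valid when the resulting digit sum is a multiple of 10.
    return sum(digits) % 10 == 0

# The classic Visa test number passes; flip one digit and it fails.
print(luhn_valid("4111111111111111"))  # True
print(luhn_valid("4111111111111112"))  # False
```

Because the issuer prefix is public and the check digit is derivable, an attacker who already knows your bank and card type needs surprisingly few of the remaining digits to reconstruct the whole number.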
To pull off this scam, the scammer needs to know your phone number, your name, which bank you use, and what type of debit cards the bank issues (Visa, Mastercard, etc.), and they need to be able to make their call look like it’s coming from the phone number your bank would use. That’s a lot of information, and it likely takes a fair amount of effort to gather for each potential victim. Scammers will take the time, but they often don’t have to: a single leak of customer data from a credit union or healthcare provider may supply everything they need.
The Bottom Line: This event emphasizes our recommendation number 2: never give out information to someone who calls you, no matter what they say. Call them back at the official number found on the company’s website.
Romance Scams in the Air
Romance scams were responsible for $1.3 billion of grift in 2022 alone, reports the U.S. Federal Trade Commission (FTC). Scammers constantly evolve their techniques, but the FTC has analyzed 8 million romance scam reports submitted to it to produce a very helpful writeup of what romance scams look like and how to avoid them. The cruel affair usually starts with a fake persona on social media, in dating apps, or any other place where you might expect to meet a stranger online. It progresses to romance of the digital variety: playful messaging, words of kindness, details about a family and life, all fake. Human connection is the bait. The hook comes on the end of a set of fairly predictable lines. As explained in the FTC’s report, here are modern scammers’ favorite lies:
- I or someone close to me is sick, hurt, or in jail
- I can teach you how to invest
- I’m in the military far away
- I need help with an important delivery
- We’ve never met, but let’s talk about marriage
- I’ve come into some money or gold
- I’m on an oil rig or ship
- You can trust me with your private pictures
Some of these lines lead by fairly obvious paths straight to theft. We’ve talked about “I can teach you how to invest” before when we’ve discussed pig butchering. If you’re sending someone bond money because they’re “in jail” when they are not, then it’s clear how the grift works. Remember that the line doesn’t come right away: they wait until they’ve made a firm and trusting connection. Other lines are a little less obvious: pretending to be on an oil rig or ship or in the military is a prelude to eventually claiming that they don’t have access to their own bank accounts because of some semi-plausible bureaucratic mess. They’ll ask for your help buying their (fake) kid a birthday present, or paying the customs fees on a package, or any of a hundred different things somebody stuck far from home might need help with.
The private pictures are a special case. Compromisingly private pictures are used as blackmail material in a racket called sextortion, which mostly targets people under 30 on Instagram and Snapchat.
The Bottom Line: Verify identities when engaging with strangers online. A few techniques can help: reverse-search their profile pictures on Google Images to see if they’ve been stolen from somewhere else; try a phone number lookup on their number; and, most importantly, insist on a video call. Anyone who asks you to wire them money or read them the numbers off a gift card you’ve purchased is a scammer.
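A full reverse phone lookup requires a lookup service, but even a quick format check can flag an obviously fake number. Here is a toy sketch for US/Canada (NANP) numbers; the pattern and function name are ours, for illustration, and this only checks the number’s shape, not who owns it:

```python
import re

# NANP numbers: optional +1 prefix, then a 3-digit area code and 3-digit
# exchange (neither may start with 0 or 1), then a 4-digit line number.
NANP = re.compile(r"^\+?1?[\s.\-]?\(?([2-9]\d{2})\)?[\s.\-]?([2-9]\d{2})[\s.\-]?(\d{4})$")

def looks_like_nanp(number: str) -> bool:
    """Return True if the string is shaped like a valid US/Canada number."""
    return NANP.match(number.strip()) is not None

print(looks_like_nanp("+1 415 555 2671"))  # True: well-formed
print(looks_like_nanp("123-456-7890"))     # False: area code can't start with 1
```

A well-formed number proves nothing on its own, of course — caller ID can be spoofed — so treat this as one small filter alongside the techniques above.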
Deep Fakes on the Rise: Taylor Swift, Joe Biden, Your Boss
This month, generative AI was used in three high-profile scams. Sexually explicit images of celebrity Taylor Swift were generated and circulated on X (formerly Twitter). There was no financial goal, just cruelty. X tried to crack down on the images by blocking searches for Taylor Swift by name, but its efforts were ineffective. AI was also used to create fake audio of President Joe Biden urging voters in New Hampshire not to vote for him in the state primary. The calls were linked to a company in Texas and prompted the FCC to rule that robocalls using deep-faked audio are illegal. Finally, in a world first, an employee of a company in Hong Kong was convinced to wire $25 million to scammers who used deep fake video of his boss and other company employees in a video conference call to convince him the wire transfer was authorized.
The Bottom Line: AI-generated deep fake images, audio, and video are here. They work. They’re convincing. It is now possible to generate images, audio, or video of almost anyone doing almost anything, though the programs need a lot of source material to train on, so they work best against public figures. Of the three, video is the most difficult for the programs to produce, so insisting on a video call is still the best way to be certain you’re talking to a real person. In a video call, pay attention to the quality of the image: deep fake programs still tend to produce video that looks low-resolution, akin to a Zoom call over a poor connection.