Where do they get their ideas?

I am often surprised when I hear about different fraud schemes. Trying to figure out where fraudsters get their ideas is mind-boggling. For example, in a recent fraud scheme in Halifax, the victim would be contacted by an individual posing as a bank employee. The victim would be told that their credit card had been compromised and that an employee was on the way to collect the card. The victim was pressured to hand the card over to the "employee" or risk not being reimbursed for the charges made on it. The fraudster would then use the card to commit fraud.

This scheme is littered with red flags, yet the fraudsters still convinced victims that they were legitimate bank employees. A bank may call to tell you that there is an issue with your card, but a bank will never send an employee to pick up a compromised card. The pressure applied by the fraudster is another red flag; don't fall for it. If your card has been compromised, the first thing you need to do is lock it or destroy it yourself. Do not cave in to pressure from anyone to hand over your card.

Some may say that because this scheme happened in Halifax, it won't happen here. But the news article ran in late May 2024, and anyone intent on committing fraud who reads it may decide that, with a few tweaks, they could get away with a similar scheme elsewhere. That's why we need to know what schemes are being used to commit fraud, so that we recognize them when the same or similar schemes are used on us.

AI Voice Fraud

Artificial Intelligence (AI) schemes are on the rise. As with the example above, fraudsters see other bad actors using AI successfully and adopt similar ideas. AI is being used to develop more persuasive scripts and synthetic voices to convince victims that a call is authentic. It doesn't take much to mimic someone's voice anymore, just a few short sound bites posted online. This is why it is a good idea to have a code word that only family and close friends know, one that can be used in times of trouble. Think of it as a proof-of-truth word: the call isn't authentic if the person on the other end of the phone doesn't know it.

It's important to remember that not all AI schemes are aimed at seniors. The technology used to mimic the sound and intonation of someone's voice could be used in employment schemes, immigration schemes, or grandparent schemes. The scripts may differ, but the ideas are the same. When these calls come in, we have to be skeptical. We cannot bow to the pressure the fraudster wants us to feel. Ask for proof that the caller is who they claim to be. By exercising caution and demanding proof, we can take control and protect ourselves from falling victim to these schemes.

Kathleen O’Donoghue, CFE
