Kingston Police warn of new AI-based phishing scams

Kingston Police Headquarters on Division Street. Photo by First Response Media.

Kingston Police have issued a release warning the public of voice phishing (vishing) incidents created using artificial intelligence (AI).

“It’s no secret that cybercriminals are using AI technology to craft phishing emails, but did you know AI can also help them with voice phishing (vishing)?” Kingston Police asked.

“It’s surprisingly easy to teach AI software to sound like a specific person. All they need to recreate your voice is a short audio clip, like one from a recorded phone call or a video posted to social media. Once the cybercriminals have your voice, they can easily target friends, family members, and coworkers with AI-powered vishing.”

In the release, police noted that cybercriminals often use this tactic to impersonate managers and executives of an organization.

“In this scam, you receive an unexpected call from somebody in upper management asking you to help with an urgent project. The voice will direct you to wire money to a vendor in order to meet a looming deadline. Of course, if you follow their directions, you’ll actually be wiring money to the cybercriminals instead,” police explained.

Follow these tips, offered by police in the release, to stay safe from AI-powered vishing attacks:

  • When you receive an unexpected message, verify it by contacting the person through a trusted channel. You can use a phone number you have on file, an official email address, or a messaging system like Teams.
  • If you’re speaking to the caller directly, ask questions that would be difficult for an imposter to answer correctly.
  • Even if the request is urgent, stop and think before you take action. Ask yourself questions like: Is this something in my job description? Or is there a procedure this person should follow?
