By Cindy Immonen, NTP, CLTP

The FBI reports that direct financial losses caused by business email compromise (BEC) and email account compromise (EAC) have surpassed $12.5 billion worldwide! The frequency of these attacks continues to grow at staggering rates, impacting more than 90 percent of organizations and affecting nearly 1 percent of all emails sent. More alarming still, the sophistication behind these attacks continues to rise, making phishing emails nearly indistinguishable from legitimate messages to novices and IT professionals alike.
Fraudster impersonations already steal millions from consumers each year. But what if a fraudster could use technology to clone the voice of a real person? YIKES! Sophisticated fraudulent techniques have previously included:
The Federal Trade Commission (FTC) examined voice-cloning technologies that enable users to make near-perfect reproductions of a real person's voice. Advances in artificial intelligence and text-to-speech (TTS) synthesis now allow researchers to create a convincing voice clone from less than a five-second sample recording of a person's voice.
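To make the five-second claim concrete, here is a minimal sketch of the first step in few-shot voice cloning: extracting a speaker embedding from a short clip. It assumes the open-source Resemblyzer library and hypothetical audio file names; the FTC's examination does not reference any particular tool.

```python
# A minimal sketch, assuming the open-source Resemblyzer library
# (pip install resemblyzer); the FTC does not name any specific tool.
from pathlib import Path

import numpy as np
from resemblyzer import VoiceEncoder, preprocess_wav

# Load and normalize a short recording. Even ~5 seconds of clean speech
# is enough to produce a usable speaker embedding.
wav = preprocess_wav(Path("five_second_sample.wav"))  # hypothetical file

# The encoder maps variable-length audio to a fixed 256-dimensional
# "voiceprint". Few-shot TTS cloners condition their synthesizer on
# exactly this kind of vector to imitate the target voice.
encoder = VoiceEncoder()
embedding = encoder.embed_utterance(wav)
print(embedding.shape)  # (256,)

# The same math cuts both ways: comparing embeddings of a known-genuine
# sample and a suspect call is the basis of speaker verification.
genuine = encoder.embed_utterance(preprocess_wav(Path("known_genuine.wav")))
similarity = float(np.dot(embedding, genuine))  # embeddings are L2-normalized
print(f"cosine similarity: {similarity:.2f}")   # near 1.0 = likely same voice
```

The point is how little raw material an attacker needs: a single voicemail greeting or conference-call clip can yield the voiceprint a cloning system builds on.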
According to Laura DeMartino, associate director in the FTC's division of litigation technology and analysis, "People have been mimicking voices for years. But in the last few years, the technology has advanced to the point where we can clone voices using a very small audio sample." Voice-cloning (or "deepfake") techniques that generate near-perfect reproductions of a person's voice let criminals communicate anonymously, making scams much easier to pull off. In the past it was difficult to convincingly pose as someone else; with deepfakes, a criminal can impersonate almost anyone, anywhere in the world. In one successful scam, an unnamed UK-based energy firm was defrauded of about $243,000 when someone spoofed the CEO's voice to authorize a fraudulent wire transfer. YIKES!!!
The FTC has examined speech-synthesis technology that reproduces the voice of an actual person, and it has expressed ethical concerns about how this cloned-voice technology may be used. Voice cloning is being developed and deployed for healthcare and consumer-oriented applications (customer service, entertainment, etc.). Some of the most impressive examples to date come from Dessa, a Toronto-based AI research firm, and from Facebook, both of which have used the technology in successful awareness campaigns.
But the technology is also being used for fraudulent schemes! FTC examiners have stated that deepfake voices will be, and already have been, used without consent. The cybersecurity firm Symantec said last September that it had come across at least three cases of deepfake voice fraud. Experian predicts that criminals this year will use all forms of artificial intelligence to disrupt commercial enterprise operations and create "confusion" among nations, using whatever tools enable them to achieve their objectives.
While the cheap technology is imperfect, and some faked voices wouldn't fool a listener in a calm, collected environment, how often are we actually in one? Thieves pair the new technology with age-old scam tactics to boost its effectiveness, most commonly time pressure, such as an impending deadline. In some cases, criminals have targeted the financial gatekeepers in company accounting or budget departments, knowing those employees may have the ability to send money instantly.
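The defense against that pressure can be summed up in a few lines of logic. The sketch below is purely illustrative, with hypothetical names and rules rather than a prescribed control: no wire is released on the strength of a voice or an email alone, and urgency triggers more scrutiny, not less.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class WireRequest:
    requester: str            # who the caller claims to be, e.g. "CEO"
    amount_usd: float
    destination_account: str
    marked_urgent: bool       # urgency is the classic pressure tactic

def release_wire(request: WireRequest,
                 callback_confirmed: bool,
                 second_approver: Optional[str]) -> bool:
    """Gate a transfer behind out-of-band verification and dual control.

    callback_confirmed: the requester was reached on a number taken from
    the company directory -- never one supplied in the call or email itself.
    second_approver: a colleague who independently reviewed the request.
    """
    if not callback_confirmed:
        return False   # a familiar voice, on its own, proves nothing
    if second_approver is None:
        return False   # dual control: two humans sign off on every wire
    if request.marked_urgent:
        # Deadlines never bypass the checks; they should raise suspicion.
        print("Urgent request flagged for extra scrutiny.")
    return True
```

The design choice is deliberate: the checks are structural, so a convincing voice and a looming deadline cannot argue their way past them.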
So: Stop, Pause, Question, and follow safe practices for communicating financial information. Here are some reminders and additional tips: