In late 2023, a video circulated of billionaire Elon Musk outlining a “new secret investment.” The offer was too good to be true: a large, quick return on a risk-free investment. In one video, it was promised that investors would “have over one million in each of their trading accounts in just six months.” This was not the case… it was a digitally manipulated video using Elon Musk’s image and voice, otherwise known as a “deepfake.”
Your photo and 15 seconds of your voice are all it takes to create a deepfake video that targets you. Deepfakes use artificial intelligence, paired with your online pictures, videos and voice recordings, to create a very realistic imitation of anyone. If you have spent time on the internet and social media lately, you may have noticed videos of celebrities and politicians that appear real but are not. Their likenesses are often delivering messages that are entirely out of character for them.
Deepfakes are a big deal, and they are coming for investors’ identities. Scammers are turning to deepfakes of trusted public figures to take your money through bogus online ads. CBC television journalist Ian Hanomansing is one of many victims of these targeted attacks. He recently shared what policies and resources are currently in place and how social media companies are reacting. Below is a CBC video of Hanomansing addressing these scams. There are a few short ads before Hanomansing’s report begins, but it is well worth the wait. Click here and scroll down if you want to watch this four-minute video.
Verifying a person’s online identity will only become more difficult. This is especially concerning as the technology becomes more accessible and easier to use for the “average Joe.”
As a result of these advancements in generative AI, new channels of criminal activity have opened up and challenges within the financial industry have intensified. Deepfakes give fraudsters a new way to manipulate the financial services industry by impersonating clients, colleagues, investment advisors, business partners and more.
For example, earlier this year, a company in Hong Kong lost millions of dollars in a scheme that used deepfakes of the company’s senior executives and other team members in a video call with a junior employee. The employee had been suspicious of an email requesting a secret transaction, but the scammers looked and sounded just too convincing on the call. The employee followed the fake instructions and made a series of bank transfers, totaling US$25 million, from the company’s accounts to the scammers’ bank account.
Deepfakes challenge our traditional notions of authenticity, and the criminal possibilities are endless. There is no perfect solution, and it will remain a constant game of cat and mouse.
The best way to combat the risks associated with deepfakes is to arm yourself with ongoing education and implement a strong set of personal controls to avoid becoming the next deepfake target. - Brad