

Cybersecurity firm Cybertrace has issued a stark warning regarding a highly convincing deepfake video featuring Australian mining magnate and businessman Andrew "Twiggy" Forrest. This sophisticated case of digital manipulation represents a growing threat at the intersection of artificial intelligence technology and financial fraud.
The deepfake video, which has surfaced on social media platforms, shows Forrest promoting a fake cryptocurrency trading platform that promises high returns to unsuspecting investors. Deepfake technology uses artificial intelligence and machine learning algorithms to create realistic but entirely fabricated video content by superimposing one person's face onto another's body or manipulating their facial expressions and speech patterns. In this case, scammers have leveraged this technology to create a fraudulent endorsement that appears remarkably authentic.
The video, which initially appeared on Facebook, encourages users to sign up for a fraudulent platform claiming to generate substantial daily profits for "ordinary people." The psychological manipulation employed in the video targets individuals seeking financial opportunities, promising unrealistic returns that are characteristic of investment scams. The victims are directed to a website called "Quantum AI," a name that Cybertrace says is intertwined with scams and financial fraud. This platform falsely claims to use advanced AI algorithms for cryptocurrency trading, exploiting the public's limited understanding of both artificial intelligence and digital asset markets.
In a statement, Cybertrace CEO Dan Halpin warned that the deepfake video could deceive even cautious individuals, as the scammers behind it appear to possess professional sales expertise and marketing knowledge. The sophistication of this scam extends beyond the technical creation of the deepfake itself.
Halpin further noted the video's length and repetitive nature, which enhance its persuasive power and psychological impact on viewers. "The video is long and in many ways repetitive, which can be quite convincing, and appears to have been created by someone with knowledge of sales and marketing," Halpin explained. This repetition serves a psychological purpose, reinforcing the fraudulent message and creating a false sense of legitimacy through familiarity.
The deepfake manipulates Forrest's behavior and body language with remarkable precision, utilizing footage from a "fireside chat" conducted by Rhodes Trust in October. This demonstrates the scammers' ability to obtain high-quality source material and apply advanced AI manipulation techniques to create a seamless fabrication. The choice of footage from a respected institution like Rhodes Trust adds a further layer of perceived credibility to the scam.
Cybertrace detected the deepfake video on January 27, showcasing an AI-altered version of Forrest endorsing a fictitious cryptocurrency trading software. In the video, the altered version of Forrest promises viewers the opportunity to join him and his team as partners in the world's most intelligent stock and cryptocurrency trading software, guaranteeing substantial profits regardless of market conditions. Such promises of guaranteed returns in volatile markets like cryptocurrencies are classic red flags of investment fraud.
Forrest, a former CEO of Western Australian mining firm Fortescue Metals Group, is a highly successful entrepreneur with a net worth of $29.4 billion. The scammers deliberately chose such a prominent and respected business figure to lend false credibility to their scheme, knowing that his reputation for business success would make the fraudulent endorsement more persuasive to potential victims.
The deepfake video concludes with Forrest urging viewers to sign up for the platform before it's too late, adding an element of urgency to the scam. This artificial time pressure is a common manipulation tactic designed to prevent potential victims from conducting proper due diligence or seeking advice before making investment decisions.
Cybertrace has cautioned users to exercise extra vigilance due to the recent surge in deepfake fraud targeting high-profile individuals. The proliferation of accessible AI tools has made it easier for criminals to create convincing deepfake content, leading to a sharp rise in such fraudulent schemes across the globe.
In addition to Forrest, other notable Australian individuals such as Gina Rinehart, the country's richest person, entrepreneur Dick Smith, and TV host Allison Langdon have also been targeted by scammers using deepfake videos, as highlighted by Cybertrace. This pattern suggests an organized effort by fraud networks to exploit the images and reputations of trusted public figures to perpetrate financial crimes. The targeting of multiple prominent Australians indicates that scammers are specifically focusing on regional markets where these figures hold significant influence and trust.
As reported, Singapore's Prime Minister Lee Hsien Loong has also warned his social media followers about deepfake videos that use his voice and image to promote cryptocurrency scams. In his warning, Lee even shared an example video of himself being interviewed, created by scammers to endorse a fraudulent "hands-free crypto trading" scheme. By publicly addressing the issue, Lee demonstrated the importance of awareness and education in combating these sophisticated fraud attempts.
"The use of deepfake technology to spread disinformation will continue to grow," said Lee. "We must remain vigilant and learn to protect ourselves and our loved ones against such scams." This statement underscores the evolving nature of digital threats and the need for continuous public education about emerging fraud techniques.
Scammers have been employing various methods to deceive individuals and steal their fiat currency or tokens since the inception of cryptocurrencies. The pseudonymous and irreversible nature of cryptocurrency transactions makes them particularly attractive to fraudsters. In 2020, hackers compromised the accounts of prominent Twitter users, including former United States President Barack Obama and then-presidential candidate Joe Biden, to promote a Bitcoin scam. That incident demonstrated how social engineering and platform vulnerabilities can be combined with cryptocurrency fraud to target large audiences.
To protect against deepfake-related financial fraud, individuals should verify any investment opportunity through official channels before committing funds. Never rely solely on social media videos or unsolicited investment advice, regardless of who appears to be endorsing it. Check for inconsistencies in the video such as unnatural facial movements, unusual lighting, or audio synchronization issues that may indicate manipulation.
Additionally, be skeptical of any investment opportunity that promises guaranteed returns or creates artificial urgency for immediate action. Legitimate investment opportunities do not require rushed decisions, and no investment can guarantee profits regardless of market conditions. Always conduct independent research and consult with licensed financial advisors before making significant investment decisions, especially in volatile markets like cryptocurrencies.
What is deepfake fraud?
Deepfake technology uses AI to create realistic fake videos and audio. Scammers use it to impersonate celebrities or officials, convincing victims to transfer funds or reveal sensitive information through deceptive content.
How can you tell whether a celebrity video is genuine?
Verify video details carefully by checking background consistency, audio quality, and facial movements. Cross-reference with official social media accounts. Look for unnatural expressions, audio mismatches, or unusual lighting. Confirm authenticity through verified channels before trusting any investment claims.
Can stolen funds be recovered?
Recovery is sometimes possible. Act immediately, ideally within 24 hours, to freeze accounts through your bank and contact professionals experienced in cross-border asset recovery. Quick action maximizes recovery chances through legal and financial channels.
What legal consequences do perpetrators face?
Deepfake financial fraud violates fraud and identity-theft statutes. Penalties vary by jurisdiction, but convicted criminals can face multi-year prison sentences in addition to fines and restitution requirements.
How does deepfake fraud threaten financial markets?
Deepfake technology enables fraudsters to impersonate financial authorities and manipulate market information, causing unauthorized fund transfers and market volatility. This threatens investor confidence and financial market stability.
How can investors protect themselves?
Verify directly with official channels before investing, never trust unsolicited celebrity endorsements, research the investment thoroughly, and report suspicious activity to authorities immediately. Always work with registered financial advisors.











