The Voice of Fraud: Deepfake Vishing and the New Age of Social Engineering

Group-IB’s Fraud Protection team examines the rise of deepfake vishing attacks and caller ID spoofing, exposing how cybercriminals exploit AI voice cloning and weaknesses in telecom systems. The report highlights real-world cases, explains the technical methods behind these scams, identifies vulnerabilities in organizational trust models, and provides actionable strategies for telecom providers, corporations, and individuals to defend against this evolving threat.

Background

Advancements in AI voice cloning and weaknesses in telecom systems have given rise to deepfake vishing attacks, where cybercriminals combine cloned voices with caller ID spoofing to impersonate trusted executives or officials. With just a few seconds of audio, often taken from public sources, fraudsters can create highly convincing voice replicas at minimal cost and with no technical expertise.

To demonstrate this threat, Group-IB and Channel News Asia (CNA) conducted an experiment where a journalist’s voice was cloned using an online platform. The test revealed how quickly and cheaply realistic deepfake voices can be generated, exposing the growing risks to businesses and individuals. This experiment highlights the urgent need for stronger identity verification, awareness training, and defenses against AI-driven social engineering.

In this report

AI-powered voice cloning is making vishing attacks more convincing than ever, with global losses from AI-enabled fraud projected to reach US$40 billion by 2027, up from US$12 billion in 2023.

Publicly available voice cloning platforms require only seconds of audio and minimal technical expertise, with subscription fees as low as a few dollars per month, putting powerful impersonation tools in the hands of cybercriminals.

Weaknesses in global telecom infrastructure allow attackers to spoof trusted phone numbers, making fraudulent calls appear legitimate and bypassing traditional security checks.

Real cases, including a US$243,000 scam in the UK and a US$18.5 million stablecoin theft in Hong Kong, show how combined voice cloning and caller ID spoofing can result in major financial and reputational losses.

In a controlled test with Channel News Asia (CNA), a journalist’s voice was cloned using a free trial of a voice cloning platform, demonstrating how quickly and affordably realistic deepfakes can be created, and reinforcing the urgent need for multi-factor verification and awareness training.
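To illustrate the caller ID weakness noted above: in SIP-based telephony (RFC 3261), the From header that typically drives the caller ID shown to the recipient is set by the originating client, so nothing stops an attacker from writing an arbitrary trusted number into it. The sketch below is a minimal, hypothetical illustration (the domain names and numbers are invented); it is not an attack tool, and real carriers deploying STIR/SHAKEN attestation can flag such calls.

```python
# Minimal sketch of why caller ID alone is untrustworthy.
# The SIP From header is attacker-controlled; legacy gateways
# without STIR/SHAKEN attestation display it as-is.
# All endpoint names and numbers below are hypothetical.

def build_invite(spoofed_number: str, target_number: str) -> str:
    """Build a minimal SIP INVITE whose From header carries an
    arbitrary, attacker-chosen caller ID."""
    return (
        f"INVITE sip:{target_number}@example-carrier.net SIP/2.0\r\n"
        f'From: "CEO Office" <sip:{spoofed_number}@example-carrier.net>\r\n'
        f"To: <sip:{target_number}@example-carrier.net>\r\n"
        "Call-ID: abc123@attacker-host\r\n"
        "CSeq: 1 INVITE\r\n\r\n"
    )

def displayed_caller_id(invite: str) -> str:
    """Naive receiver: trusts the From header, as equipment
    lacking caller ID authentication effectively does."""
    for line in invite.split("\r\n"):
        if line.startswith("From:"):
            return line.split("sip:")[1].split("@")[0]
    return "unknown"

msg = build_invite("+6561234567", "+6587654321")
print(displayed_caller_id(msg))  # prints the spoofed number, not the attacker's
```

The point of the sketch is that the "trusted" number the victim sees is just a free-form field chosen by the caller, which is why the report stresses out-of-band verification rather than trusting the displayed number.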

Advanced protection against cyber threats

Group-IB’s security ecosystem provides comprehensive protection for your IT infrastructure, built on our unique cyber intelligence, deep attack analysis, and incident response expertise.