

Advances in AI voice cloning, combined with weaknesses in telecom systems, have given rise to deepfake vishing attacks, in which cybercriminals pair cloned voices with caller ID spoofing to impersonate trusted executives or officials. With just a few seconds of audio, often taken from public sources, fraudsters can create highly convincing voice replicas at minimal cost and with no technical expertise.
To demonstrate this threat, Group-IB and Channel News Asia (CNA) conducted an experiment in which a journalist's voice was cloned using an online platform. The test revealed how quickly and cheaply realistic deepfake voices can be generated, exposing the growing risks to businesses and individuals. The experiment underscores the urgent need for stronger identity verification, awareness training, and defenses against AI-driven social engineering.
Group-IB's security ecosystem provides comprehensive protection for your IT infrastructure, built on our unique cyber intelligence, in-depth attack analysis, and incident response expertise.