Phishing isn’t just email anymore; it has become a multi-channel threat spanning voice, video, SMS, and more. Criminals are using AI-generated voices and video to impersonate executives and manufacture urgent scenarios. In one widely reported case, fraudsters targeted WPP with a deepfake of its CEO, using a cloned voice and video footage on a call to convince employees they were speaking with leadership.
Phishing kits are now AI-powered, auto-generating convincing lures at scale. Cyber insurer Coalition warns that these attacks are becoming more frequent and harder to detect, and security firms report that nearly 80% of breaches begin with a phishing attack, making it the top human risk vector.
✅ Action steps:
- Simulate realistic phishing drills across email, voice, and video (see the sketch after this list)
- Use multi-factor authentication (MFA) and verify unusual requests through a separate, trusted channel
- Regularly update your incident response plan to cover voice and video deepfakes
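As a starting point for the first step, here is a minimal sketch of a simulated phishing-drill email sender. It assumes a hypothetical internal SMTP relay (`mailhog.internal.example`), a hypothetical consenting test recipient list, and a training landing page that only records clicks; none of these names come from the article, and a dedicated awareness platform would add scheduling, reporting, and consent handling.

```python
"""
Minimal phishing-drill email sender (illustrative sketch only).

Assumptions (hypothetical, not from the article):
- mailhog.internal.example:1025 is an internal test SMTP relay.
- Recipients have consented to security-awareness simulations.
- https://training.example/landing?t=<token> only logs the click for metrics.
"""
import smtplib
import uuid
from email.message import EmailMessage

SMTP_HOST = "mailhog.internal.example"  # hypothetical internal relay
SMTP_PORT = 1025
LANDING_URL = "https://training.example/landing"  # hypothetical tracking page

TEST_RECIPIENTS = [
    "alice@corp.example",
    "bob@corp.example",
]


def build_drill_message(recipient: str) -> EmailMessage:
    """Create a simulated 'urgent request' lure with a per-recipient token."""
    token = uuid.uuid4().hex  # lets the landing page attribute the click to a recipient
    msg = EmailMessage()
    msg["From"] = "it-support@corp.example"  # internal-looking sender for the drill
    msg["To"] = recipient
    msg["Subject"] = "Action required: verify your account today"
    msg.set_content(
        "Hi,\n\n"
        "We detected unusual sign-in activity on your account.\n"
        f"Please verify here within 24 hours: {LANDING_URL}?t={token}\n\n"
        "IT Support"
    )
    return msg


def run_drill() -> None:
    """Send the simulated lure to each consenting test recipient."""
    with smtplib.SMTP(SMTP_HOST, SMTP_PORT) as smtp:
        for recipient in TEST_RECIPIENTS:
            smtp.send_message(build_drill_message(recipient))
            print(f"Drill email sent to {recipient}")


if __name__ == "__main__":
    run_drill()
```

In practice you would pair a script like this with the landing-page click log to track click and report rates over time, and extend the same drills to voice and video using scripted scenarios rather than code.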