
“Is That Really You?”


The Hidden Dangers of Deepfake Video & Voice Calls


Imagine this: You get a video call from your boss. Their face appears. Their voice is familiar. They ask you to send over a document — or worse, approve a payment.


Would you do it?


What if I told you… that wasn’t really them?


Welcome to the world of deepfakes and deepcalls — where AI-generated voices and faces are used to deceive, manipulate, and scam.


🤖 What Are Deepfakes & Deepcalls?


  • Deepfake video: AI-generated faces or videos that look like a real person — talking, blinking, moving… but totally fake.

  • Deepfake voice (deepcall): AI mimics someone’s voice in real-time, making phone or video calls sound authentic.


🎯 Both use deep learning algorithms trained on recordings, photos, or videos of a real person.


🎯 Think this is science fiction?


It has already happened:

  • CEO voice cloned in a fraud call → €220,000 stolen in Europe (The Guardian, 2020)

  • Deepfake video of Zelenskyy posted online, telling Ukrainian soldiers to surrender

  • Scammers clone the voice of a child or relative to demand ransom or urgent help


🚨 Why Is This Dangerous?

  • 🎥 Fake boss call (video) → Employee sends sensitive info or money

  • 📞 Fake family emergency (voice) → You’re tricked into acting emotionally

  • 🏦 Bank call impersonation → Login credentials or transfers are stolen

  • 🎙️ Fake interviews or Zooms → Disinformation or infiltration


🔍 Quick Self-Check — Would You Fall for It?

✅ Have you ever acted quickly on a phone call from a superior?

✅ Would you trust a video call based on voice and face alone?

✅ Do you verify caller identity before sending info?


If you hesitated… you’re exactly who attackers hope to target.


🔐 Expert Tips: How to Stay Safe from Deepfakes


1. Never act on voice or video alone

👉 Always verify the request through a second channel (email, SMS, or face-to-face)


2. Have a code word for emergencies

🔒 Especially in families or small teams → if someone calls in a panic: “Say the passcode.”


3. Pause and breathe — social engineers hate silence

Fake urgency is a red flag. ❗ Deepcall attacks rely on you reacting before thinking.


4. Use video filters or low-quality calls for privacy

🛑 High-res video and audio make cloning easier → Ironically, blurry is safer.


5. Stay informed and educate others

🧠 The best defense is awareness. Train your team. Talk to your family. Share examples.


🧠 Final Thought from a Cybersecurity Expert:

“If a face can lie and a voice can trick you… trust must come from elsewhere.”

The age of deepfakes is here — but you can fight it with calm, logic, and layered verification. Don’t trust what you hear or see. Trust what you confirm.

 
 
 

© 2023 by MSCS Support Remote
