Beautiful vacation spots, summer weather, fun times with friends, and occasionally, someone crying as if they just had the worst day of their life. If you are active on social media, you might have seen unreasonably many of your friends and favorite influencers looking totally devastated over the last few days. It started a few years ago with funny face-swapping camera effects on smartphones: low quality, producing weird mixed-up faces, with apps showing you what you would look like as a grandparent or a little toddler. Over the last few years, these filters have become much more realistic. Deepfake technology has improved so much that it already caused a political scandal last year, when Dutch politicians held a Zoom call with someone they believed to be Leonid Volkov, Alexei Navalny's chief of staff. Only weeks later did they find out that they had in fact been talking to someone impersonating the Russian via deepfake technology.

Since we first came into contact with deepfake technology, it has improved massively. So much so that in a digital age, with decisions made via Zoom, telephone calls, or email, it can pose a real threat. Who would have thought that the face-swap filter could one day fool a group of politicians? What happened to them could happen anywhere.

What is a deepfake?

So, what is a deepfake exactly? There are two types of deepfake, and we often see them in combination. First, and most prominently, there is face alteration, where another person's face is laid over an intruder's face, giving him or her the power to produce videos of the targeted person without that person being present. For this, the attacker only needs a few minutes of video material showing the target's face. Combined with voice conversion, where the intruder's voice is altered to sound exactly like the target's, this can pose a real threat, especially in the near future, as deep learning technology, and with it the quality of deepfakes, evolves rapidly.
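To make the face-alteration idea more concrete, here is a minimal, hedged sketch of the classic face-swap architecture: one shared encoder is trained with two decoders, one per person, and swapping means encoding a frame of person A and reconstructing it through person B's decoder. The linear "networks", dimensions, and names below are illustrative assumptions; real systems use deep convolutional networks trained on the target footage.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy sizes: flattened 64x64 grayscale face crops, 128-dim latent space.
FACE_DIM = 64 * 64
LATENT_DIM = 128

# One SHARED encoder, plus one decoder per identity (A and B).
# Untrained random weights here; this only shows the data flow, not real output.
encoder = rng.standard_normal((LATENT_DIM, FACE_DIM)) * 0.01
decoder_a = rng.standard_normal((FACE_DIM, LATENT_DIM)) * 0.01
decoder_b = rng.standard_normal((FACE_DIM, LATENT_DIM)) * 0.01

def encode(face):
    # Map a face into the shared latent space (pose, expression, lighting).
    return encoder @ face

def swap_a_to_b(face_of_a):
    # Encode A's frame, then decode with B's decoder:
    # the result carries B's identity with A's expression.
    return decoder_b @ encode(face_of_a)

frame_of_a = rng.standard_normal(FACE_DIM)  # stand-in for one video frame of A
fake_frame = swap_a_to_b(frame_of_a)
print(fake_frame.shape)  # same shape as the input face crop
```

In the real training setup, both decoders learn to reconstruct their own person's face from the shared latent code, which is what makes the swap work frame by frame across a whole video.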

Threats of deepfake

What sounds like a gadget from Mission Impossible poses a real threat in numerous scenarios. On the one hand, it could, and probably will, create political and sociological dangers: for example, a fabricated, unflattering video of a politician released just before an election, or a celebrity appearing to make controversial statements they never made, threatening their career. On the other hand, it will take scams such as phishing to a whole new level, as was the case with the Dutch politicians.


Detecting deepfakes

Currently, deepfake technology is still at a stage where simply being aware that the danger exists should be enough to protect yourself. Typical errors in deep-faked videos are visible transitions, especially at the edges of the face; sharp contours that appear blurred; and limited facial expressions. Pay attention to the lighting as well: often the lighting on the face does not match the background. Deep-faked audio can usually also be detected by listening carefully; typical signs are a metallic sound, a monotone delivery, and false pronunciation. If you are in a Zoom call with a deepfake, as the Dutch politicians were, the fake person will most likely stand out through a long delay.

If you are curious whether you would be fooled, test your deepfake knowledge here: Spot the Deepfake.

But what can be done once the quality has improved so much that all these errors are solved? First of all, as deepfake technology improves, so does the technology for deepfake detection. Several organizations are working on software that helps analysts detect manipulated material, such as the Defense Advanced Research Projects Agency (DARPA), an agency of the US Department of Defense, which is working on SemaFor, and Microsoft, which already released its Microsoft Video Authenticator tool at the end of 2020.

Beyond that, digital signatures are also in development, tying material to a unique source and showing that it has not been altered since recording.
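The idea behind such a signature can be sketched in a few lines: a recording device holding a key signs a cryptographic hash of the media, and any later edit changes the hash, so the signature no longer verifies. This is a hedged illustration only; real provenance schemes use public-key signatures and signed metadata manifests rather than the shared-key HMAC used here for simplicity.

```python
import hashlib
import hmac
import secrets

# Assumed: a secret key held inside the recording device (illustrative only).
camera_key = secrets.token_bytes(32)

def sign_recording(media: bytes) -> bytes:
    # Hash the raw media, then sign the hash with the device key.
    digest = hashlib.sha256(media).digest()
    return hmac.new(camera_key, digest, hashlib.sha256).digest()

def verify_recording(media: bytes, signature: bytes) -> bool:
    # Recompute the signature and compare in constant time.
    digest = hashlib.sha256(media).digest()
    expected = hmac.new(camera_key, digest, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

original = b"...raw video bytes..."  # stand-in for a real recording
sig = sign_recording(original)

print(verify_recording(original, sig))            # True: untouched since recording
print(verify_recording(original + b"edit", sig))  # False: material was altered
```

Because even a one-byte edit produces a completely different hash, a valid signature is strong evidence that the material is exactly what the device recorded.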

Tech Radar

If you are interested in new technology and would like to keep track of the rapidly evolving field of tech, consider checking out the tech radar on the BCP (Home - Tech Radar - Bertelsmann Collaboration Platform). It is currently being updated and restructured, so feel free to contribute ideas of your own or simply join to find out about upcoming and emerging trends in the market.