Hello everyone! Today we're going to discuss an interesting topic: a technology through which we can imitate others, or make others imitate us.

It's the deepfake.

If you want, you can also check out the Audio Version of this post below

Deepfakes can include anything from audio to video content that looks and sounds exactly like the real thing. The difference, however, is that it's completely fake. The term "deepfake" is a combination of "deep learning", the technology that makes it possible, and "fake".

The danger is that the technology can be used to make people believe something is real when it is not. And it arrives on top of an already climbing phone-fraud rate, which has risen over 350% in the last five years. As the tools used to make deepfakes become easily available, consumers need to be aware of fraudulent activities.

How do deepfakes work?

Deepfakes exploit our tendency to trust what we see and hear, using generative adversarial networks (GANs), in which two machine learning (ML) models work against each other. Machine learning is the study of computer algorithms that improve automatically through experience, and is seen as a subset of artificial intelligence.

These algorithms are used to create convincing fake videos and audio for deepfakes.

One ML model trains on a data set and creates video forgeries, while the other tries to detect them. The forger keeps creating fakes until the other model can no longer spot the forgery.
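That back-and-forth can be sketched numerically. Here is a toy illustration of the adversarial loop (hypothetical numbers, not a real neural network): a "forger" produces values around its own mean, and a "detector" draws a decision boundary between those fakes and real values centred on 5.0.

```python
# Toy sketch of a GAN's adversarial loop: the forger's output drifts
# until the detector can no longer separate fake from real.
REAL_MEAN = 5.0        # stand-in for the statistics of genuine footage
forger_mean = 0.0      # the forger starts out producing obvious fakes

for step in range(100):
    # Detector update: place the decision boundary midway between
    # the average fake and the average real sample.
    boundary = (forger_mean + REAL_MEAN) / 2
    # Forger update: nudge its output toward the boundary so that
    # its fakes become harder to flag.
    forger_mean += 0.2 * (boundary - forger_mean)

print(round(forger_mean, 2))  # the forger now matches the real data: 5.0
```

The gap between forger and real data shrinks on every round, which is exactly why the loop ends only when the detector is reduced to guessing.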

The larger the training data set, the easier it is for the forger to create a believable, undetectable deepfake. This is why former presidents and Hollywood celebrities are the most common subjects: there's a ton of publicly available video footage to train the forger on.

Here is an example video in which Obama is deepfaked.

Source: BuzzFeedVideo | YouTube

Warning: Language, Violence

Technologies used in deepfakes

It is hard to make a good deepfake on a standard computer. Most are created on high-end machines with powerful graphics cards, which cuts processing time from days to hours. But it takes expertise too, not least to touch up finished videos and reduce flicker and other visual defects. Plenty of tools are now available to help people make deepfakes, and several companies will make them for you, doing all the processing in the cloud. There's even a mobile app, Zao, that lets users add their faces to a list of TV and movie characters the system has trained on.

How can deepfakes be detected?

Detecting deepfakes is a hard problem. Amateur deepfakes can, of course, be spotted by the naked eye, and machines can pick up other signs, such as a lack of eye blinking or shadows that look wrong. But the GANs that generate deepfakes (a class of machine learning frameworks designed by Ian Goodfellow) are getting better all the time, and soon we will have to rely on digital forensics to detect them.
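The "lack of eye blinking" cue above can be turned into a simple heuristic. This is a toy sketch with hypothetical thresholds and function names, not a production detector: given a per-frame "eyes open" signal (which a real system would extract with computer vision), it counts blinks and flags clips whose blink rate falls far below a normal human rate of roughly 15–20 blinks per minute.

```python
# Toy blink-rate heuristic for flagging possible deepfakes.
def count_blinks(eyes_open):
    # A blink is a transition from eyes open (True) to closed (False).
    return sum(1 for prev, cur in zip(eyes_open, eyes_open[1:])
               if prev and not cur)

def looks_suspicious(eyes_open, fps=30, min_blinks_per_minute=5):
    # Flag the clip if its blink rate is far below a normal human rate.
    minutes = len(eyes_open) / fps / 60
    return count_blinks(eyes_open) / minutes < min_blinks_per_minute

# One minute of 30 fps footage with a single blink: suspicious.
one_blink = [True] * 900 + [False] * 5 + [True] * 895
print(looks_suspicious(one_blink))  # True
```

Real detectors combine many such weak signals, since any single cue (blinking included) is easy for the next generation of forgers to fake.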

Just as AI can be used to make deepfakes, it can also be used to detect them. With the technology becoming accessible to any computer user, more and more researchers are focusing on deepfake detection.

Presently, there are subtle visual tells if you look closely: ears or eyes that don't match, fuzzy borders around the face, skin that is too smooth, or lighting and shadows that look off. But detection is getting harder and harder as deepfake technology becomes more advanced and videos look more realistic.


We're living in the age of information, and it's important that the information we get comes from a legitimate source and not some fake like a deepfake. It is getting quite hard to tell whether something is real, as technologies like AI and ML are growing rapidly, and so are the people who misuse them for unwanted purposes.

We're coming to a place where artificial intelligence is required to counter artificial intelligence. Deepfake videos are a total abomination, and they show how completely our digital reality can be rewritten without our permission tomorrow. What do you think about it?

Leave your replies in the comments below :)