Deep Fake Menace

‘Deep Fakes Threat Must Be Fought With Tech & Legal Devices’

Shakti Singh Tanwar, a cyber security and tech expert, says the use of Deep Fakes throws up possibilities that are both fascinating and scary. His views:

In the realm of multimedia graphics, technological advancements have ushered in a new era, one that blurs the lines between reality and fiction. Deep Fakes, a portmanteau of “deep learning” (a technique that processes data in layers loosely modelled on the human brain) and “fake”, have emerged as a cutting-edge technique in the field of artificial intelligence (AI) and machine learning. As a multimedia graphics expert, I find myself grappling with the implications of this technology, understanding how it is executed, and acknowledging the inherent dangers that come with it.

So, what is a Deep Fake? It is fabricated media (audio or video) created using deep learning models. A model is trained on the voice and mannerisms of an individual to generate fake video or audio of the person concerned. Depending on how well trained the model is, it can be difficult to distinguish fake videos from real ones. One recent example that has been in the news was a fake video of Rashmika Mandanna.

Deep Fakes involve the use of deep learning algorithms to create realistic and often convincing manipulations of audio and visual content. By leveraging powerful neural networks, these algorithms can seamlessly replace faces, voices, or even entire scenarios in videos.

The complex process typically involves training the AI model on vast datasets of images and videos, allowing it to learn the subtle nuances of facial expressions, voice tones, and other distinctive features.

The danger lies not only in the potential for misuse but also in the sophistication of the technology, making it increasingly difficult to distinguish between the authentic and the manipulated content. This has prompted global leaders, including Indian Prime Minister Narendra Modi, to issue warnings about the risks associated with Deep Fakes. Deep Fakes pose a threat to the foundations of trust and authenticity in an increasingly digitalized society.

The creation of Deep Fakes requires a deep understanding of AI, machine learning, and multimedia graphics. Advanced tools, such as generative adversarial networks (GANs), are employed to refine the realism of manipulated content by pitting two neural networks against each other—one generating Deep Fakes, and the other discerning real from fake.
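
To make the adversarial idea concrete, here is a deliberately tiny toy sketch, not a real Deep Fake system: a two-parameter linear “generator” learns to mimic a simple number distribution while a logistic “discriminator” tries to tell real samples from generated ones. All numbers and the target distribution here are illustrative assumptions, not taken from any production tool.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# "Real" data: samples from a normal distribution with mean 4 (toy stand-in
# for authentic content).
def sample_real(n):
    return rng.normal(4.0, 1.25, n)

# Generator: a linear map of noise, G(z) = a*z + b (starts far from the real data).
a, b = 1.0, 0.0
# Discriminator: logistic regression, D(x) = sigmoid(w*x + c).
w, c = 0.1, 0.0

lr = 0.01
for step in range(5000):
    # --- Train the discriminator to separate real from generated samples ---
    x_real = sample_real(32)
    z = rng.normal(0.0, 1.0, 32)
    x_fake = a * z + b

    d_real = sigmoid(w * x_real + c)
    d_fake = sigmoid(w * x_fake + c)
    # Gradients of the loss -[log D(real) + log(1 - D(fake))]
    gw = np.mean(-(1 - d_real) * x_real + d_fake * x_fake)
    gc = np.mean(-(1 - d_real) + d_fake)
    w -= lr * gw
    c -= lr * gc

    # --- Train the generator to fool the discriminator ---
    z = rng.normal(0.0, 1.0, 32)
    x_fake = a * z + b
    d_fake = sigmoid(w * x_fake + c)
    # Gradient of -log D(G(z)) with respect to the generator's output
    gx = -(1 - d_fake) * w
    a -= lr * np.mean(gx * z)
    b -= lr * np.mean(gx)

fakes = a * rng.normal(0.0, 1.0, 1000) + b
print(f"generated mean ~ {fakes.mean():.2f} (real mean is 4.0)")
```

After training, the generated samples cluster near the real distribution's mean, even though the generator never saw the real data directly; it was steered there only by the discriminator's feedback. Full-scale Deep Fake systems apply this same tug-of-war with deep networks over images, video and audio.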

Artificial Intelligence has been the buzzword for some time now. More than 750 startups have begun working on AI-related products in the last year in the US alone. Given the power AI possesses, that number is expected to grow. But with great power comes great responsibility.

Some key capabilities involved in Deep Fakes are generating realistic images, videos and audio, and face swapping. It is easy to superimpose one face onto another body, something people have long done with Photoshop for fun. But Photoshop results were rarely convincing, and one could only manipulate still images. With Deep Fakes, full-length videos can be manipulated as well.

Misusing Deep Fakes is illegal in many jurisdictions and carries huge consequences. In today’s world, which relies heavily on social media for information, it is easy to spread hoaxes and panic in society. Many people have recently raised their voices against the misuse of Deep Fakes, including Prime Minister Modi and Bollywood star Amitabh Bachchan.

There have been efforts to develop tools and techniques that can detect Deep Fakes. Some tools for identifying them are: 1) Microsoft Video Authenticator (not available for general use); 2) Sensity’s deepfake detection platform; 3) Deepware Scanner.

Combatting the misuse of Deep Fake technology demands a multi-faceted approach. One avenue involves developing sophisticated detection tools that can analyze videos and identify anomalies that betray the presence of manipulation. These tools may leverage AI algorithms themselves to scrutinize content for inconsistencies or tampering. Researchers are continually refining these tools to keep pace with the evolving sophistication of Deep Fake technology.
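
To illustrate the detection idea, here is a deliberately simple sketch of one kind of anomaly check: temporal inconsistency between consecutive video frames. Real detectors rely on trained neural networks over actual footage; this toy version, run on made-up frame data, only flags frames whose change from the previous frame is a statistical outlier.

```python
import numpy as np

def flag_inconsistent_frames(frames, z_thresh=3.0):
    """Flag frames whose change from the previous frame is a statistical
    outlier -- a crude stand-in for the temporal-consistency checks that
    real Deep Fake detectors perform with trained neural networks."""
    frames = np.asarray(frames, dtype=float)
    # Mean absolute pixel difference between consecutive frames
    diffs = np.abs(frames[1:] - frames[:-1]).mean(axis=(1, 2))
    # Standardise the difference scores (z-scores)
    z = (diffs - diffs.mean()) / (diffs.std() + 1e-9)
    # Frame i+1 is suspicious if its change score is an outlier
    return [i + 1 for i, score in enumerate(z) if score > z_thresh]

# Synthetic demo: 30 smoothly varying 8x8 "frames", with frame 15 tampered
rng = np.random.default_rng(1)
frames = np.cumsum(rng.normal(0.0, 0.01, (30, 8, 8)), axis=0)
frames[15] += 5.0  # simulate an abrupt, spliced-in manipulation

print(flag_inconsistent_frames(frames))
```

The check flags the tampered frame and the frame after it, since both transitions are abnormally large. Production detectors look for far subtler cues (lighting, blink rates, compression artifacts, facial-boundary blending), but the principle is the same: scan content for inconsistencies that betray manipulation.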

From a regulatory perspective, there is a growing need for laws and policies that address the ethical implications of Deep Fakes. Striking a balance between innovation and safeguarding against malicious use requires a collaborative effort between governments, technology developers, and the wider public. 

My perspective on Deep Fakes is one rooted in both fascination and concern. The power of AI to manipulate audio and visual content opens a realm of creative possibilities, but the potential for misuse demands a vigilant response from the technological community. By developing advanced detection tools, promoting media literacy, and establishing ethical guidelines, we can work towards harnessing the potential of Deep Fake technology responsibly and preserving the integrity of our digital reality.

As told to Deepti Sharma
