Deepfake alarm

Source: This post on Deepfakes is based on the article “Deepfake alarm” published in “The Hindu” on 24th November 2023.

UPSC Syllabus Topic: GS Paper 3 Science and Technology – Developments and their applications and effects in everyday life.

News: The article discusses AI-based technological measures that can be used to tackle the rising challenge of Deepfakes.

A detailed article on Deepfakes can be read here.

What are Deepfakes?

Deepfakes are fabricated content in the form of videos, pictures and audio, created using powerful artificial intelligence tools.
The technology involves modifying or creating images and videos using a machine learning technique called a generative adversarial network (GAN).
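
For those interested in the mechanics, the adversarial set-up behind GANs can be sketched in a few lines of Python. The sketch below is a minimal toy illustration, assuming PyTorch is available; it learns a simple one-dimensional distribution rather than images, but the same generator-versus-discriminator loop underlies the synthesis of deepfake media.

```python
# Minimal GAN sketch (assumes PyTorch): a generator learns to produce samples
# that a discriminator cannot tell apart from "real" data. Here the real data
# is just a 1-D Gaussian, standing in for real images or audio.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Generator: maps random noise to a fake sample
G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
# Discriminator: scores how "real" a sample looks (0 = fake, 1 = real)
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

for step in range(2000):
    real = torch.randn(64, 1) * 1.5 + 4.0   # samples from the "real" distribution
    noise = torch.randn(64, 8)
    fake = G(noise)

    # Train the discriminator: label real samples 1 and generated samples 0
    opt_d.zero_grad()
    d_loss = loss_fn(D(real), torch.ones(64, 1)) + \
             loss_fn(D(fake.detach()), torch.zeros(64, 1))
    d_loss.backward()
    opt_d.step()

    # Train the generator: try to make the discriminator call its fakes real
    opt_g.zero_grad()
    g_loss = loss_fn(D(G(noise)), torch.ones(64, 1))
    g_loss.backward()
    opt_g.step()

# The generated samples should drift toward the real mean (about 4.0)
print("mean of generated samples:", G(torch.randn(1000, 8)).mean().item())
```

The generator never sees the real data directly; it improves only by learning to fool the discriminator, which is why GAN-generated fakes become progressively harder to distinguish from genuine media.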

What legal recourse can victims of deepfakes currently take in India?

  1. Reporting the post to the social media platforms, which are legally bound to address grievances relating to such content and remove it within 36 hours under the IT Rules, 2021, framed under the IT Act, 2000.
  2. A complaint can be lodged with the National Cyber Crime Helpline — 1930.

However, at present there is no comprehensive legislation that directly addresses deepfakes and AI-related crimes.

What measures are being undertaken to tackle Deepfakes?

AI-based Tools: AI models are being developed to counter dark AI activities. Some of these subtly alter digital artwork in the back end so that it becomes difficult for other AI models to train on it; others, such as Intel’s deepfake detector FakeCatcher, identify manipulated media.
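
As a rough, hypothetical illustration of the “slightly change the artwork” idea, the sketch below applies a gradient-sign perturbation (in the spirit of FGSM-style adversarial noise) to a placeholder image, assuming PyTorch is available. Real protection tools are far more sophisticated; the model and image here are random stand-ins, not any actual product’s method.

```python
# Toy sketch (assumes PyTorch): compute an imperceptibly small, gradient-based
# perturbation of an image so that a model's features on it shift. The model
# and image below are random placeholders used purely for illustration.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Placeholder "feature extractor" standing in for a model that might train on the image
model = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
                      nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 4))

image = torch.rand(1, 3, 64, 64)   # placeholder "artwork"
image.requires_grad_(True)

# Take a gradient-sign step that pushes the model's features away from
# what it currently extracts (any differentiable objective works here)
features = model(image)
loss = features.norm()
loss.backward()

epsilon = 2.0 / 255.0              # keep the pixel change visually negligible
perturbed = (image + epsilon * image.grad.sign()).clamp(0, 1).detach()

print("max pixel change:", (perturbed - image.detach()).abs().max().item())
```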

Development of a Technical Standard for Authentication: An open technical standard is being created by a consortium of software companies with the aim of authenticating digital pictures.

Fact-Check Tools: The fact-check tools of social media intermediaries such as Facebook and X could help sort through AI-generated content.

Question for practice:

Artificial Intelligence can play a significant part in dealing with challenges arising due to its own misuse. Discuss in the context of Deepfakes.
