Nitesh Kumar Sahoo

Several Indian celebrities including Rashmika Mandanna, Kajol, Katrina Kaif, Alia Bhatt, Priyanka Chopra, and Sonu Sood have already fallen prey to deepfake videos. Now, dancer and actress Nora Fatehi is the latest victim.

Such rampant misuse of the technology is disturbing and has raised widespread concern. A morphed video of Nora promoting a brand recently surfaced on social media.

On Saturday, Nora took to her Instagram Stories and posted a screenshot of the video, an advertisement for a clothing brand featuring a woman who closely resembles her.

She wrote, “SHOCKED!!! This is not me.”

Nora Fatehi's Instagram Story

The video is convincingly made, and it would be difficult to tell that the woman in it is not Nora: her voice and body language have been copied almost perfectly.

This comes hours after Delhi Police claimed to have arrested the main accused behind Rashmika’s deepfake video. As per reports, the accused, identified as Eemani Naveen, 24, is a resident of Andhra Pradesh’s Guntur district. He is alleged to have created, uploaded, and circulated the deepfake video on social media.

Deepfake videos raising concerns

A deepfake video is a form of synthetic media created using artificial intelligence (AI) and machine learning techniques. The technology is being widely misused to produce manipulated videos in which a person’s face or voice is substituted into existing footage.

Amid widespread outcry over the misuse of the technology and the absence of adequate legal safeguards, which have allowed this harmful use of AI to flourish, several celebrities have fallen prey to it.

After several similar videos went viral, the Centre issued an advisory to social media platforms, stressing the legal provisions covering deepfakes and the potential penalties linked to their creation and circulation.
