Technology continued to advance rapidly in 2023, ushering in a new era marked by innovations such as AI, AR, VR, and the metaverse.
While these developments have benefited many, there is growing concern about potential job losses and the misuse of advanced tools, particularly AI and deepfakes.
The misuse of technology, especially in the form of deepfakes, caused fear among the public in 2023.
Deepfakes involve manipulating content, such as swapping faces, changing clothing, and even replicating voices, using AI.
Unfortunately, misinformation spread widely, and here are some of the incidents reported in 2023:
CGI Robot Video
A video featuring a robot and a human playing table tennis went viral on social media.
However, it was later revealed that the video was manipulated using AI visual-effects software, falsely claiming the arrival of advanced robots.
The original footage showed real table tennis players, but the manipulated video spread with the hashtag 'Future is here.'
Table tennis with a robot 🏓🤖 (⚠️warning: this could be fake⚠️)
— Nawac7 (@Nawac7) August 4, 2023
CGI or not, what on earth are they doing with our money?
👍☝️🙏👌 pic.twitter.com/VvH3DJtSht
AI Clone Voices
Fake advertisements emerged promoting deceptive schemes using cloned voices of popular figures such as Shah Rukh Khan, Virat Kohli, Mukesh Ambani, Ratan Tata, Narayana Murthy, Akshay Kumar, and Sadhguru.
These AI-generated videos deceived many on social media, leading to financial losses for those who fell victim to the scams.
This is how people are being scammed using AI generated clone voices. Be cautious, lest your one rupee gets "doubled"🤣 and then you go blaming Mukesh Ambani and Ratan Tata! #ScamAlert #AI pic.twitter.com/owefKwOq0v
— Khan Sajid (@mkhansajid) December 22, 2023
PM Narendra Modi Dancing
A video allegedly showing Prime Minister Narendra Modi dancing to Garba beats circulated widely.
In reality, it featured Vikas Mahante, a Mumbai-based businessman who resembles the Prime Minister, and was filmed at a Diwali mela in London.
Although the video was genuine, it was falsely claimed to be an AI-generated deepfake.
Pm Modi garba dance pic.twitter.com/yLO6Z4AvrJ
— Vijay Kumar (@VijayKu12446785) November 8, 2023
Tunnel Rescue Operation
Images depicting people inside a tunnel carrying the Indian national flag were shared on social media, falsely linked to the successful rescue operation in Uttarkashi.
The images, generated with AI tools, were mistakenly circulated as real visuals from the rescue.
Hindustan Times used an AI generated image of the Tunnel Rescue posted by a random twitter account and added caption as ‘rescue official pose….’
— Sachin (@Sachin54620442) November 29, 2023
The state of “journalism” in this country. pic.twitter.com/n9W6EcC6CT
Manipulated Celebrity Videos
Manipulated videos featuring actresses Rashmika Mandanna and Kajol Devgan circulated online and spread fear among viewers.
How extremely dangerous is this AI face swap deep fake videos?
— King Bimlli 😼 (@Bimlli_Shake) November 8, 2023
The government should ban deep fake apps and it should be made illegal.
The latest victim is actress Rashmika Mandana.
Fake vs Real pic.twitter.com/IJeNoYQqkz
The inappropriate content originated from a video by British-Indian influencer Zara Patel and a 'Get Ready With Me' video by fashion influencer Rosie Breen; the influencers' faces were morphed with those of the actresses to mislead viewers.
As 2023 draws to a close, people must exercise caution and verify the authenticity of videos before sharing them online.
Looking ahead to 2024, the rise of generative AI and the growing prevalence of deepfakes pose a serious threat, especially with elections approaching in India.
Stay vigilant and verify information to combat the spread of misinformation.