Rashmika Mandanna Deepfake: Loan Apps Harass Women With Morphed Porn Pics: Singer Chinmayi’s Claim


“I truly hope there is a nationwide awareness campaign that can kickstart urgently,” she said.

New Delhi:

After a shocking deepfake video of actor Rashmika Mandanna raised concerns over the dangers of artificial intelligence, singer Chinmayi Sripaada alleged that such videos are being used to harass not just celebrities but ordinary people as well. The singer, in a post in support of the actor, called for legal action against the misuse of such AI technology and also raised alarm over the “next weapon to extort, blackmail and rape” women.

“Deep Fake is going to be the next weapon they use to target and harass and blackmail girls to extort, blackmail and rape. Their clueless families in one small village or town is not going to understand when the honour is at stake,” the singer said in a post on X, formerly Twitter.

Ms Sripaada also alleged that women who have borrowed money from loan apps are being harassed by collectors who morph their images over “porn photos” in order to extort money.

“A Deep Fake is going to be tougher for the usual untrained eye to spot. I truly hope there is a nationwide awareness campaign that can kickstart urgently to educate the general public about the dangers of deepfakes for girls and to report incidents instead of taking matters into their own hands,” she said.

The shocking video, which went viral this week, showed what appeared to be Ms Mandanna entering an elevator. In reality, it was a video of British-Indian influencer Zara Patel, doctored using deepfake technology to replace her face with Ms Mandanna’s.

Reacting to the trending video, the actor said the whole ordeal was “extremely scary” for her. “I feel really hurt to share this and have to talk about the deepfake video of me being spread online,” she said.

The central government has taken cognisance of the deepfake video and sent an advisory to social media platforms.
