Seven years later, deepfakes remain a growing threat to Indian public figures and everyday people alike, with technology advancing faster than laws.
The Real Issue
Deepfakes represent a technological violation that goes beyond traditional hacking—they create entirely fake but convincing intimate content without consent, leaving victims struggling to prove the material is fabricated. While legislation is catching up globally, India still lacks a comprehensive legal framework to prosecute deepfake creators effectively.
AI Generated Sexual Violation Which Is Untraceable
Does anybody know what “The Fappening” was? No? I won’t blame you for that. In the age of the Internet, everybody’s memory has a limited shelf life. It happened back in 2014 – almost 500 nude pictures of celebrities, mostly women, were leaked on the Internet via Imgur and Reddit. Hackers broke into these accounts through phishing attacks and then dumped everything they found into subreddits.
This caused a major uproar, along with widespread concern about how easily even private accounts can be hacked. We thought this was the ultimate low.
Jennifer Lawrence, whose pictures were among the lot, had then said to Forbes,
“It is not a scandal. It is a sex crime, it is a sexual violation. It’s disgusting. The law needs to be changed, and we need to change. That’s why these websites are responsible. Just the fact that somebody can be sexually exploited and violated, and the first thought that crosses somebody’s mind is to make a profit from it. It’s so beyond me. I just can’t imagine being that detached from humanity. I can’t imagine being that thoughtless and careless and so empty inside.”
However, there was still some solace left: if you had never taken such pictures, no one could leak them.
After all, if there are no pictures or videos to steal, what would the hackers leak?
Hacking Is Now Old School – What If You Could Make Rather Than Break
Now, in 2018, people don’t even need to hack into accounts or obtain compromising videos. All they need is a set of images with which to train an AI, and voila! You can superimpose someone’s face onto a porn video, and no one will know the difference. The requirement is minimal – a set of pictures from which the AI can learn.
Celebrities, by default, are vulnerable to this kind of attack, as they already have plenty of pictures floating around, thanks to their celebrity status. This has led to fake porn videos of several celebrities, such as Daisy Ridley, Gal Gadot, Katy Perry and many more. These videos are being called deepfakes.
Deepfakes are videos in which, via machine learning, someone’s face is mapped over another person’s in existing footage. This might sound childish, or even like a clever prank. But when has a tool with the capacity to cause harm not been used to do damage? The app that makes all this possible is aptly named “FakeApp”.
It’s not a philosophical question; it’s the simple truth, as simple as some guy with a computer using thirty pictures of his ex and this app to create a porn video. It’s fake, but by the time someone peels back the layers and exposes it, the damage is done. In the age of social media, fake news spreads like wildfire. There is no reason to expect that a fake porn video will not be circulated the same way.
Enter Revenge Porn – Deepfakes And Selfies
Revenge porn is something we are already aware of, but it could largely be avoided if partners refused to take or save any visuals of their physical intimacy. With deepfakes, however, ordinary pictures will do just fine. In a selfie-driven generation, how long would it take? All someone has to do is take pictures already available on Facebook or Instagram and map the faces.
You might think there is a solution to this. After all, there must be some way or other to stop the mayhem. Frankly, even if there is, no one knows it at this point. The main Reddit page for FakeApp may have been deleted, but its torrent is still out there. I should know, because while researching this article, I found four sites hosting that torrent. I don’t think the craze will die out anytime soon. Some might use it as a quirk, like the Reddit users who are using the software to put Nicolas Cage in any movie clip they can get their hands on.
However, the majority of them will be instances, such as this –
“Super quick one – just learning how to retrain my model. Around 5ish hours – decent for what it is,” posted Redditor UnobtrusiveBot after putting actress Jessica Alba’s face on adult star Melanie Rios’ torso. There was even a query like this – “Is it possible to edit your crush in a porn video?”
The app in question has already gone through three revisions to make it more robust and user-friendly. While it keeps parading as an application that “allows users to easily create realistic high-quality faceswap videos,” what it is actually capable of has not escaped notice.
This is an ongoing disaster: a by-product of technology created to innovate, turned by those who get their hands on it towards their own perverse ends.
Frequently Asked Questions
What exactly are deepfakes and how are they created?
Deepfakes use artificial intelligence and machine learning to superimpose someone’s face onto another person’s body in videos or images. As the article describes, even a small set of photographs – in one case, just thirty pictures – can be enough to train a convincing face-swap model, making the technology frighteningly accessible to anyone with basic AI knowledge.
Why is deepfake revenge porn worse than traditional hacking incidents like The Fappening?
Unlike The Fappening where stolen real images were leaked, deepfake victims cannot prove they never created or participated in the intimate content. This makes it harder to establish proof of violation and defend oneself legally in India’s current judicial system.
Are there Indian laws against creating and sharing deepfake revenge porn?
India is still developing comprehensive legislation specifically targeting deepfakes. While sections of IPC and IT Act can be applied, there’s no dedicated deepfake law yet. Several states and the central government are working on bills to criminalize non-consensual intimate deepfakes.
Which Indian celebrities have been targets of deepfake content?
Multiple Indian actors and public figures have fallen victim to deepfake sexual content, though many don’t publicly report it due to stigma. The non-consensual nature of this content violates personal dignity and privacy rights, yet victims often face social backlash rather than sympathy.
What can individuals do to protect themselves from deepfake exploitation?
Limit sharing intimate photos online, use strong privacy settings on social media, and report deepfakes immediately to platforms and authorities. Raising awareness about deepfakes and supporting stronger legislation in India remains crucial for collective protection against this emerging violation.