AI-Generated Sexual Violation That Is Untraceable
Does anybody know what “The Fappening” was? No? I won’t blame you for that. In the age of the Internet, everybody’s memory has a limited shelf life. It happened back in 2014 – almost 500 nude pictures of celebrities, mostly women, were leaked on the Internet via Imgur and Reddit. Hackers broke into the victims’ accounts through phishing attacks and then dumped the stolen material into subreddits.
This caused a major uproar, as well as widespread concern about how easily private accounts can be broken into. We thought this was the ultimate low.
Jennifer Lawrence, whose pictures were among the lot, said at the time to Vanity Fair,
“It is not a scandal. It is a sex crime, it is a sexual violation. It’s disgusting. The law needs to be changed, and we need to change. That’s why these websites are responsible. Just the fact that somebody can be sexually exploited and violated, and the first thought that crosses somebody’s mind is to make a profit from it. It’s so beyond me. I just can’t imagine being that detached from humanity. I can’t imagine being that thoughtless and careless and so empty inside.”
However, there was still some solace: if you didn’t have any pictures like that, no one could frame you in this way.
After all, if there are no compromising pictures or videos to leak, what would the hackers leak?
Hacking Is Now Old School – What If You Could Make Rather Than Break?
Now, in 2018, people don’t even need to hack into accounts or steal compromising videos. All they need is a set of images with which to train an AI, and voila! Via that technology, you can superimpose someone’s face onto a porn video, and no one will know the difference. The requirement is minimal – a set of pictures from which the AI can learn and act accordingly.
Celebrities are, by default, vulnerable to this kind of attack, since, thanks to their fame, plenty of their pictures are already floating around. This has led to fake porn videos of several celebrities, such as Daisy Ridley, Gal Gadot, Katy Perry and many more. These videos are being called deepfakes.
Deepfakes are videos in which, via machine learning, someone’s face is mapped over another person’s. This might sound childish, or even like a clever prank. However, when has anything with the potential for causing harm not been used to do damage? The app that makes this possible is aptly called “FakeApp”.
It’s not a philosophical question; it’s the simple truth – as simple as some guy with a computer using thirty pictures of his ex and this app to create a porn video. It’s fake, but by the time someone peels off the layers and exposes it, the damage is done. In the age of social media, fake news spreads like wildfire, and there is no reason to expect that a fake porn video will not circulate in the same way.
Enter Revenge Porn – Deepfakes And Selfies
Revenge porn is something we are already aware of, but it could largely be prevented if one partner refused to let the other take or save any visuals of their physical intimacy. With deepfakes, however, ordinary pictures will do just fine. In a selfie-driven generation, how long will that take? All someone has to do is take the pictures already available on Facebook or Instagram and map the faces.
You might think there is a solution to this – that there must be some way or other to stop the mayhem. Frankly, even if there is, no one knows it at this point. The main Reddit page for FakeApp may have been deleted, but its torrent is still out there; I should know, because while researching this article I found four sites hosting it. I don’t think the craze will die out anytime soon. Some will use it as a quirk, like the several Reddit users who are using the software to put Nicolas Cage into any movie clip they can get their hands on.
However, the majority will be instances such as this –
“Super quick one – just learning how to retrain my model. Around 5ish hours – decent for what it is,” posted Redditor UnobtrusiveBot after putting actress Jessica Alba’s face on adult star Melanie Rios’ torso. There was even a query like this – “Is it possible to edit your crush in a porn video?”
The app in question has already gone through three revisions to make it more robust and user-friendly. While it continues to parade as an application that “allows users to easily create realistic high-quality faceswap videos,” what it is actually capable of has not escaped the notice of others.
This is an ongoing disaster and a by-product of technology: one person creates to innovate, but those who get their hands on the creation end up using it for their own perverse needs.