Due to the rapid rise in popularity of this subreddit, the practice and use of this technology has come under increasing scrutiny and criticism. Deepfakes has started to garner the attention of several national news organizations, and with that comes the disdain and ire of the public and those who represent them.
To those who condemn the practices of this community, we sympathize with you. What we do here isn't wholesome or honorable; it's derogatory, vulgar, and blindsiding to the women that deepfakes works on.
That said, the work that we create here in this community is not made with malicious intent. Quite the opposite. We are painting with revolutionary, experimental technology, one that could quite possibly shape the future of media and creative design.
There is no meaningful difference between realistic faked videos and realistic faked still images, which have existed on the internet for decades without issue and have become a mainstream medium for viewing your favorite celebrities in a different light than they usually appear in. Which is a fancy way of saying it shouldn't matter, because we've been doing this for years now.
This technology is very new. And new scares people. A lot. But while the circumstances may be different, and the applications of this technology far exceed those of, say, Photoshop, the end result is the same: faked images.
There are two arguments I'd like to set out here.
The first one is the possibility and plausibility of deepfakes technology being used to cause serious damage to someone's reputation, or even be used to forge evidence in court.
As incendiary and terrifying as this might be, it is genuinely a non-issue. Consider Photoshop, which is far more accessible and user-friendly than deepfakes and has existed for years. I have never heard of a revenge porn photo created through Photoshop that was taken seriously by peers, nor any celebrity fakes that were taken as real by the public. The same principle applies to forged evidence in court cases.
There are many ways to tell whether a photo or video has been doctored. Running images through certain forensic filters can quickly reveal what is real and what is fake.
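As a concrete illustration of the "filters" idea, here is a minimal toy sketch in Python of one such forensic technique, error-level analysis (ELA). The function names, the quantization step, and the sample pixel values are all illustrative assumptions; real forensic tools operate on actual JPEG compression of full images, not this simplified one-dimensional quantization. The core idea is the same: regions that have already been through lossy compression leave a near-zero residual when re-compressed, while freshly pasted-in regions usually do not.

```python
def quantize(pixels, step):
    """Mimic lossy compression by snapping values to multiples of step."""
    return [step * round(p / step) for p in pixels]

def error_level(pixels, step):
    """Residual left behind by one more round of (re)compression."""
    return [abs(p - q) for p, q in zip(pixels, quantize(pixels, step))]

# An "original" strip of pixels that was already compressed once at step 8...
background = quantize([37, 121, 200, 90, 64, 150], step=8)
# ...with a pasted-in patch that never went through that compression.
tampered = background[:3] + [131, 77, 242]

residual = error_level(tampered, step=8)
# The untouched region shows zero residual; the spliced values generally
# do not, which flags the doctored region.
print(residual)  # → [0, 0, 0, 3, 3, 2]
```

In a real ELA workflow the image is re-saved as a JPEG at a known quality and the pixel-wise difference is visualized: edited areas stand out because their compression history differs from the rest of the frame.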
Also, in an ironic twist of fate, it appears that as deepfakes becomes more mainstream, revenge porn becomes LESS effective, not more. If anything can be real, nothing is real. Even legitimate homemade sex videos used as revenge porn can be waved off as fakes as this technology becomes more prevalent.
The final thing I'd like to say is this. While it might seem like the LEAST moral thing to be doing with this technology, I think most of us would rather it be in the hands of benign porn creators shaping the technology to become more focused on internet culture, rather than in the hands of malicious blackhats who would use this technology exclusively to manipulate, extort, and blackmail.
EDIT: Something I should make VERY clear:
What I said about deepfakes being in the hands of the "good" guys never meant that it WOULDN'T then be in the hands of the bad guys. What I meant (and should have clarified) is that if the technology is in the hands of everyone, rather than a distinct few, then the bad guys can't USE it maliciously, or at the very least it will be much harder. Imagine if only two people had Photoshop. Anything they created would be deemed real and valid without intense scrutiny of their work. But since everyone has or knows about Photoshop, we casually pass off celebrity nudes as fakes until proven otherwise.
What I'm trying to say is that if deepfakes is widely known, it's much harder for extortionists to extort people with something that can be widely recognized as illegitimate.
No matter what happens, this technology was going to become a reality. Nothing could stop that. And ironically enough, the safest hands for it to be in might just be those of the general public, with the power to desensitize people to it, rather than an exclusive few, with the power to exploit it.
Thank you.