It seems right to me that you should not be able to publish something that looks like a recording of person X doing/saying/participating in something, without either getting the consent of person X, or making it very, very clear in the published material that it is a fake. (I'm not claiming this is how the law currently works; I'm not an expert on that.)
For nudity and pornography I think there is an added twist that even if the viewer knows that it's fake, there is an element of violation of the real person there. I'm not quite sure where I would draw the line about that.
If you're hurting a real person by making naked pictures of them against their wishes, that should be what the law prevents. No matter what tricksy means you do it with. Whether it's photo manipulation, AI, or a really skilled photorealistic oil painting is immaterial.
From the person who appears to be depicted in it. Not that different from a picture painted by an artist from memory/imagination, honestly. In the end, it basically boils down to having the rights to your appearance.
Yes, but similarity is not uniquely owned... two people may resemble each other, and one may have given consent. Does the other have a right to demand consent too?
Hypothetical example: a person who happens to look like Boris Johnson (PM of the UK) consents to a nude depiction. Does the image require the consent of all other people who resemble Boris Johnson, including Boris Johnson himself, but also extending to people elsewhere on Earth who don't know him but definitely look like him?
If yes... then you have lost control of the rights to your appearance, because now the rights are collectively owned by anyone with a resemblance.
I wonder whether saying that you based your nude on Joe Smith, whom you got consent from, would count as a defense? Even if Joe Smith happens to look like some celebrity you didn't get permission from.
I think this should only work if you could reasonably be assumed to be unaware of the existence of any look-alikes (including the celebrity) you don't have permission from.
You wouldn't even need to be clever. Use a bunch of stock photos mixed in with the target photo, use a Copilot-level GAN to "sort" the bodily features of the photo in a way that suits your liking, and, voila, "It's an algorithmic choice. It can't be helped." That, or anime-ification.
But none of this would be needed if people respected freedom of speech. No person should be obligated to abstain from composing the digital equivalent of a nude statue because it might resemble a living person. There shouldn't even be a need to ask what happens when satire and sexuality are policed; anyone who has been reading the news over the past few decades already knows the answer.
Even if a "malicious" intent were relevant to the production or distribution of deepnudes being considered a crime (which I am convinced is not a crime under any circumstances), judges and lawmakers have historically shied away from arguments of intent in the past under the argument that it's too difficult and time consuming. Present laws regarding "revenge" porn , for example, are assessed under strict liability and don't require any proof of any actual revenge plot being involved. There was a case in Illinois [1] in which the defendant was trying to prove her ex-boyfriend's infidelity by distributing photos he had carelessly synchronized to her iCloud account. In doing so she was tried with distribution of non-consensual pornography without any consideration for intent. Her legal team appealed on 1st amendement grounds where the Illinois law did not apply strict scrutiny standard required for rulings that curtail free speech on the supposed basis of serving a compelling government interest. The appeals were eventually filed all the way to the Supreme Court but the case was not taken up.
If a deepnude ban were to happen, I don't expect the arguments and legal standards under which such a ban is judged to be any different. That's what I find troubling.
Are these images 100% generated, or are they generated from a "seed" image with the express intention of emulating that original image's likeness, such that the result is intended to be indistinguishable from the original subject? (Note: art is not under the microscope here; a computer program is.)
I think your second sentence trips back into the distinction the parent comment is making. When people post themselves nude, no one cares (barring some specific contexts); but if someone posts a picture of you nude without your consent, then you probably care and consider it a violation of your privacy. Whether nudity is permitted is not really in question. What is in question is whether computer programs that generate nude images for the express purpose of making the result indistinguishable from a self-posted image violate consent and privacy laws.
> If an image is 100% generated, from whom should consent be obtained?
This isn't talking about 100% generated content. It is talking about instances where apps (either automated examples like the one named or photo editing suites) are used to manipulate an image of a real person to make it look like they are naked.
Would you want a convincing image of yourself in your birthday suit out there?
Or, to take it a bit further, an image that makes it look like you attended a party with Trump & Epstein in such a state?
Even if it were possible to argue no privacy had been invaded because that isn't really the person's naked form, there is the potential reputational damage to consider, both professional and personal.
Obviously fake nudes have been a thing for a long time, even convincing ones, but the issue was limited by the amount of time and/or skill required. The newer tech available in certain apps today makes it very easy to produce truly convincing images.
Especially if "this image is similar to me" is factored in, what degree of similarity makes an image a representation of a real person?
We're not far from this being less about privacy and dignity, and more about whether the idea of nudity is permitted.