More precisely, she wants the distribution of nudes without consent, real or faked, to be considered an offense. This isn't about the tool, but what's being done with it.
That doesn't seem like an accurate summary. The actual quotes address the generation of those images, not just the distribution:
> "Parliament needs to have the opportunity to debate whether nude and sexually explicit images generated digitally without consent should be outlawed, and I believe if this were to happen the law would change."
And it's about restricting the tools, not just what's done with them:
> "If software providers develop this technology, they are complicit in a very serious crime and should be required to design their products to stop this happening."
>> "If software providers develop this technology, they are complicit in a very serious crime and should be required to design their products to stop this happening."
so we should outlaw any piece of technology that can be misused?
This is what happens when technologically inept people make laws to regulate technology. By this standard, cameras should be outlawed in case they're used to take sexually explicit pictures without consent. Computers should be outlawed in case they're used to distribute anything illicit.
Outlawing the production (by any means) and/or distribution (by any means) of non-consensual, sexually explicit material should be enough. If someone can show evidence that someone has produced or shared such content, then that's enough to demonstrate the offence.
No need to go stopping the rest of us from using technology just because some disgusting excuse for a human decided to demonstrate some of the worst of humanity with said technology.
I don't think this applies. Cameras are general tools that have many useful purposes. So are computers.
This is specifically an AI tool that creates fake nudes. Whatever useful, appropriate purposes this tool might have (if any), they pale in comparison to your examples.
The quote was about the technology in general, not the specific application.
The technology has plenty of applications in CGI and in trying to push through the uncanny valley. By no means am I claiming that a tool designed to generate sexually explicit material without consent is an appropriate use. But the fact is, technology is used and abused every day without such huge sanctions being put on it. What makes this any different? The fact that it's new.
But it's not a new trend by any means. 20 years ago, 'x-raying' was a thing. Photoshop someone to look like their clothes are transparent. Or the bubbles effect, using circles cut out of a mask layer to hide clothing to make the image look like a nude without showing anything explicit.
The fact is, outlawing technology just means something new will turn up to replace the old. Whereas making the end product illegal to possess and/or distribute gives a far more beneficial power to law enforcement and the legal system. "I'm sorry officer, these were generated using a different tool, so they're actually legal" or "you can't prove what tool was used to create them" would be perfectly viable defences if the tool is outlawed rather than the act itself.
> should be required to design their products to stop this happening.
How?
How are you going to stop people from sharing images of what your software produced?
There might be more context to this quote, but it is insane how out of touch lawmakers often appear to be with respect to technology. Just look at all the cookie banners plastered with dark patterns, which completely nullify the idea behind them.
How do we create working legislation for technology?
> Just look at all of the cookie banners plastered with dark patterns
That's thanks to webmasters who want to track visitors on the first visit, instead of, say, having an opt-in link somewhere in the footer. It's amazing how much FUD is spread about the GDPR, and it's ironic in the context of accusing lawmakers of cluelessness, since in this case they demonstrated more clue than millions of people apparently have about their own jobs.
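For what it's worth, here's a minimal sketch of that footer opt-in approach (the storage key, function names, and script URL are all hypothetical): no tracking script loads until the visitor explicitly opts in, so a first-time visitor never sees a banner at all.

```typescript
// Minimal opt-in analytics loader (illustrative sketch only).
// Nothing is loaded on a first visit; analytics only run for
// visitors who have clicked an "Enable analytics" link in the footer.

const CONSENT_KEY = "analytics-consent"; // hypothetical storage key

function hasConsent(): boolean {
  return localStorage.getItem(CONSENT_KEY) === "granted";
}

function loadAnalytics(): void {
  const script = document.createElement("script");
  script.src = "/vendor/analytics.js"; // placeholder script URL
  script.async = true;
  document.head.appendChild(script);
}

// Wired to the footer link's click handler.
function optIn(): void {
  localStorage.setItem(CONSENT_KEY, "granted");
  loadAnalytics();
}

// On page load: only returning visitors who opted in get tracked.
if (hasConsent()) {
  loadAnalytics();
}
```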
"Digitally" has become synonymous with "on a computer" or "algorithmically", because we no longer have any non-digital machines that we refer to as computers.
Yes, but it shouldn't matter whether I make pictures of others on my PC or by hand if the other person didn't consent to me making those pictures. Regardless of the type of picture.
A definition of photograph is not provided in the text of the act, but I would guess "appears to show" covers this. There will probably have to be a test case. That is, if it looks like or is claimed to be a photograph it's covered.
A real, light-captured photograph can be of an innocent situation that appears compromising. Angles, hidden partitions, even a hand temporarily over the wrong place.
The "appears to show" applies to the act captured, not the medium, AFAICS.
This is an important difference, and, for me at least, the difference between disagreeing (“ban the tool”) and agreeing (“make non-consensual nude distribution illegal”) with the proposal.
It’s a shame the headline communicates the former when it seems the proposal is the latter.
> It’s a shame the headline communicates the former when it seems the proposal is the latter.
Because a headline blaring about AI X-ray specs grabs attention, making people more likely to click the link.
You can see the effect in action right here: the link made it to the top of the front page in less than an hour of being submitted.
Besides that, though, the article does contain the typical misunderstanding politicians have about how software works ("software developers should be forced to make $SOFTWARE so that it can't do $MY_SPECIAL_EXCEPTION_CASE"), which is problematic per se, unrelated to any machine-learning aspect.
It seems right to me that you should not be able to publish something that looks like a recording of person X doing/saying/participating in something, without either getting the consent of person X, or making it very, very clear in the published material that it is a fake. (I'm not claiming this is how the law currently works; I'm not an expert on that)
For nudity and pornography I think there is an added twist that even if the viewer knows that it's fake, there is an element of violation of the real person there. I'm not quite sure where I would draw the line about that.
If you're hurting a real person by making naked pictures of them against their wishes, that should be what the law prevents. No matter what tricksy means you do it with. Whether it's photo manipulation, AI, or really skilled photorealistic oil painting, is immaterial.
From the person who appears to be depicted in it. Not that different from an image being a picture painted by an artist from memory/imagination, honestly. In the end, it basically boils down to having the rights to your appearance.
Yes, but similarity is not uniquely owned... Two people may resemble each other, and one may have given consent. Does the other have a right to demand consent too?
Hypothetical example: A person who happens to look like Boris Johnson (PM of UK) consents to a nude depiction. Does the image require the consent of all other people who have resemblance to Boris Johnson, including Boris Johnson but also extending to people elsewhere on Earth that don't know him but definitely look like him?
If yes... then you have lost control of the rights to your appearance, because now those rights are collectively owned by everyone with a resemblance.
I wonder whether saying that you based your nude on Joe Smith who you got consent from, would count as a defense? Even if Joe Smith happens to look like some celebrity that you didn't get permission from.
I think this should only work if you could reasonably be assumed to be unaware of the existence of any look-a-likes (including the celebrity) you don't have a permission from.
You wouldn't even need to be clever. Use a bunch of stock photos mixed in with the target photo, use a Co-pilot-level GAN to "sort" the bodily features of the photo in a way that suits your liking, and, voila! "It's an algorithmic choice. It can't be helped." That, or anime-ification.
But none of this would be needed if people respected freedom of speech. No person should be obligated to abstain from composing the digital equivalent of a nude statue because it can resemble a living person. There shouldn't even be a need to ask what happens when satire and sexuality are policed. Anyone who has been reading the news over the past few decades would know the answer to that.
Even if a "malicious" intent were relevant to the production or distribution of deepnudes being considered a crime (which I am convinced is not a crime under any circumstances), judges and lawmakers have historically shied away from arguments of intent in the past under the argument that it's too difficult and time consuming. Present laws regarding "revenge" porn , for example, are assessed under strict liability and don't require any proof of any actual revenge plot being involved. There was a case in Illinois [1] in which the defendant was trying to prove her ex-boyfriend's infidelity by distributing photos he had carelessly synchronized to her iCloud account. In doing so she was tried with distribution of non-consensual pornography without any consideration for intent. Her legal team appealed on 1st amendement grounds where the Illinois law did not apply strict scrutiny standard required for rulings that curtail free speech on the supposed basis of serving a compelling government interest. The appeals were eventually filed all the way to the Supreme Court but the case was not taken up.
If a deepnude ban were to happen, I don't expect the arguments and legal standards under which such a ban is judged to be any different. That's what I find troubling.
Are these images 100% generated, or are they generated from a "seed" image with the express intention of emulating its likeness, such that the generated image is intended to be indistinguishable from the subject of the original image? (Note: art is not under the microscope here, a computer program is.)
I think your second sentence trips back into the distinction the parent comment is making. When people post themselves nude, no one cares (barring some specific contexts); but if someone posts a picture of you nude without your consent, then you probably care and consider it a violation of your privacy. Whether nudity is permitted is not really at question. What is, is whether computer programs that generate nude images for the express purpose of making the real or generated image indistinguishable from a self-posted one violate consent and privacy laws.
> If an image is 100% generated, from whom should consent be obtained?
This isn't talking about 100% generated content. It is talking about instances where apps (either automated examples like the one named or photo editing suites) are used to manipulate an image of a real person to make it look like they are naked.
Would you want a convincing image of yourself in your birthday suit out there?
Or to take it a bit further, an image that makes it look like you attended a party with Trump & Epstein in such a state?
Even if it were possible to argue no privacy had been invaded because that isn't really the person's naked form, there is the potential reputational damage to consider, both professional and personal.
Obviously, fake nudes have been a thing for a long time, even convincing ones, but the issue was limited by the amount of time and/or skill required. The newer tech available in certain apps today makes it very easy to make truly convincing images.
I think outlawing this tool would be counter-productive.
One of the reasons the release of nudes is damaging is that it's a rare enough and noteworthy event. If, because of this tool, everyone has nudes of them floating around, then it would become a normal thing and would actually remove most of the damage around real nudes leaking by providing plausible deniability (assuming anyone even cares at that point; if the world is drowning in nudes of everyone, the real thing will probably go unnoticed anyway).
Outlawing the tool wouldn't actually stop malicious usage of it, but because only criminals would use it, its (rarer) usage would be more damaging than if anyone could legally use such a tool and nudes stopped being a noteworthy event.
In America, the federal child pornography law applies only to depictions of an actual child (and you have to know it, for possession offenses, though that's another matter). But the Justice Department has long taken the position that an image of a clothed child that's altered to make the child look nude (they used to call these "morphed" images) counts. I don't think it's ever been definitively resolved by the Supreme Court, and I don't know what the courts of appeals have said, but tools like DeepSukebe have made that argument way more appealing. I'd bet that this is where regulation will begin: images of children. That has always been a domain where American courts have been extremely reluctant to intervene; for example, any visual depiction of a seventeen-year-old engaged in sex is proscribable without resort to the ordinary inquiry into whether the work as a whole is "obscene", etc.
But under prevailing American First Amendment law, it gets a lot harder to explain why a law like the one being proposed here would be acceptable. The Supreme Court has, for example, held that the distribution of animal-cruelty videos cannot be forbidden. And it's not clear to me how one could proscribe the distribution of an imaginary visual depiction of an adult who was nude. You could call it defamatory, I suppose, but if it's concededly fictional... I don't know.
Actually, SCOTUS has ruled that laws banning obscene material are permitted under the First Amendment.
There is a three-part test (the Miller test): whether the average person, applying contemporary community standards, would find that the work appeals to the prurient interest; whether it depicts sexual conduct in a patently offensive way; and whether, taken as a whole, it lacks serious literary, artistic, political, or scientific value.
SCOTUS actually heard appeals on the "obscenity" of material on a case-by-case basis for a while, decades back.
More specific to this case is the PROTECT Act [1]. I don't know whether it's ever been ruled against or whether SCOTUS has accepted that all such depictions of minors are obscene...
I mean, for decades you couldn't swear on TV (and still can't), so I feel like there's a "technically legal to possess, but illegal to send on the internet" avenue.
> And we're talking about the UK with no right of free speech
Come on now, no need to exaggerate. The UK has limits on free speech, as do most countries (even the holier-than-thou US, if I'm not mistaken). You might consider them too restrictive, but that doesn't mean there's no right of free speech.
In the example you're quoting, you're aware that the restrictions were on the political wing of what was basically a terrorist organisation (the IRA), with which the UK was in what was basically a (civil) war?
I mean, there's no codified constitutional right that cannot be superseded by ordinary legislation. There is the ECHR right to free speech, which has qualifications, but there is constant muttering from the Tory party about repealing the ECHR. Which can be done by ordinary legislation.
> In the example you're quoting, you're aware that the restrictions were on the political wing of what was basically a terrorist organisation (the IRA), with which the UK was in what was basically a (civil) war?
> I mean, there's no codified constitutional right that cannot be superseded by ordinary legislation. There is the ECHR right to free speech, which has qualifications, but there is constant muttering from the Tory party about repealing the ECHR. Which can be done by ordinary legislation.
So there is a right, which might get repealed some day by ordinary legislation with ordinary quorum. But it's still there.
> Yes, I literally said that in my comment.
There's a slight difference between "links to a terrorist organisation", which is pretty vague and can mean lots of things, and literally the political wing of a terrorist group.
He wasn’t trying to blow stuff up in the US, though.
It’s also inaccurate to describe Gerry Adams as an “opposition politician”. The opposition in British politics is the largest party opposing the government, and has a semi-official status. Adams was not in the Labour party and never even attended Parliament. So to British ears, it’s very misleading to say that there was a TV ban on an opposition politician.
Chris Morris did an interesting satirical piece on this in Brass Eye where he placed photos of children alongside slightly less savory material and asked the former head of the Obscene Publications Branch if they were illegal or not: https://youtu.be/RcU7FaEEzNU?t=1016
It will be interesting to see how this kind of thing plays out. I’m sure it’s quite distressing if a tool like this is used on your photo and then potentially shared in your friendship group. Hopefully we very quickly get to the point that nobody will ever be able to know if a photo is real or fake and it’s just not considered an issue therefore. Policing it seems like it would be extremely difficult. Maybe we police the intent? In other words you can produce the images but if you use it maliciously against a person then there is a crime.
> Hopefully we very quickly get to the point that nobody will ever be able to know if a photo is real or fake and it’s just not considered an issue therefore.
The ongoing "fake news" crisis proves that we are, in general, bad at spotting fakes - that is the purpose of a fake - and will also vehemently disagree about what is fake. Especially when it can be used for political purposes. Expect the first world leader brought down by a deepfake in the next decade.
I'm wondering if there are existing laws that would cover the intent part. Defamation and harassment laws, etc. Maybe they just need amending. Trying to police the app's exact functionality directly seems counter-productive.
I wonder as well; people have been putting their friends' or adversaries' heads on nude model bodies ever since Photoshop came out. Is that a crime, and has it ever been prosecuted (maybe as defamation)?
I remember when deepfakes were first released, there was a group who would deepfake coworkers, Facebook friends, etc. for a really low cost (like $100), as long as the target had a few hundred public photos.
This is without consent as well, but it’s also not real. It seems like the equivalent of imagining people nude. Kind of creepy if I know it’s happening but not truly a violation of my privacy.
That’s a good point in that the distribution is different.
Maybe the better analogy for imagining nude would be to just make an image and not show it to anyone. So the issue is commercialization or distribution.
In the future when we have digital consciousnesses running on Google or Amazon or whatnot, will we be prevented from imagining people nude without their consent because the mind will be replicated across multiple availability zones?
There is a continuum there between harmless to deeply offensive. The exact location in the continuum will at the very least depend on the person being subjected to this treatment and the cultural context.
The "AI" aspect will amplify the offense because of how life-like the end result can be.
I'm guessing that won't help even if it becomes a reality. These days even proper paternity tests are useless if one ends up on a birth certificate without his knowledge. You're still liable even if not guilty.
Edit: That being said... this naked-faking app can cause a lot of problems in the workplace and at home.
Yeah I think it's a great idea for a site/service much akin to the OP.
I imagine there would be a big draw just sticking in famous parents and kids to see if there is a match.
Depends if they care about commercial or non-commercial stuff. Banning commercial use of software is fairly easy (tell the gatekeepers to disallow it, or at the very least to let the police find who did it); banning non-commercial use isn't strictly impossible, but I reckon that to be effective it would need governments to treat international internet connections like any other international border, and they are not prepared for the side effects of doing that.
Most people don’t grok computers, so a commercial ban would probably cover most people, yet Bush-bin-Laden photoshop I saw back in the early noughties would still get made and shared.
I believe Virginia has already passed legislation so broad and abstract as to make Photoshop and other image-editing tools illegal. Like most laws of this type, it's not about stopping an immoral or illegal activity. It's about controlling the public image of people in power. It will only be enforced if you piss off someone rich or powerful.