
European here, sorry if I misunderstood, but I always thought the "free speech" protected by the first amendment only protects your right to e.g. voice your political opinion without being punished for your particular views, not your preferences in porn (or the distribution and even existence of it)? As an extreme counterexample the government banned CSAM and one couldn't weasel their way out of such a case by pleading first amendment, right?


That's such a complicated case that it's an increasingly bad example.

CSAM is basically evidence of a crime, so it's treated as such: private entities are pressured to monitor for it, detect it, quarantine it, etc.

Which, in some abstract sense, is okay. (We have data showing rising child abuse, including sexual abuse, and rising CSAM on the internet, so some very effective interest groups are trying to "do something".) Now, it might not come as a complete surprise to you here on HN that the pragmatic concerns about the approach, the proposals, the implementation, the lawcraft, and so on far outweigh even the wildest claims of benefits. The whole enterprise is very counterproductive, at best security theater, but in reality a fucking public-interest disaster (FOSTA/SESTA, basically a War on Sex Work, yet another highly acclaimed installment of the War On series!), and, as a convenient secondary effect, a "free moat" for the incumbents.

The obvious legal issue is that it's "impossible" to say what is CSAM and what merely looks like it. For example, what if there's a machine that generates pictures that look like CSAM? You might advertise it, but likely in no time the DoJ and a bunch of other federal and state entities would sue you (and of course arrest, raid, and detain), arguing that somewhere in the process there's actual CSAM, its distribution, etc.

And even if a plain "aliens in a vacuum" reading of the law suggests that the government would eventually lose these cases, it's very realistic that the courts themselves would just make new law that effectively criminalizes this even more victimless version too.

But, all in all, with enough advocacy it's possible; it's just that very few people want to spend their lives on this.


If AI generated CSAM is poisoned by actual CSAM in the process, wouldn't that extend to AI generated art being poisoned by copyrighted work in the process?


It's poisoned by the fact that it requires access to CSAM, which makes the operation illegal, even if the output might not be illegal.


Logically, but not necessarily legally.


It wasn’t that long ago that porn was banned in the US, and ‘obscene’ material technically still is. Obscenity is an exception to First Amendment protection; most porn is allowed because it doesn’t meet the legal definition of obscenity.

[https://en.m.wikipedia.org/wiki/Legal_objections_to_pornogra...]


Banning CSAM doesn't violate the first amendment because the supreme court decided that obscenity isn't protected speech (and, in New York v. Ferber, that child pornography isn't protected even when it isn't obscene). It isn't that the first amendment only protects political speech. Like with abortion, obscenity limitations, particularly those related to children, are just something the court made up, and no one really openly disagrees with them (for obvious reasons).

In a way it shows that the first amendment isn't really all that absolute. Say something that enough powerful people find gross and they'll just make it out to be an unstated exception.


It is actually freedom of expression, which covers speech, art, religion, etc. Porn/sex work can be any and all of those.


From what I understand, it even covers campaign contributions.



