Hacker News

> Being able to pick what content they host is fundamental to freedom of speech for private entities

Interesting position - when somebody posts illegal content on YouTube, they are not liable, it’s not their speech.

But when I want to post something they don’t like, suddenly it’s their freedom of speech to remove it.

A lot of the breakdown in society lately clearly comes from the fact that some people/companies get to have it both ways when it suits them.



The solution would be to revoke Section 230 protection from any platform which acts as a digital public square if it does moderation beyond removing illegal content.

Of course they would try their best to be excluded, to have their cake and eat it too.


The entire point of section 230 is to allow platforms to remove non-illegal content [1].

Basically there were two lawsuits about platforms hosting third-party content. One of the platforms tried to curate content to create a family-friendly environment. The second platform just didn't take anything down. The first platform lost its lawsuit while the second won. Congress wanted to allow platforms to create family-friendly environments online, so Section 230 was written.

[1]: https://en.wikipedia.org/wiki/Section_230#


If something like that were put in place, any platform acting as a “public square” should also be required to disable all recommendation and content-surfacing features, algorithmic or otherwise, aside from search.

Those recommendation features already do plenty of damage even with platforms having the ability to remove anything they like. If platforms are restricted to only removing illegal content, that damage would quickly become much greater.


You need moderation for more than legality though; otherwise you can't have open forums like this one that aren't total cesspits.


Right:

* When a bot farm spams ads for erectile dysfunction pills into every comment thread on your blog... That's "legal content"!

* When your model-train hobbyist site is invaded by posters sharing swastikas and planning neo-nazi rallies, that too is "legal content"--at least outside Germany.

All sorts of deceptive, off-topic, and horribly offensive things are "legal content."


Sadly it turns out that the biggest driving force is politics, and the inability of our institutions to win with boring facts against fast-and-loose, engaging content.

The idea is that in a competitive marketplace of ideas, the better idea wins. The reality is that if you don't compete on accuracy, but compete on engagement, you can earn enough revenue to stay cash-flow positive.

I would say that as the cost of making and publishing content went down, the competition for attention went up. The result is that expensive-to-produce information cannot compete with cheap-to-produce content.


Your premise is incomplete. When someone posts illegal content on YouTube they are not liable if they are not aware of the illegality of that content. Once they learn that they are hosting illegal content they lose their safe harbor if they don't remove it.


Please don't post deliberately false information on HN.

https://www.law.cornell.edu/uscode/text/47/230


Let me rephrase, since saying they lose their safe harbor was a poor choice of words. The safe harbor does indeed prevent them from being treated as the publisher of the illegal content. However illegal content can incur liability for acts other than publishing or distributing and section 230's safe harbor won't protect them from that.


I find it hard to believe there is any content on the YT platform that they are unaware of.


I mean what do you think happens? Do you think YouTube employs an army of people to watch and vet every single video that gets posted there?


No, I think YT uses AI to categorize and vet media based on a standard rubric, at a pace that exceeds a human collective by orders of magnitude.

They know about it as soon as you post it.


The reason we're having this discussion on this particular post is that YT's AI is not infallible. There isn't a "standard rubric", just automated correlation-based scoring derived from labeled training data. In this case, the AI learned that media piracy and self-hosted setups are correlated, but without actual judgement or a sense of causality. So YT doesn't truly "know" anything about the videos despite the AI augmentation.

I am curious what you consider to be a "standard rubric". Would it be based on the presence of keywords, or would it require a deeper understanding of meaning to differentiate the study/analysis of a topic from the promotion of it?
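A toy sketch of what "correlation-based scoring" means in practice (purely illustrative; the keyword list and scoring function here are hypothetical, not anything YouTube actually uses):

```python
# Illustrative only: a scorer that flags text by counting tokens that
# (hypothetically) co-occurred with piracy in labeled training data.
# It captures correlation, not intent, so a legitimate self-hosting
# tutorial scores higher than an unrelated video title.

PIRACY_CORRELATED = {"torrent", "stream", "plex", "jellyfin", "rip", "media", "server"}

def piracy_score(text: str) -> float:
    """Fraction of tokens that appear in the correlated-keyword set."""
    tokens = [t.strip(".,!?").lower() for t in text.split()]
    if not tokens:
        return 0.0
    return sum(t in PIRACY_CORRELATED for t in tokens) / len(tokens)

tutorial = "How to stream your own media with a self-hosted Jellyfin server"
recipe = "How to bake sourdough bread at home with a starter"
print(piracy_score(tutorial) > piracy_score(recipe))  # prints True
```

The point of the sketch: the score rises on any text that shares vocabulary with the positive training examples, whether it discusses, analyzes, or promotes the topic, which is exactly the failure mode being described.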


Automated correlation-based scoring derived from labeled training data would be the standard rubric.


> Interesting position - when somebody posts illegal content on YouTube, they are not liable, it’s not their speech.

> But when I want to post something they don’t like, suddenly it’s their freedom of speech to remove it.

There is no contradiction there.

Imagine a forum about knitting. Someone who has it in for the owners of this knitting forum (or perhaps even just a SPAM bot) starts posting illegal, or even just non-knitting, content on this forum.

The entire purpose of the forum is to be a community about knitting.

Why is it the legal or moral responsibility of the knitting forum to host SPAM content? And why should they be legally liable for someone else posting content on their platform?

You're equating specific pieces of content with the platform as a whole.

There is no reality where I will accept that if I create something, I spend and risk my money on web hosting, I write the code, I put something out there... that other people get to dictate what content I have to distribute. That's an evil reality to contemplate. I don't want to live in that world. I certainly won't do business under those terms.

You're effectively trying to give other people an ultimatum in order to extract value from them that you did not earn and have no claim to. You're saying that if they don't host content that they don't want to distribute that they should be legally liable for anything that anyone uploads.

The two don't connect at all. Anyone is, and should be free to create any kind of online service where they pick and choose what is or is not allowed. That shouldn't then subject them to criminal or civil liability because of how others decide to use that product or service.

Imagine if that weird concept were applied to offline things, like kitchen knives. A kitchen knife manufacturer is perfectly within their rights to say "This product is intended to be used for culinary purposes and no other. If we find out that you are using it to do other things, we will stop doing business with you forever." That doesn't then make them liable for people who use their product for other purposes.


This isn’t really what’s being argued. We’re not talking about a knitting forum. We’re talking about content neutral hosting platforms. There is a distinction in the law. If you want to not be liable for the content posted to your platform, then you may not moderate or censor it. That seems like a fair compromise to me. Either you are a knitting forum carefully cultivating your content and thus liable for what people see there, or you are a neutral hosting service provider. Right now we let platforms be whichever favors their present goal or narrative without considering the impact such duplicity has on the public users.


> We’re talking about content neutral hosting platforms.

There is no such thing as a "content neutral hosting platform." I know that people like to talk about social media services under the same umbrella as the concept of "common carrier", which is reserved for things like mail service and telecommunications infrastructure. And that might be what you're conflating here. If you're not, then please point me to the law, in any country even, where "content neutral hosting platform" is a defined legal term.

> If you want to not be liable for the content posted to your platform then you may not moderate or censor it seems like a fair compromise to me.

Compensation for what? The "platform" built something themselves. They made it. They are offering it on the market. If anyone is due compensation, it is them. No matter how much you don't like them. You didn't build it. You could have, maybe. But you didn't. I bet you didn't even try. But they did. And they succeeded at it. So where does anyone get off demanding "compensation" from them just for bringing something useful and valuable into existence?

That is a pretty messed up way of looking at things IMO. It is the mindset of a thief.

> Either you are knitting forum carefully cultivating your content and thus liable for what people see there,

Thank you for conceding my argument and shining a spotlight on how ridiculous this is. You agree that according to your world view, the knitting forum should be liable for the content others post on it just because they are enforcing that things stay on topic. Even just removing SPAM bot posts would expose them to this liability.

> Right now we let platforms be whichever favors their present goal or narrative without considering the impact such duplicity has on the public users.

The beautiful thing about freedom is that as long as people don't infringe upon the rights of others, they don't need your permission to just go build things and exist.

The YouTube creators didn't have to ask you to "allow" them to build something useful and valuable. They just went and did it. And that's how it should be.

I get that certain creators run into trouble with the TOS. Hell, I've tried to create an Instagram account on several occasions and it gets suspended before I can even use it. And when I appeal or try to ask "why?" I never get answers. It's frustrating.

But the difference between you and me, is I don't think that people who build and create things and bring valuable shit into existence owe me something just by virtue of their existence.


> The beautiful thing about freedom is that as long as people don't infringe upon the rights of others, they don't need your permission to just go build things and exist

This is hollow sophistry, and it’s not how things actually are.

You don’t have freedom for self-dealing, price fixing, collusion, bribery, false marketing, antitrust violations, selling baby powder with lead, and many other things.

In some states you can’t even legally collect rainwater.

Also the government will come after you with guns and throw you in jail if you violate some bogus and fictitious “intellectual property rights” that last for 70 years after the creator has died.

It’s unhelpful to pretend we live in a Wild West of liberty.


> You don’t have freedom for self-dealing, price fixing, collusion, bribery, false marketing, antitrust violations, selling baby powder with lead, and many other things.

It's funny how often people will not read what you wrote, and instead read what they want to read.

Not only did my comment preempt that specific reply of yours in the very sentence you quoted, but you seem to have a warped working definition of the word "freedom": where you think that if someone uses it they mean "freedom to do literally whatever the hell they want to no matter who they hurt."

That means that your mental model of the word "freedom", at least when you hear others say it, begins with a straw-man.

No discussion is possible under those conditions.

I'll help you out: my personal operating definition of "liberty" is "An environment in which all interpersonal relations are consensual."

That's why, as long as you are not infringing upon the rights of others (the part of my quote that you completely dropped and ignored so that you could react to what you wanted to read instead of what I actually wrote), you don't need the permission of others to build something. You can just go and do it.


> the concept of "common carrier"

So then, your actual opinion is that yes, a "content neutral hosting platform" does exist?

It seems very obvious here that people are saying that the laws that apply to common carriers could be changed so they apply to social media platforms.

Problem/confusion solved, and the world doesn't fall apart. We already have these laws, and the world didn't fall apart before.


> So then, your actual opinion is that yes, a "content neutral hosting platform" does exist?

No. Common carrier and "hosting platform" are not the same thing. If someone wanted to apply common carrier status to broadband infrastructure, it might make sense. Applying it to a knitting forum does not. They are two very different things. One facilitates discrete communication between two distinct parties while the other publishes and distributes content to a wide audience. Conflating the two is an exercise in mental gymnastics that only makes sense if you have a political agenda and don't care about being intellectually honest.


I honestly don’t know what you are spewing off about. At one point you quote me saying “compromise” then proceed to argue as if I said “compensation”. I’m not going to respond to a mischaracterization.

To your challenge:

> In the United States, companies that offer web hosting services are shielded from liability for most content that customers or malicious users place on the websites they host. Section 230 of the Communications Decency Act, 47 U.S.C. § 230 ("Section 230"), protects hosting providers from liability for content placed on these websites by their customers or other parties. The statute states that "[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." Most courts find that a web hosting provider qualifies as a "provider" of an "interactive computer service."

> Although this protection is usually applied to defamatory remarks, most federal circuits have interpreted Section 230 broadly, providing "federal immunity to any cause of action that would make service providers liable for information originating with a third-party user of the service."

https://www.nist.gov/system/files/documents/itl/StopBadware_...

There is clear legal handling in the US beyond common carrier provisions for hosting providers on the internet.

The nuance here is an argument over what constitutes a hosting provider and how far we extend legal immunity.

My “worldview” is that if you want to claim your business is a hosting provider so that you are granted legal protection from content liability, you have a responsibility (which I’d argue we should codify more formally) to remain a neutral hosting provider in spirit, because that is in line with the type of liberty (freedom of expression) we aim to protect in the US. We already tolerate the removal of spam and legally obscene/objectionable content, so your point there is moot. If you are claiming “legally I’m a neutral hosting provider”, then it’s two-faced to turn around and say “I’m a private entity, I can do whatever I want to curate the content on my platform because I’m responsible for the brand and image and experience I want to cultivate in my house”.

I’m okay with hosting providers not being liable for user content, and I’m okay with yarn forums deleting any post that doesn't reference yarn. It’s the mix of both that I feel is partly responsible for the poor state we’re in now where users get demonetized on YT for questioning the efficacy of new vaccine technology.

Hopefully it’s clear what the nuance is here. And if you don’t think there’s a whole conversation that has been happening here, read up on Cloudflare’s philosophy and what Prince has written about the topic. Because they were faced with the same dilemma with The Daily Stormer (though not quite as flagrant as Google/YT trying to play both sides for profit).


> There is no reality where I will accept that…

Welcome to the club

> if I create something. I spend and risk my money on web hosting. I write the code...

You can create a forum in 20 minutes; it’s all open source, and I did that when I was 14.

All the ‘risk’ and ‘writing code’ is about fighting other platforms for attention, not providing a consumer good.

> ultimatum… in order to extract value from them that you did not earn

I am the consumer; the market exists for me and I pay for the whole party. A business that harms customers is called a crime syndicate.

You might see this ultimatum in other areas too, like “you can’t sell baby food with lead in it, or you go to prison”


The issue is that the knitting forum is a different beast from youtube. The latter is a platform. Its scale makes it QUALITATIVELY different. And there's network effects, there's dumping behaviour, there's preinstalls on every phone, there's integration with the ad behemoth, all to make sure it remains a platform.


This is correct. In the US, TikTok is currently being sued for feeding kids choking-game content through an algorithm that was earlier judged to be free speech.


Curation and promotion, even if done by a machine (LOL, why does that matter at all?), need to come with significant liability.

It should be possible to protect content hosting services from extensive liability while not protecting companies from the consequences of what they choose to promote and present to people. Those are two separate and very different activities that aren't even necessarily connected: you could curate and promote without hosting, and in fact this happens all the time; you can host without curating and promoting, which also happens all the time. As far as third-party content goes, these typically are not mixed together outside of social media companies with their damned "algorithms".


> A lot of breakdown in society lately is clearly coming from the fact that some people/companies have it both ways when it suits them.

See how copyright is protected when it's What.CD violating it versus when it's OpenAI.



