Thanks for the reply, but you are exactly the audience my post is for. Because you say that, we will lose what little fragments of privacy and freedoms we have left.
Apple tried and made good progress. They had bugs which could be resolved, but your insistence that it couldn't be done caused too much of an uproar.
You can have a system that flags illicit content with some confidence level and have a human review that content. You can require that any model or heuristic used be publicly logged and audited. You can anonymously flag that content to reviewers, and when it is deemed actually illicit by a human, the hash or some other signature of the content can be published globally to reveal the devices and owners of those devices. You can presume innocence (such as a parent taking a pic of their kids bathing) and question suspects discreetly without an arrest. You can require cops to build multiple sufficient points of independently corroborated evidence before arresting people.
These are just some of the things that are possible, and I came up with them in the last minute of typing this post. Better, more thoroughly thought-out solutions can be developed if this is taken seriously and funded well.
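To sketch roughly what that pipeline would look like (Python; every name and threshold here is invented for illustration, and a real system would use perceptual hashing rather than exact SHA-256):

```python
import hashlib
from dataclasses import dataclass, field

# Illustrative confidence cutoff above which content goes to human review.
FLAG_THRESHOLD = 0.95

@dataclass
class ReviewQueue:
    # Signatures of content a human reviewer has confirmed as illicit.
    confirmed_signatures: set = field(default_factory=set)

    def review(self, content: bytes, reviewer_says_illicit: bool) -> None:
        # The reviewer sees only the flagged content, never the device or owner.
        if reviewer_says_illicit:
            self.confirmed_signatures.add(hashlib.sha256(content).hexdigest())

def flag_content(content: bytes, confidence: float, queue: ReviewQueue,
                 reviewer_says_illicit: bool) -> None:
    # Step 1: a model or heuristic scores the content on-device;
    # low-confidence content is never sent to review at all.
    if confidence >= FLAG_THRESHOLD:
        # Step 2: anonymous human review of the flagged content.
        queue.review(content, reviewer_says_illicit)

def device_has_match(content: bytes, queue: ReviewQueue) -> bool:
    # Step 3: only published signatures of *confirmed* material
    # are matched against content on devices.
    return hashlib.sha256(content).hexdigest() in queue.confirmed_signatures
```

The key property the sketch tries to capture: nothing below the threshold reaches a human, and only human-confirmed signatures are ever matched back against devices.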
However, your response of "Yes." is materially false, and lawmakers will catch on to that and discredit anything the privacy community has been advocating. Even simple heuristics that don't use ML models can have a higher "true positive" rate of identifying criminal activity than eyewitness testimony, which is used to convict people of serious crimes. And I suspect you meant security, not privacy. Because as I mentioned, for privacy, humans can review before a decision is made to search for the confirmed content across devices.
> Because you say that, we will lose what little fragments of privacy and freedoms we have left.
I understand that you seem to think that adding systems like this will placate governments around the world but that is not the case. We have already conceded far more than we ever should have to government surveillance for a false sense of security.
> You can have a system that flags illicit content with some confidence level and have a human review that content. You can require that any model or heuristic used be publicly logged and audited. You can anonymously flag that content to reviewers, and when it is deemed actually illicit by a human, the hash or some other signature of the content can be published globally to reveal the devices and owners of those devices. You can presume innocence (such as a parent taking a pic of their kids bathing) and question suspects discreetly without an arrest. You can require cops to build multiple sufficient points of independently corroborated evidence before arresting people.
What about this is privacy preserving?
> However, your response of "Yes." is materially false, and lawmakers will catch on to that and discredit anything the privacy community has been advocating. Even simple heuristics that don't use ML models can have a higher "true positive" rate of identifying criminal activity than eyewitness testimony, which is used to convict people of serious crimes. And I suspect you meant security, not privacy. Because as I mentioned, for privacy, humans can review before a decision is made to search for the confirmed content across devices.
It's not "materially false." Bringing a human into the picture doesn't do anything to preserve privacy. If, as in your example, a parent's family photos of their children flag the system, you have already violated the person's privacy without just cause, regardless of whether the people reviewing it can identify the person or not.
You cannot have a system that is scanning everyone's stuff indiscriminately and have it not be a violation of privacy. There is a reason why there is a process where law enforcement must get permission from the courts to search and/or surveil suspects - it is supposed to be a protection against abuse.
> I understand that you seem to think that adding systems like this will placate governments around the world but that is not the case. We have already conceded far more than we ever should have to government surveillance for a false sense of security.
You have an ideological approach instead of a practical one. It isn't governments that are demanding it; I am demanding it of our government, along with the majority. I don't want freedoms paid for by such intolerable and abhorrent levels of ongoing injustice. It isn't a false sense of security; for the victims it is very real. Most criminals are not sophisticated. Crime prevention is always about making it difficult to do crime, not waving a magic wand and making crime go away. I'm not saying let's give up freedoms, but if your stance is there is no other way, then freedoms have to go away. But my stance is that the technology is there; it's just slippery slope fallacy thinking that's preventing it from being implemented.
> What about this is privacy preserving?
Persons aren't identified before a human reviews and confirms that the material is illicit.
You have to identify yourself to the government to drive, and place a license plate connected to you on your car at all times. You have to ID yourself in most countries to get a mobile phone SIM card or open a bank account. Dragnet surveillance is what I agree is unacceptable except as a last resort; it isn't dragnet if algorithms flag it first, and it isn't privacy-invading if false hits are never associated with individuals.
> you have already violated the person's privacy without just cause, regardless of whether the people reviewing it can identify the person or not.
There is just cause: the material was flagged as illicit. In legal terms, it is called probable cause. If a cop hears what sounds like a gunshot in your home, he doesn't need a warrant; he can break in immediately and investigate because it counts as an exigent circumstance. The algorithms flagging content are the gunshots in this case. You could be naked in your house and it would be a violation of privacy, but acceptable by law. If you said that after review they should get a warrant from a judge, I'm all for it.
It is materially false, because the scanning can be done without sending a single byte off the device. The privacy intrusion happens not at the time of scanning, but at the time of verification. To continue my example, the cop could have heard you playing with firecrackers; you didn't do anything wrong, but your door is now broken and you were probably naked too, which means privacy violated. This is acceptable by society already.
The false positive rates for cops seeing/hearing things and for eyewitness testimony are very high, in case you're not aware. By comparison, Apple's CSAM scanner's rate was very low.
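For context on how that rate was driven down: Apple's published design reportedly only flagged an account after roughly 30 hash matches, precisely because thresholding compounds a small per-image false positive rate into a vanishing account-level one. A quick binomial tail calculation (Python; the 10,000 photos and 1-in-a-million per-image rate are numbers I picked for illustration, not Apple's actual parameters) shows the effect:

```python
def binom_tail(n: int, p: float, threshold: int) -> float:
    """P(X >= threshold) for X ~ Binomial(n, p), computed via the recurrence
    P(X=k) = P(X=k-1) * (n-k+1)/k * p/(1-p) to avoid huge binomial coefficients."""
    term = (1.0 - p) ** n  # P(X = 0)
    total = 0.0
    for k in range(1, n + 1):
        term *= (n - k + 1) * p / (k * (1.0 - p))
        if k >= threshold:
            total += term
            if term < total * 1e-18:  # remaining terms are negligible
                break
    return total

# With 10,000 photos and a 1e-6 per-image false match rate, a single match is
# roughly a 1% event per account, but 30 independent false matches is
# astronomically unlikely.
single_match = binom_tail(10_000, 1e-6, 1)
thirty_matches = binom_tail(10_000, 1e-6, 30)
```

The per-account chance of at least one false match here is about one percent, while the chance of thirty is smaller than any practically meaningful probability, which is the whole point of a match threshold.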
> There is a reason why there is a process where law enforcement must get permission from the courts to search and/or surveil suspects
As stated above, so long as the scanning is happening strictly on-device, you're not being surveilled. When there is a hit, humans can review the probable cause, and a judge can issue a warrant for your arrest or a search warrant to access your device.
Another solution might be to scan only at transmission time of the content, not at capture and storage time (still not good enough, but this is the sort of conversation we need, not a plugging of ears).
Let's take a step back. Another solution might be to restrict all content publishing on the internet to people who positively identify themselves.
> You have an ideological approach instead of a practical one.
It's both. We can save a whole lot of time and money not wasting resources on security theater and reallocate it towards efforts that actually make society better and safer.
> It isn't governments that are demanding it; I am demanding it of our government, along with the majority.
> I don't want freedoms paid for by such intolerable and abhorrent levels of ongoing injustice. It isn't a false sense of security; for the victims it is very real.
No, it still is a very false sense of security. Intercepting illicit material online doesn't actually stop the crime from being committed nor does it dissuade people from distributing it.
> Most criminals are not sophisticated. Crime prevention is always about making it difficult to do crime, not waving a magic wand and making crime go away.
Sure, but the 'criminals' that are distributing illicit material online are already going to lengths, sometimes very technical, to distribute it anonymously.
> I'm not saying let's give up freedoms, but if your stance is there is no other way, then freedoms have to go away.
You are saying let's give up freedoms. Let's drop any notion that you care about freedom, because you do not. I'm not saying that it's an invalid world view; your reasoning for wanting to eradicate those freedoms is rational and well-intentioned, but you are begging for authoritarianism nonetheless.
I don't think there's any sort of agreement to be had here. Fundamentally I cannot agree with the notion that everyone must concede their personal liberties and privacy in order to capture a few more stupid criminals.
> But my stance is that the technology is there; it's just slippery slope fallacy thinking that's preventing it from being implemented.
No, it's actually just a slippery slope. There is no fallaciousness in the logic here, because we've already witnessed the erosion of our rights for this purpose over and over again, and they continue to push for even more degradation of those rights.
> Persons aren't identified before a human reviews and confirms that the material is illicit.
This is already a violation of privacy. Share all of your personal photos with hacker news if you disagree. We don't know who you are, after all, so it's not a violation of your privacy, right?
> There is just cause: the material was flagged as illicit. In legal terms, it is called probable cause. If a cop hears what sounds like a gunshot in your home, he doesn't need a warrant; he can break in immediately and investigate because it counts as an exigent circumstance. The algorithms flagging content are the gunshots in this case. You could be naked in your house and it would be a violation of privacy, but acceptable by law. If you said that after review they should get a warrant from a judge, I'm all for it.
In legal terms, probable cause is what you need to make an arrest or before obtaining a search warrant. The "gunshot" exception isn't probable cause. It's an emergency exception that allows for a warrantless search because there is an independent, externally observable signal of imminent harm i.e. an emergency situation.
The algorithms are not the 'gunshot' here. The system is not searching in response to some external signal like a gunshot, hearing someone screaming, or even seeing someone getting attacked. It is the search itself: it only produces a flag marking someone as suspicious because it has already examined that person's private files. The "probable cause" was produced by conducting the search. That is backwards.
It is equivalent, in your analogy, to a cop opening every front door in the neighborhood to look inside and then saying they now have probable cause because they saw something suspicious. The search already happened.
> It is materially false, because the scanning can be done without sending a single byte off the device. The privacy intrusion happens not at the time of scanning, but at the time of verification.
You do not need to transmit information for it to be a violation of privacy. If a cop opens your filing cabinet, looks through your folders, and leaves everything exactly where he found it, he has still intruded by examining your private material.
The suspicion of criminal activity must precede the search. Simply possessing digital files isn't a basis for individual suspicion; you are treating everyone as a suspect who deserves no protection.
> To continue my example, the cop could have heard you playing with firecrackers; you didn't do anything wrong, but your door is now broken and you were probably naked too, which means privacy violated. This is acceptable by society already.
Society accepts warrantless entry only when there is an actual emergency. The reason a gunshot or firecrackers can justify it is that they are external signals: they do not require the police officer to enter the home in order to detect them.
Society does not accept random entries just to look for problems.
And just to get ahead of it, a machine performing the search doesn’t change anything. A search is defined by what’s being examined, not who (or what) is doing the examining. If the government sent a robot into your home that didn’t know your name and only alerted authorities if it found something illegal, it would still be a search. The fact that it’s automated doesn’t make it any less of an intrusion.
Let me post a longer reply later. But on your last point, we do have automated, machine-generated alarms in the form of smoke detectors. We're legally required to have them in our homes. Firefighters might do the breaking in instead of cops, but they are still agents of the government. Is it more accurate to treat internet access the same as public road access, a regulated privilege instead of a right? You can't tint your windows too much, for example, so cops can look in when you're driving on public roads.
With this logic, you could justify embedding cameras in every private space of someone’s home. The feed could be sent to a server running an automatic algorithm that flags potential crimes. If something suspicious appears, authorities would be alerted and an independent review would determine whether a crime occurred.
I have no doubt in my mind if we did that it would certainly be a huge win for law enforcement, detecting crimes and gathering evidence to help catch criminals.
Why stop there, though? Why not require everyone to live in glass apartments like in the novel We?
These aren't big leaps from what you're proposing. You are advocating for mass surveillance with the assumption that these systems won't be abused despite countless examples of surveillance being misused by those in power.
Comparing scanning all of someone's digital files to smoke detectors is absurd.
You have a good point, but is a phone equal to your private home, or is it similar to a car (where you are required to have transparent glass windows)? Is it a right or a privilege?
But to challenge your argument further, if the majority are fine with having cameras in their homes that don't transmit unless a crime is detected, isn't that just democracy?
What's getting lost in this discussion might be the fact that the majority of people don't care that much about privacy, especially when heinous crimes are involved. Furthermore, the equivalent would be house builders installing cameras in homes, not homeowners being required to install them. But a reasonable compromise might be scanning content being transmitted instead of stored?
> You have a good point, but is a phone equal to your private home, or is it similar to a car (where you are required to have transparent glass windows)? Is it a right or a privilege?
We regulate the operation of motor vehicles because they pose an immediate safety risk. As in, the use of one could reasonably result in injury or death. A phone is not something you could reasonably expect to be used to create immediate harm (injury, death) and you wouldn't regulate one as such. That's not to say that aspects of it can't be regulated, but the fact that it can be a tool used to generate harm does not make it itself particularly dangerous.
> But to challenge your argument further, if the majority are fine with having cameras in their homes that don't transmit unless a crime is detected, isn't that just democracy?
Yes, which is why we avoid direct democracy pretty much everywhere in the world. But rights aren't something that can be taken away by a vote. Only protections against a government violating your rights can. If you could vote away your rights then pretty much every authoritarian government would be wholly justified in their abusive actions.
> What's getting lost in this discussion might be the fact that the majority of people don't care that much about privacy, especially when heinous crimes are involved. Furthermore, the equivalent would be house builders installing cameras in homes, not homeowners being required to install them. But a reasonable compromise might be scanning content being transmitted instead of stored?
Most people don't care about a lot of things. That's another reason why we don't have most people writing legislation. There are tons of things I have extremely limited knowledge about that someone else feels very strongly about and vice versa. The majority of people feeling apathetic towards something isn't an indicator that the majority is correct.
> We regulate the operation of motor vehicles because they pose an immediate safety risk.
That's not the legal reasoning, as I recall. It is because they use public roads. They are just as unsafe when you drive them on a racing circuit or on your ranch, but traffic laws only apply on public roads. The same goes for your post mail being scanned and searched, or your baggage at airports: it isn't just for safety, and no warrant is needed; they look for contraband, customs violations, etc., too. It is because you are engaging in a privileged activity.
> Yes, which is why we avoid direct democracy pretty much everywhere in the world.
News to me; I thought it was because of practicality. I think you mean pluralistic?
> If you could vote away your rights then pretty much every authoritarian government would be wholly justified in their abusive actions.
Maybe a clear definition of digital rights is what is missing? But explain to me why your right to privacy is more important than the rights of victims. If victimization were rare, that would be one argument, but it is frequent, and something can be done to reduce it. From what I understand, the scanning methods Apple proposed are differential: your privacy won't be violated unless there is a match.
Going back to my earlier point, you have rights. But those rights can only be protected by the government so long as the security of its people remains intact. Every right we have is taken away when it comes to a "national security risk," for example. Is a potential terrorist attack any worse in terms of security than the very real impact of CSAM on the most innocent members of society? If there were an impending terrorist attack and the only way to stop it were scanning everyone's phones, guess what? It is already the law that the government can do that.
> Most people don't care about a lot of things. That's another reason why we don't have most people writing legislation. There are tons of things I have extremely limited knowledge about that someone else feels very strongly about and vice versa. The majority of people feeling apathetic towards something isn't an indicator that the majority is correct.
They don't write legislation, but they determine what legislation gets written. They vote based on promises of legislation; they may not care about details, but they care about outcomes. In this case, the "not caring" is about outcomes, not the technicalities of legislation. As a matter of policy, the voters don't care. And lawmakers have a duty to reflect the sentiment of their constituents.
Even if it comes down to taking away the rights of minority voters, it may not be as simple as ordinary legislation, but constitutional amendments exist, and it all comes down to how many people want the change. We could literally have something as insane as slavery back within a year given enough popular sentiment.
The Patriot Act has been getting renewed since its inception, now almost a quarter of a century ago, across multiple administrations and with bipartisan support. That is the will of the people in effect.
Except that it is not materially false. Only in a perfect society will your "system that flags illicit content" not become a system that flags whatever some authoritarian regime considers threatening, and subverting public logging/auditing is similarly trivial for a motivated authoritarian. All your hypothetical solutions rely on humans, who are notoriously susceptible to being influenced by either money or being beaten with pipes, and on corporations, which are notoriously susceptible to being influenced by things that influence their stock price.
Pleyel's corollary to Murphy's law is that all compromises to individuals' rights made for the sake of security will eventually be used to further deprive them of those rights.
(I especially liked the line “You can require cops to build multiple sufficient points of independently corroborated evidence before arresting people.”)
This is already the case with other means of communication. The internet isn't that special. If you don't trust your government, do something else about it.
We rely on eyewitness testimony and human juries all the time. The Innocence Project has a long list of people who spent decades in prison because of this.
The solution to authoritarian regimes is to not have one, not to tolerate CP on the internet.
> The solution to authoritarian regimes is to not have one
The solution to not being poor is being rich. You could apply that logic to a lot of things: have this thing instead of that thing. Using your example above of "differential privacy scanning":
Differential privacy is a property of a dataset, meaning you can't tell whether an individual was part of it. If something is traceable back to the individual device, it's not differentially private.
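To make the distinction concrete, here's the textbook randomized-response mechanism in Python (a generic illustration, not any deployed system): every individual report is deniable, yet the population rate is still recoverable from the aggregate. A hash hit traceable to a specific device has no such deniability.

```python
import random

def randomized_response(truth: bool, rng: random.Random) -> bool:
    # Flip a coin: heads, answer honestly; tails, answer uniformly at random.
    # Any single report is plausibly deniable, which is the privacy guarantee.
    if rng.random() < 0.5:
        return truth
    return rng.random() < 0.5

def estimate_true_rate(reports: list) -> float:
    # E[reported True] = 0.5 * true_rate + 0.25, so invert that to recover
    # the population rate without learning anything about any individual.
    observed = sum(reports) / len(reports)
    return min(1.0, max(0.0, 2.0 * observed - 0.5))
```

With enough simulated respondents (say, 100,000 with a 30% true rate), the aggregate estimate lands near 0.30 even though no single report reveals anyone's answer; that is the opposite of a per-device match.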
I think at this point you're just trying to say "don't have this thing have that thing instead" as a response to anything.
Certainly you can. The solution to being poor is not being poor. How? That is a different story, but ultimately the solution to being poor must be not being poor; otherwise it isn't a solution, right? And of course it is a reductive take, but it is nevertheless correct. Solutions that don't result in poor people no longer being poor are not solutions. Solutions that don't involve not having an authoritarian regime are not solutions to that problem either.
Your solution to authoritarian regimes is not fighting CSAM: you made the CSAM problem worse, and it does not prevent authoritarian regimes. An authoritarian regime does not need your permission to scan your phone. And most human governments in history qualify as authoritarian, and they didn't need phones, let alone scanning of phones.
> I think at this point you're just trying to say "don't have this thing have that thing instead" as a response to anything.
I'm saying: "If you don't like apples, don't eat apples. Don't talk about how we need to kill all the bees and worms that help apple trees reproduce".
> Differential privacy is a property of a dataset, meaning you can't tell whether an individual was part of it.
Yeah, that's correct. And that's a violation of an individual's privacy... how?
What would it take for you to consider scanning of phones a valid solution? Mass murder, global nuclear war, pandemic containment? Is it a question of not understanding the harm being done? My frustration is that, OK, let's not scan phones; what's your solution? You have none. Your solution is to do nothing and accept that things should be the way they are. If I said let's verify everyone's ID before they can access the internet, is that acceptable? Let's ban Tor and VPNs instead; is that acceptable? What is your solution? Can you at least agree that we should aggressively be working on a solution? We have people training LLMs to generate CSAM, and you hear not a peep out of all these companies and devs working on the tech. They just slap their knees and declare, "Welp, that's unfortunate."
I don't care what governments do. If it takes an authoritarian regime to stop this insanity, I'm all for it. I'll be royally screwed; it will be a nightmare. But if that is the cost, so be it. This is how authoritarians gain power, by the way: you have the apathetic educated and ruling classes, and the masses crying for change, and the authoritarians will actually solve the problem but destroy everything else along the way. I'm telling you, if I, someone who is relatively aware and informed of the risks of privacy loss and of the tech underlying the systems we use, am saying this, imagine what the majority of people would say.
It took one 9/11 attack to get us the Patriot Act; if someone used Tor on their rooted Android phone to do something worse, phone scanning would be the least of your concerns. And the public would support it. You need a solution because the public demands it, at the cost of privacy if required. But it is for technologists to devise a mechanism that solves the problem without costing us privacy.
Yes. You cannot have a system that positively associates illicit content with an owner while preserving privacy.