
Why can’t algorithms do the police work for some crimes?


Seriously? For one, this is basically a warrantless search, which is illegal (obviously they get around this because Apple is a private company). Also, trusting algorithms for critical things like this is beyond absurd.


I still don’t understand the reaction. People get tied up in knots over this, but it is an effective deterrent to child pornography, full stop. Even if it rubs you the wrong way to have software fingerprinting your files, I really don’t care if it means putting a deterrent in place against child trafficking.

Let’s walk through this.

1. Criminal kidnaps child and abuses him.

2. Criminal produces video of said abuse and sells it on the web.

3. Criminal continues to sell it and it spreads.

4. The video is detected by authorities, who promptly add it to a database.

5. The video is cryptographically hashed, and now anybody who stores this content in iCloud can be identified (sketched below).

6. A customer of the criminal is caught.

7. Forensics leads authorities to the criminal who produced the video.

8. One less criminal to profit from kidnapping and abusing children.
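A minimal sketch of step 5 as described above, assuming exact cryptographic hashing; the database entry and function name are placeholders, not Apple's actual system:

    import hashlib

    # Placeholder set of SHA-256 digests of known abuse material,
    # as maintained by an authority such as NCMEC. Not a real entry.
    KNOWN_DIGESTS = {"0" * 64}

    def matches_known_material(file_bytes: bytes) -> bool:
        """Hash the uploaded file and look it up in the database."""
        return hashlib.sha256(file_bytes).hexdigest() in KNOWN_DIGESTS

Note that an exact hash like this only catches bit-identical copies; as a reply below points out, the real system uses a perceptual hash precisely so that re-encoded or cropped copies still match.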

Everyone tries to frame this approach as a slippery slope to facial recognition. It doesn’t have to be that way if the right people are in the loop to blow the whistle.


> I still don’t understand the reaction. People get tied up in knots over this, but it is an effective deterrent to child pornography, full stop. Even if it rubs you the wrong way to have software fingerprinting your files, I really don’t care if it means putting a deterrent in place against child trafficking.

Most reductions of privacy in favor of the police would act as an effective deterrent to that crime and other crimes.

Deterring crime is not enough to justify a reduction in privacy.


Hypotheticals: What if your platform is causing a growth in the crime rate? What if your platform is enabling new forms of crime and potentially on an unprecedented scale? Is it justified then?


For the first one it depends on how much growth and in what crimes specifically.

For the second one, maybe, but I seriously doubt iCloud encryption is going to do that.

Either way, talking about baselines and percentages is a real improvement over just "this would decrease crime". Add in the downsides too and you have yourself a good platform for discussion!


How about a counterexample? I am a consenting adult in my thirties. I create a photo or video and send it to my partner. The algorithm flags it as CSAM when it only shows a fat bald guy. Before I know it I'm under investigation and my life is ruined because the algorithm got it wrong. Even being accused of this sort of thing is enough to destroy someone and drive them to suicide.


The "algorithm" isn't some sort of neural network trying to intelligently identify things that "look like" CP. It's a perceptual hash matching against a database of known CP. It has to find multiple matches before it flags the account for review to reduce false positives. Only after review confirming a match to known exploitative images is the info referred to NCMEC for action.

Full whitepaper: https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...
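Roughly, the flow looks like the sketch below; the toy average-hash, the distance bound, and the threshold value are illustrative stand-ins for the NeuralHash and threshold scheme the whitepaper actually describes:

    # Toy perceptual hash over an 8x8 grayscale image given as a flat
    # list of 64 pixel values. Real systems (PhotoDNA, PDQ, NeuralHash)
    # are far more robust; this only illustrates near-match hashing.
    def ahash(pixels):
        avg = sum(pixels) / len(pixels)
        return sum(1 << i for i, p in enumerate(pixels) if p >= avg)

    def hamming(a, b):
        return bin(a ^ b).count("1")

    KNOWN_HASHES = {ahash([10] * 32 + [200] * 32)}  # stand-in database
    MAX_DISTANCE = 4   # how perturbed a copy may be and still match
    THRESHOLD = 30     # matches required before any human review

    def scan(match_count, pixels):
        """Count a match if the upload is near any known hash."""
        if any(hamming(ahash(pixels), k) <= MAX_DISTANCE for k in KNOWN_HASHES):
            match_count += 1
        # Only at THRESHOLD does a human review the matches, and only
        # confirmed matches are referred to NCMEC.
        return match_count, match_count >= THRESHOLD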


Who validates said database of "known CP"? How do we know the pictures are actually not "find dissident"? Who watches the watchers?

This entire endeavor hands the keys to an unaccountable police state.


"It is an effective deterrent" to using this one specific platform to distribute CSAM. The problem with this solution is the exact same problem with the tired old "solution" to E2E encryption that gets trotted out every couple of months. If you add monitoring to the tool that criminals are using -- especially, especially if the company loudly and publicly announces that they are adding monitoring! -- you will, at best, catch a few of the very dumbest possible criminals, while the rest move on to one of countless available non-monitored tools.


> anybody who stores this content in iCloud can be identified

Any reason to believe anybody does this before the child reaches retirement age?

You are describing the reverse case: a targeted search for an individual, in which each step has a probability much less than 100%. The technology being discussed is a broad sweep: everything, everywhere, every time.
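For illustration only (the 50% per step is an assumption, not data): if each of the eight steps in the parent's chain independently succeeds half the time, the chain as a whole almost never reaches the producer, while the sweep applies to everyone, always.

    # Purely illustrative arithmetic; 50% per step is an assumption.
    p_chain = 0.5 ** 8
    print(f"{p_chain:.2%}")  # 0.39% of cases reach the producer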

Oh, well, no need to worry so much; Apple has just added a data collection ability which others already had in one way or another.


As someone said:

if you really want to put a dent in abuse, mandate cameras in every home, because that is where most abuse happens. Anybody who opposes that clearly has something to hide...

Police can always pinky promise to never use it for anything except catching the baddies.

Yes. For anyone who wonders. This is sarcasm.


Algorithms can absolutely do the police work for crimes, provided they only perform searches that are supported with probable cause.


I don’t think fingerprinting files and comparing them against a database of cryptographic hashes of CSAM constitutes an “illegal search”.


If the government mandated them, they would absolutely be illegal searches. The government is not entitled to check whether or not I have a document containing certain content (illegal or not) without probable cause and a warrant to suggest that I do have that document (and that it is illegal).

I'm carefully not alleging that this is illegal, because it's a third party doing it and it's not obvious that they are acting as the government's agent. Regardless of whether or not that solves the legal problem of unlawful searches, it does not solve the moral problem: we have a right to be free from unreasonable searches.

Replace "child porn" with "political posters" here. If you would have a problem with that search, I claim that you should have a problem with this search, because the there is no evidence that the person you are searching is committing a crime, and as a result the claim for this to be a morally valid search needs to not be about guilt (which would require probable cause first), but about it not being a prohibited search in the first place.


I'll be cool with it when you can get the algorithm to swear out a warrant, testify in court, and navigate probable cause.


Because the price is too high. The same argument is being made to outlaw encryption: it will make police work much easier. Yeah, it sure will, but it will bring a lot of problems as well, and they far outweigh the benefits.


Who's responsible when the algorithm starts targeting minorities and putting innocent people in prison?


Algorithms don't put people in jail, people do. The person who irresponsibly followed the output of the algorithm is responsible, as is the person who vouched for the algorithm being correct and put it into production.

Don't think of an algorithm as an independent actor, think of it as an awesome tool that makes you 100x more productive.


> Algorithms don't put people in jail, people do.

... and guns and cars don't kill people, murderers do.

But now a company is creating a new kind of "gun" that huge chunks of the community of security professionals are warning about, because it is insanely powerful and dangerous.

We have a number of reasons to warn about it:

- China will undoubtedly demand that Apple scan for their hashes too. The databases these hashes come from are not public, for obvious reasons, so whatever they want can be slipped into them. Maybe Apple will check the images before adding them, but that is not guaranteed as far as I read it; besides, it only takes one rogue employee to correctly classify an image as not abuse material, yet make a note of the account and send it back after work.

- This is not your average SHA-256 hash. These are perceptual hashes, made to catch not only the exact document but all kinds of variations of it. I am not a specialist in perceptual hashes, but it seems to go without saying that the more resistant the hash is to modifications, the easier it becomes to create innocent images that trigger it (see the toy demonstration after this list).

- Even if we had a magic algorithm that resonated with everything good and matched only the images we wanted, there would still be potential for abuse. When I was younger I browsed through the cache folder of my machine, and I remember a lot of images there that I can't recall having seen on any site I visited. Now, it is said that this algorithm will only flag images about to be uploaded, so obviously cache folders won't be scanned. But once the tool is in place, why wouldn't governments start pressuring Apple to scan everything? And what prevents someone's soon-to-be ex from downloading some and slipping them into iCloud when the phone is left unlocked?

- Due process should sort out many of the problematic cases here, but child abuse is, for good reasons, one of the worst things you can be accused of. A mere accusation is often enough to ruin someone's life, even if it later becomes clear to law enforcement that the person is innocent. And most places have a long way to go with respect to due process.
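To make the second point above concrete, here is a toy demonstration with a simplistic average-hash (real perceptual hashes such as NeuralHash are far more sophisticated, but face the same structural trade-off):

    # aHash records only which pixels sit above the image's own average,
    # so visually unrelated images can share a hash exactly. Robust
    # perceptual hashes exhibit the same weakness in subtler forms.
    def ahash(pixels):
        avg = sum(pixels) / len(pixels)
        return sum(1 << i for i, p in enumerate(pixels) if p >= avg)

    img_a = [0, 0, 0, 0, 255, 255, 255, 255]      # hard black/white split
    img_b = [90, 90, 90, 90, 110, 110, 110, 110]  # near-uniform gray
    assert ahash(img_a) == ahash(img_b)           # identical hashes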


Okay, but what about the intense war going on right now over YouTube content creators getting banned or having their content struck? Or all the wrongful bans on platforms like Facebook or Twitter?


The blame for that falls squarely on:

a) The executives at Alphabet, who are ultimately in charge of the company's decisions

b) The individual product managers, lawyers, engineers, and whoever else is involved in creating, maintaining, and running the system that bans people inappropriately.

The algorithm is a tool they use to do this; using a tool does not absolve them of the responsibility in any way.


I'm still mostly thinking about an awesome tool that will target 100x more minority people and send 100x more innocent people to jail.


They could, and their first step would be to get a warrant.


There is a scale where that doesn’t work anymore.

If one child abuser can sell his contents to thousands via nothing but WhatsApp, word of mouth, and a Bitcoin wallet, what will you do to fight that?

I’m all for freedom of speech and freedom from fishing expeditions and what not, but if you don’t deter these things they will grow.


If (s)he sells to thousands via WhatsApp and word-of-mouth, good old police work will have him/her bagged shortly.

WhatsApp's only claim to privacy is end-to-end encryption of messages. The moment a police officer opens the phone of a buyer, they can identify the seller. Same if an undercover cop hears the word of mouth and buys.

This is just ordinary good police work that I whole-heartedly support.



