>About this time, someone usually mocks "it's always about the kids, think about the kids." To those critics: They have not seen the scope of this problem or the long term impact. There is nearly a 1-to-1 relationship between people who deal in CP and people who abuse children. And they rarely victimize just one child. Nearly 1 in 10 children in the US will be sexually abused before the age of 18.
I think we have seen "think of the kids" used as an excuse for so many things over the years that the pendulum has now swung so far that some of the tech community has begun to think we should do absolutely nothing about this problem. I have even seen people on HN in the last week who are so upset by the privacy implications of this that they start arguing that these images of abuse should be legal, since cracking down on them is used as a pretext to invade people's privacy.
I don't know what the way forward is here, but we really shouldn't lose sight of the fact that there are real kids being hurt in all this. That is incredibly motivating for a lot of people. Too often the tech community's response is that the intangible concept of privacy is more important than the tangible issue of child abuse. That isn't going to be a winning argument among mainstream audiences. We need something better, or it is only a matter of time until these types of systems are implemented everywhere.
> Too often the tech community's response is that the intangible concept of privacy is more important than the tangible issue of child abuse.
I think the actual idea, the nuance of which is often lost (and, let's be honest, some people aren't really aware of it and are just jumping on the bandwagon), is that privacy is a right, and that the erosion of rights is extremely important because it has been shown many times in the past to have far-reaching and poorly understood future effects.
It's unfortunate that some would choose to link their opposition to the idea with making the thing it's attempting to combat legal, when that thing is so abhorrent, but in general I think pushing back on any erosion of our rights is a good and useful way for people to dedicate time and effort.
While we shouldn't lose sight that there are real children in danger and that this should help, we also shouldn't lose sight that there is plenty of precedent that this could eventually lead to danger and problems for a great many other people, children included.
>I think the actual idea, the nuance of which is often lost (and, let's be honest, some people aren't really aware of it and are just jumping on the bandwagon), is that privacy is a right, and that the erosion of rights is extremely important because it has been shown many times in the past to have far-reaching and poorly understood future effects.
The encryption everywhere mindset of a lot of the tech community is changing the nature of the right to privacy. 50 years ago all these images would have been physical objects. They could have been found by the person developing the photos or the person making copies. They could have been found with a warrant. Now they are all hidden behind E2EE and a secure enclave. To a certain extent the proliferation of technology means people can have an even stronger degree of privacy today than was practical in the past. It is only natural for people to wonder if that shift has changed where the line should be.
In the past these people would have been developing the pictures themselves, and passing them to each other either in person or in some hidden manner through public systems.
Not only has communication become easier, so has surveillance. The only difference now is that it's easier for people not to be aware when their very personal privacy is invaded, and that it can be done to the whole populace at once.
>Not only has communication become easier, so has surveillance.
It is a devil's advocate response, but can you explain why this is a bad thing? If communication is scaling and becoming easier, why shouldn't surveillance?
1. We judge these programs under the wrong assumption that they are run by the good guys. Any system that depends on the goodness of the people running it is dangerous, because it can be taken over by the bad guys.
2. I am a law-abiding, innocent citizen. Why should I have to face the same compromised privacy as a criminal? It used to be that only people under suspicion were surveilled, and a judge had to grant this based on preliminary evidence. Now everyone is surveilled. How long until people who are sidestepping surveillance (e.g. using open source systems that don't implement this stuff) fall under suspicion just because they are not surveilled? How long until it's "guilty until proven otherwise"?
In my opinion it is absolutely fair and necessary to scan images that are uploaded to the cloud in a way that makes them shareable. But never the stuff I have on my device. And the scanning system has to be transparent.
>We judge these programs under the wrong assumption that they are run by the good guys
Speaking as someone who, as a boy, was tortured and trafficked by operatives of the Central Intelligence Agency of the United States of America, I am surprised at how little appreciation there is for the standard role of misdirection in the espionage playbook.
It is inevitable that all these ostensibly well-intended investigators will have their ranks and their leadership infiltrated by the intelligence agencies of major powers, all of whom invest in child trafficking. What better cover?
> We judge these programs under the wrong assumption that they are run by the good guys. Any system that depends on the goodness of the people running it is dangerous, because it can be taken over by the bad guys.
And sometimes, the "good guys" in their attempts to be as good as they can imagine being, turn into the sort of person who'll look a supreme court judge or Congresspeople or oversight committees in the eye and claim "It's not 'surveillance' until a human looks at it." after having built PRISM "to gather and store enormous quantities of users’ communications held by internet companies such as Google, Apple, Microsoft, and Facebook."
Sure, we copied and stored your email archive and contact lists and instant messenger history. But we weren't "surveilling you" unless we _looked_ at it!
Who are these "good guys"? The intelligence community? The executive branch? The judicial branch? Because they're _all_ complicit in that piece of deceit about your privacy and rights.
Surveillance does have its place, but a large part of the problem with the new technology of surveillance is that the people passing the laws on surveillance don't understand the technology that surrounds it.
Take, for instance, the collection of metadata that is now so freely swept up by the American government without a warrant. This includes the people involved in communication, the method of communication, the time and duration of communication, and the locations of those involved with said communication. All of that is metadata that can be collected without a warrant by national agencies for "national security", done on a massive scale under PRISM.
Now, this is non-targeted, national surveillance, fishing for a "bad guy" with enhanced surveillance capabilities. This doesn't necessarily seem like a good thing. It seems like a lazy thing, and a thing which was ruled constitutional by people who chose not to understand the technology because it was easier than thinking downstream about the implications.
> is that the people passing the laws on surveillance don't understand the technology that surrounds it.
And that the people running the surveillance have a rich track record of lying to the people who are considering whether to pass the laws they're proposing.
"Oh no, we would _NEVER_ surveil American citizens using these capabilities!"
"Oh yeah, except for all the mistakes we make."
"No - that's not 'surveillance', it's only metadata, not data. All we did was bulk collect call records of every American, we didn't 'surveil" them."
"Yeah, PRISM collects well over 80% of all email sent by Americans, but we only _read_ it if it matches a search we do across it. It's not 'surveillance'."
But they've stopped doing all that, right? And they totally haven't just shared that same work out amongst their Five Eyes counterparts, so that what each of them is doing is legal in their jurisdiction even though there are strong laws preventing each of them from doing it domestically.
And how would we even know? Without Snowden we wouldn't know most of what we know about what they've been doing. And look at the thanks he got for that...
> It is a devil's advocate response, but can you explain why this is a bad thing?
I didn't state it as a bad thing, I stated it as a counter to your argument that encrypted communication is more common, and that therefore maybe we should assess whether additional capabilities are warranted. Those capabilities already exist, and were expanded before the increase in encrypted communication (they happened during the analog phone line era). I would hazard that increasingly encrypted communication is really just people responding to that change in the status quo (or overreach, depending on how you view it) brought about by the major powers (whether they be governmental or corporate).
Why shouldn't every Neuralink come with a mandated FBI module preinstalled? Where, if anywhere, is private life to remain, where citizens think and communicate freely? Is Xinjiang just the start for the whole world?
Surely there is communication surrounding child sexual abuse: making plans with other pedophiles and discussing strategies. Some of this may occur over text message. Maybe Apple’s next iteration of this technology can use my $1000 phone that I bought and that I own to surveil my private E2EE messages and generate a safety voucher for anything I say? The sky’s the limit!
Do you see how this underlines my original point? I brought up real child abuse that is happening today and your response is about brain implants being mandated by the government. Most people are going to prioritize the tangible problem over worrying about your sci-fi dystopia.
A plausible story of how our rights are in the way is always ready to hand. If we can't at some point draw a clear line and say no, that's it, stop -- then we have no rights. It's chisel, chisel, chisel, year after year, decade after decade.
In America one of those lines is that your personal papers are private. Get a warrant. I don't have to justify this stand. I might choose to explain why it's a good stand, or I might not; it's on you to persuade us.
>In America one of those lines is that your personal papers are private. Get a warrant. I don't have to justify this stand. I might choose to explain why it's a good stand, or I might not; it's on you to persuade us.
Part of the problem is that these devices are encrypted so a warrant doesn't work on them. That is a big enough change that maybe people need to debate if the line is still in the right place.
That is a change worth considering, though it must be treated at the level of rights, not just a case-by-case utility calculus. At the same time, most other changes have been towards more surveillance and control: cameras everywhere, even in the sky; ubiquitous location tracking; rapidly improving AI to scale up these capabilities beyond what teams of humans could monitor; tracking of most payments; mass warrantless surveillance by spy agencies; God knows what else now, many years after the Snowden leaks. This talk you hear about the population "going dark" is... selective.
I think my vehemence last night might've obscured the point I wanted to make: what a right is supposed to be is a principle that overrides case-by-case utility analysis. I would agree that everything is open to questioning, including the right to privacy -- but as I see it, if you ask what the object-level balance of utilities is with respect to this particular proposal, explicitly dropping the larger context of privacy as a right (which was not arrived at for no reason) and denigrating that concern as science fiction, as a slippery-slope fallacy -- then that's a debate that should be rejected on its premise.
That's too short a thought, it detracts from the point.
It is bad to post passwords, not just because you lose privacy, but because you'd lose control of important accounts.
Asking people to post their passwords is not reasonable.
I think you might have a point you're trying to make, but please spell it out fully.
> Too often the tech community's response is that the intangible concept of privacy is more important than the tangible issue of child abuse.
Is it intangible? 18% of the world lives in China alone. That's more people than the "1/10 who are victims of child abuse*", and I'm sure that 18% will only grow as other authoritarian countries get more technologically advanced.
I think "Think of the kids" applies very well to the CREATORS of pornography. Per wikipedia, there isn't any conclusive causal relationship between viewing CP and assaulting children.
* Per a Google search: "A Bureau of Justice Statistics report shows 1.6 % (sixteen out of one thousand) of children between the ages of 12-17 were victims of rape/sexual assault", which is a lot less than the 10% figure you're citing. Non-sexual abuse wouldn't really have any bearing here, right?
>Is it intangible? 18% of the world lives in China alone. That's more people than the "1/10 who are victims of child abuse*", and I'm sure that 18% will only grow as other authoritarian countries get more technologically advanced.
You didn't mention any tangible results here. How would this system by Apple make my life worse? Can you answer that without a slippery slope argument?
>I think "Think of the kids" applies very well to the CREATORS of pornography. Per wikipedia, there isn't any conclusive causal relationship between viewing CP and assaulting children.
Why does the causality matter? A correlation is enough that cracking down on this content will result in fewer abusers on the streets.
>* Per a Google search: "A Bureau of Justice Statistics report shows 1.6 % (sixteen out of one thousand) of children between the ages of 12-17 were victims of rape/sexual assault", which is a lot less than the 10% figure you're citing. Non-sexual abuse wouldn't really have any bearing here, right?
I wasn't the one citing that, but you are also citing an incomplete number since it excludes younger children.
> Can you answer that without a slippery slope argument?
So far, any defense of this whole fiasco can be boiled down to what you are trying to imply in part: you say, in effect, "the possibility of abusing this system is a slippery slope argument", as if identifying a (possible) slippery-slope element in an argument would somehow automatically make it invalid.
Put the other way around: if all that can be said in defense is that the dangers are part of slippery-slope thinking, then you are effectively saying that the only defense is "trust them, let's wait and see, they might not do something bad with it" or "it sure doesn't affect me" (which sounds pretty similar to "I've got nothing to hide"). This might work for other areas, not so much when it comes to backdooring your device/backups for arbitrary database checks.
And since "oh the children" or "but the terrorists" has become the vanilla excuse for many many things I'm unsure why we are supposed to believe in a truly noble intent down the road here. "No no this time it's REALLY about the kids, the people at work here mean it" just doesn't cut it anymore.
So no, I'm not convinced the people at Apple working on this actually do it because they care.
When "but the children" becomes a favourite excuse to push whatever, the problem are very much the people abusing this very excuse to this level, not the ones becoming wary of it.
> some of the tech community has begun to think we should do absolutely nothing about this problem
I don't believe that people think that; I believe that people rather think that the ones in power simply aren't actually primarily interested in this problem. The trust is (rightfully) heavily damaged.
>> You didn't mention any tangible results here. How would this system by Apple make my life worse? Can you answer that without a slippery slope argument?
That's a weird goal-post.
>> Why does the causality matter? A correlation is enough that cracking down on this content will result in fewer abusers on the streets.
Obviously, of the people who look at CP, a higher percentage will be actual child abusers. The question for everybody is: does giving those people a fantasy outlet increase or actually reduce the number of kids who get assaulted? At the end of the day that's what matters.
>> I wasn't the one citing that, but you are also citing an incomplete number since it excludes younger children.
[EDIT: Mistake] Well you didn't cite anything at all, and were off by a shockingly large number. Please cite something or explain why you made up a number.
> How would this system by Apple make my life worse? Can you answer that without a slippery slope argument?
“With the first link, the chain is forged. The first speech censured, the first thought forbidden, the first freedom denied, chains us all irrevocably. The first time any man's freedom is trodden on we're all damaged...."
This is melodramatic high school Hamlet-ism. It’s also silly - there has obviously been a case where the first speech was censured. It happened before civilizations. Are we all still damaged and in bondage because of that?
Look, speech is important. So is protecting the public good. But if one believes in absolutes, rather than trade-offs, they are IMO getting too high on their own supply.
Let’s talk about the trade-offs that we have already made.
Does the fact that the NSA can comb your personal files and look at people's nude photographs not concern you? That's a present day reality brought to light by Snowden. Showing a colleague a 'good find' was considered a perk of the job.
We're lying to ourselves if we think this couldn't be abused and can implicitly be trusted. We should generally be sceptical of closed source at the best of times, let alone when it's inherently designed to report on us.
To your point of 'as a layman end user, what is the cost to me?': more code running on your computer, doing potentially anything, which you have no way to audit -> compromising the security of whatever is on your computer, plus an uptick in CPU/disk/network utilisation (although it remains to be seen if it's anything other than negligible).
My defeated mentality is partly - 'well they're already spying on us anyway'...
> This is melodramatic high school Hamlet-ism. It’s also silly - there has obviously been a case where the first speech was censured. It happened before civilizations. Are we all still damaged and in bondage because of that?
The implication -- and I think it's a valid one -- is that this client-side mechanism will be very quickly co-opted to also alert on non-CSAM material. For example, Winnie the Pooh memes in China.
I think it’s not valid to claim that it will be quickly used for that purpose.
However I absolutely agree that it could be used to detect non-CSAM images if Apple colludes with that use case.
My point is that this is immaterial to what is going on in China. China is already an authoritarian surveillance state. Even without this, the state has access to the iCloud photo servers in China, so who knows what they are doing, with or without Apple’s cooperation.
> Now people are seriously arguing that continuous searching through your entire life without any warrant or justification by opaque algorithms is fine.
Does anyone seriously doubt that Germany will use this mechanism to ban Nazi imagery?
Then from there, it’s not a big leap to talk about controlling far right (or far left) memes in France or the UK.
More insidiously, suppose some politician in a western liberal democracy was caught with underage kids, and there were blackmail photos that leaked. Do you think those hashes wouldn’t instantly make their way onto this ban list?
I’ll let you change the subject, but let’s note that every time someone raises privacy in China as a concern, it’s just bullshit.
> Does anyone seriously doubt that Germany will use this mechanism to ban Nazi imagery?
Yes.
> Then from there, it’s not a big leap to talk about controlling far right (or far left) memes in France or the UK.
This one is harder for me to argue against. Those countries could order such a mechanism, whether Apple had built this or not. Because those countries have hate speech laws and no constitutional mechanism protecting freedom of speech.
This is a real problem, but banning certain kinds of speech is popular in these societies. It is disturbingly popular in the US too. That is not Apple’s doing.
During the Cold War, the West in general and the US in particular were proud of spreading freedom and democracy. Rock & roll and Levi’s played a big role in bringing down the USSR.
Then in the 90s, the internet had this same ethos. People fought against filters on computers in libraries and schools.
Now that rich westerners have their porn and their video games, apparently many are happy to let the rest of the world rot.
Agreed in principle. But in this particular case, I think it's difficult to exaggerate the badness of this scare. This strikes me as one of the "Those who forget their history are doomed to repeat it" kind of things.
Like with the TSA and the no-fly list. Civil liberties groups said it was going to be abused, and they said so well before any actual abuse had occurred. But they weren't overreacting, and they weren't exaggerating. They were right. Senator Ted Kennedy even wound up on that list at one point.
This really is a narrowly targeted solution that only works with image collections, requires two factors to verify, and requires two organizations to cooperate: Apple, which has unequivocally staked its reputation on the system not being used for other purposes, and NCMEC, a non-profit staffed with people dedicated to preventing child abuse.
People who are equating this with a general purpose hashing or file scanning mechanism are just wrong at best.
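To make the shape of that design concrete, here is a minimal sketch of threshold-based matching against a fixed database. Everything in it is an illustrative assumption (a cryptographic hash standing in for a perceptual hash, made-up names and a made-up threshold); it is not Apple's actual NeuralHash or private set intersection protocol.

```python
# Minimal sketch of threshold-based matching against a known-image database.
# All names, the hash choice, and the threshold are illustrative assumptions,
# not Apple's actual implementation.
from hashlib import sha256

KNOWN_HASHES = {"3f2a...", "9c17..."}   # hypothetical database of known-image digests
MATCH_THRESHOLD = 30                    # hypothetical: nothing is surfaced below this

def image_digest(image_bytes: bytes) -> str:
    # Stand-in only: a real system would use a perceptual hash so that
    # resized or re-encoded copies of the same image still match.
    return sha256(image_bytes).hexdigest()

def count_matches(library: list[bytes]) -> int:
    # Count how many images in the library match an entry in the database.
    return sum(1 for img in library if image_digest(img) in KNOWN_HASHES)

def flag_for_human_review(library: list[bytes]) -> bool:
    # Only collections exceeding the threshold are ever flagged;
    # a single match, accidental or not, triggers nothing on its own.
    return count_matches(library) >= MATCH_THRESHOLD
```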
What tangible impact on the life of an everyday Chinese citizen are you expecting if Apple offered an E2EE messaging and cloud backup service in China? And why do you think the Chinese government would not just ban it and criminalise anyone found using a device which connected to Apple's servers, rendering any benefits moot?
(And why do you think it's morally right, or the responsibility of a foreign private company to try and force anything into China against their laws? Another commenter in a previous thread said the idea was for Apple to refuse to do business there - but that still leads to the question, how would that help anyone?)
> I think "Think of the kids" applies very well to the CREATORS of pornography. Per wikipedia, there isn't any conclusive causal relationship between viewing CP and assaulting children.
“Think of the Kids” damn well applies to the consumers of this content - by definition, there is a kid (or baby, in some instances) involved in the CP. As a society, the United States draws the line at age 18 as the age of consent [the line has to be drawn somewhere, and this is a fairly settled argument]. So by definition, in the United States, the children in these pictures are victims who could not have consented.
Demand drives creation. Getting rid of it on one of the largest potential viewing and sharing platforms is a move in the right direction in addressing the problem.
What I haven’t seen from the tech community is the idea that this will be shut down if it goes too far or beyond this limited scope. Which I think it would be - people would get rid of iPhones if some of the other cases privacy advocates are talking about occur. And at that point they would have to scrap the program - so Apple is motivated to keep it limited in scope to something everyone can agree is abhorrent.
>Demand drives creation. Getting rid of it on one of the largest potential viewing and sharing platforms is a move in the right direction in addressing the problem.
Yeah, that focus has worked really well in the "war on some drugs," hasn't it?
I don't pretend to have all the answers (or any good ones, for that matter), but we know interdiction doesn't work.
Those who are going to engage in non-consensual behavior (with anyone, not just children) are going to do so whether or not they can view and share records of their abuse.
The current legal regime (in the US at least) creates a gaping hole where even if you don't know what you have (e.g., if someone sends you a child abuse photo without your knowledge or consent) you are guilty, as possession of child abuse images is a felony.
That's wrong. I don't know what the right way is, but adding software to millions of devices searching locally for such stuff creates an environment where literally anyone can be thrown in jail for receiving an unsolicited email or text message. That's not the kind of world in which I want to live.
Many years ago, I was visiting my brother and was taking photos of his two sons, at that time aged ~4.5 and ~2.
I took an entire roll of my brother, his wife and their kids. In one photo, the two boys are sitting on a staircase, and the younger one (none of us noticed, as he wasn't potty trained and hated pants) wasn't wearing any pants.
I took the film to a processor and got my prints in a couple of days. We all had a good laugh looking at the photos and realizing that my nephew wasn't wearing any pants.
There wasn't, when the photos were taken, nor when they were viewed, any abuse or sexual motives involved.
Were that to happen today, I would be sitting in a jail cell, looking at a lengthy prison sentence. And when done "repaying my debt to society" I'd be forced to register as a sex offender for the rest of my life.
Which is ridiculous on its face.
Unless and until we reform these insane and inane laws, I can't support such programs.
N.B.: I strongly believe that consent is never optional and that those under the age of consent cannot give it. As such, there should absolutely be accountability and consequences for those who abuse others, including children.
> Were that to happen today, I would be sitting in a jail cell, looking at a lengthy prison sentence.
No, you would not. I was ready to somewhat agree with you, but this is just false and has nothing to do with what you were talking about before. The law does not say that naked photos of (your or anyone else's) kids are inherently illegal, they have to actually be sexual in nature. And while the line is certainly not all that clear cut, a simple picture like you're describing would never meet that line.
I mean, let's be clear here: do you believe the law considers too much stuff to be CSAM, and if so, why? How would you prefer we redefine it?
> The law does not say that naked photos of (your or anyone else's) kids are inherently illegal, they have to actually be sexual in nature.
But that depends on who looks at it.
People have been arrested and (at least temporarily) lost custody of their children because someone called the police over perfectly normal family photos. I remember one case a few years ago where someone had gotten into trouble because one photo included a toddler wearing nothing (even facing away from the camera, if my memory serves me correctly) playing at the beach. When police realized this wasn't an offense, instead of apologizing they got hung up on another photo where kids were playing with an empty beer can.
That case further links to a couple of previous cases.
I'd say we should get police or health care to talk to the people who think perfectly normal images are sexual in nature, but until we get the laws changed, at least keep us safe.
> I mean, let's be clear here: do you believe the law considers too much stuff to be CSAM, and if so, why? How would you prefer we redefine it?
Another thing that comes up is that a lot of things that are legal might be in that database, because criminals might have a somewhat varied history.
Personally, I am a practicing conservative Christian, so this doesn't bother me at the moment since, for obvious reasons, I don't collect these kinds of images.
The reason I care is that every such capability will be abused, and below I present, in two easy steps, how it will go from today's well-intended system to a powerful tool for oppression:
1. Today it is pictures, but if getting around it is as simple as putting an image in a PDF, then obviously PDFs must be screened too. Same with zip files. Because otherwise it is so simple to circumvent that it is worthless.
2. Once you have such a system in place, it would be a shame not to use it for every other evil thing. Depending on where you live this might be anything: Muslim scriptures, atheist books or videos, Christian scriptures, Winnie the Pooh drawings - you name it and someone wants to censor it.
As soon as it is used in a negative way beyond CSAM scanning, it will cause people to sell their phones and stop using Apple products.
If Apple starts using the tech to scan for religious material, there will be significant market and legal backlash. I think the fact that CSAM scanning will stop if they push it too far will keep them in check to only do CSAM scanning.
Everyone can agree on using the tech for CSAM, but beyond that I don’t see Apple doing it. The tech community is reacting as if they already have.
Problem one is that Apple doesn't know what they are scanning for.
This is by design and actually a good thing.
It becomes a problem because of problem number two:
No one is accountable if someone gets their life ruined over a mistake in this database.
I'd actually be somewhat less hostile to this idea if there was more regulatory oversight:
- laws that punish police/officials if innocent people are harmed in any way
- mandatory technical audits as well as verification of what it is used for: Apple keeps logs of all signatures that "matched"/triggered as well as the raw files; these are provided to the court as part of any case that comes up. This way we could hopefully prevent most fishing expeditions - both wide and personalized ones - and also avoid any follow-up parallel constructions.
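A rough sketch of what one such audit record could look like, purely as an assumption about how the proposal might be implemented (the field names, storage format, and helper below are hypothetical, not anything Apple or a regulator has specified):

```python
# Hypothetical append-only audit log of hash matches, so that a court or an
# independent auditor can later verify exactly which signatures triggered
# and on what material. All field names and the file format are assumptions.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class MatchRecord:
    matched_signature: str  # identifier of the database entry that triggered
    file_digest: str        # digest of the raw file that matched
    device_id: str          # which device reported the match
    timestamp: float        # when the match was recorded

def append_audit_record(record: MatchRecord, log_path: str = "match_audit.jsonl") -> None:
    # Append-only: existing records are never rewritten, only added, so
    # tampering is at least detectable by comparing independently held copies.
    with open(log_path, "a") as log:
        log.write(json.dumps(asdict(record)) + "\n")

# Example: log a single hypothetical match.
append_audit_record(MatchRecord("sig-000123", "sha256:ab12cd34", "device-42", time.time()))
```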
I'm not saying I'd necessarily be OK with it but at that point there would be something to discuss.
It may be worth taking very seriously that you might be overestimating both how quickly regular people become aware of such events and how emphatically people will react.
> I'd say we should get police or health care to talk to the people who think perfectly normal images are sexual in nature, but until we get the laws changed, at least keep us safe.
Personally I don't find anecdotes convincing compared to the very real amount of CSAM (and actual child abuse) we already know exists and circulates in the wild, but I do get your point. That said personally I don't think changing the laws would really achieve what you want anyway - I don't think a random Walmart employee is up-to-date on the legal definitions of CSAM, they're going to potentially report it regardless of what the law is (and the question of whether this is a wider trend is debatable, again this is an anecdote).
With that, they were eventually found innocent, so the law already agrees what they did was perfectly fine, which was my original point. No it should not have taken that long, but then again we don't know much about the background of those who took them, so I'm not entirely sure we can easily determine how appropriate the response was. I'm certainly not trying to claim our system is perfect, but I'm also not convinced rolling back protections for abused children is a great idea without some solid evidence that it really isn't working.
> Another thing that comes up is that a lot of things that are legal might be in that database, because criminals might have a somewhat varied history.
That didn't really answer my question :P
I agree the database is suspect but I don't see how that has anything to do with the definition of CSAM. The legal definition of CSAM is not "anything in that database", and if we're already suggesting that there's stuff in there that's known to not be CSAM then how would changing the definition of CSAM help?
> Personally I don't find anecdotes convincing compared to the very real amount of CSAM (and actual child abuse) we already know exists
First: This is not hearsay or anecdotal evidence; this is multiple innocent real people getting their lives trashed to some degree before getting acquitted.
> I don't think a random Walmart employee is up-to-date on the legal definitions of CSAM, they're going to potentially report it regardless of what the law is (and the question of whether this is a wider trend is debatable, again this is an anecdote).
Fine, I too report a number of things to the police that might or might not be crimes. (An Eastern European car towing a Norwegian luxury car towards the border is one. Perfectly legal in one sense, but definitely something the police were happy to be told about so they could verify.)
> With that, they were eventually found innocent, so the law already agrees what they did was perfectly fine, which was my original point.
Remember the job of the police is more to keep law abiding citizens safe than to lock up offenders. If we could magically keep kids safe forever without catching would-be offenders I'd be happy with that.
Making innocent people's lives less safe for a marginally bigger chance to catch small fry (i.e. not producers) - does it matter?
The problem here and elsewhere is that police in many places don't have a good track record of throwing such cases out. Once you've been dragged through court for the most heinous crimes you don't get your life completely back.
If we knew police would always throw out such cases I'd still be against this but then it wouldn't be so obviously bad.
> First: This is not hearsay or anecdotal evidence; this is multiple innocent real people getting their lives trashed to some degree before getting acquitted.
"multiple" is still anecdotal, unless we have actual numbers on the issue. The question is how many of these cases actually happen vs. the number of times these types of investigations actually reveal something bad. Unless you never want kids saved from abuse there has to be some acceptable number of investigations that eventually get dropped.
> Remember the job of the police is more to keep law abiding citizens safe than to lock up offenders.
Maybe that should be their purpose, but in reality they're law enforcement; their job has nothing to do with keeping people safe. SCOTUS has confirmed as much: the police have no duty to protect people, only to enforce the law. However, I think we agree that's pretty problematic...
> Making innocent people's lives less safe for a marginally bigger chance to catch small fry (i.e. not producers) - does it matter?
I would point out that the children in this situation are law-abiding citizens as well, and they also deserve protection. Whether their lives were made more or less safe in this situation is debatable, but the decision was made with their safety in mind. For the few cases of a mistake being made like the one you presented, I could easily find similar cases where the kids were taken away and then it was found they were actually being abused. That's also why I pointed out your examples are only anecdotes; the big question is whether this is a one-off or a wider trend.
If reducing the police's ability to investigate these potential crimes would actually result in harm to more children, then you're really not achieving your goal of keeping people safer.
> The problem here and elsewhere is that police in many places don't have a good track record of throwing such cases out. Once you've been dragged through court for the most heinous crimes you don't get your life completely back.
Now this I agree with. The "not having a good record of throwing it out" I'm a little iffy on but generally agree with, and I definitely agree that public knowledge of being investigated for such a thing is damaging even if it turns out you're innocent, which isn't right. I can't really say I have much of a solution for that in a situation like this though; I don't think there's much of a way to not-publicly take the kids away - and maybe that should have a higher threshold, but I really don't know. As I mentioned earlier, we'd really need to look at the numbers to know that. For cases that don't involve a public component like that, though, I think there should be a lot more anonymity involved.
A large? portion of “sexual predators” are men peeing on the side of the interstate[1]. So it’s not far-fetched to think that a random pic would also land you in jail.
[1]: I looked up the cases of sex offenders living around me several years ago.
Random pictures aren’t going to be in the CSAM database and trigger review. And to have multiple CSAM hash matches on your phone is incredibly unlikely.
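As a rough back-of-the-envelope check on "incredibly unlikely": with a per-image false-match rate, library size, and review threshold that are all made-up assumptions here (not Apple's published figures), the chance of crossing the threshold by accident is vanishingly small.

```python
# Back-of-the-envelope estimate of accidentally hitting a review threshold.
# The false-match rate, library size, and threshold are assumed numbers.
from math import exp, factorial

p = 1e-6        # assumed per-image false-match probability
n = 20_000      # assumed number of photos in a library
t = 30          # assumed review threshold

lam = n * p     # expected number of false matches (0.02 with these numbers)
# Poisson tail P(X >= t); dominated by its first term when lam << t.
tail = sum(exp(-lam) * lam**k / factorial(k) for k in range(t, t + 50))
print(f"expected false matches: {lam:.3f}, P(>= {t} matches) ~ {tail:.2e}")
```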
An unsolicited email / having photos planted on your phone or equipment is a problem today as much as it will be then, but I think people over-estimate the probability of this and ignore the fact it could easily happen today, with an “anonymous tip” called into the authorities.
If they are scanning it through iMessage they will have logs of when it arrived, and where it came from as well - so in that case it might protect a victim who is being framed.
That is such a tortured, backwards argument, but the only one that has even a semblance of logic so it gets repeated ad nauseam. Why be so coy about the real reasons?
Any reach into privacy, even while "thinking of the kids" is an overreach. Police can't open your mail or search your house without a warrant. The same should apply to your packets and your devices.
Why not police these groups with.. you know.. regular policing? Infiltrate, gather evidence, arrest. This strategy has worked for centuries to combat all manner of organized crime. I don't see why it's any different here.
Devil's advocate: It may not be mostly big organized crime.
It may be hard to fight, because it's not groups. It mostly comes from people close to the families, or the families themselves.
Here's a relevant extract sourced from Wikipedia:
"Most sexual abuse offenders are acquainted with their victims; approximately 30% are relatives of the child, most often brothers, sisters, fathers, mothers, uncles or cousins; around 60% are other acquaintances such as friends of the family, babysitters, or neighbours; strangers are the offenders in approximately 10% of child sexual abuse cases.[53] In over one-third of cases, the perpetrator is also a minor.[56]"
Content warning: The WP article contains pictures of abused children.
Sure, but the people sharing these images do so in organized groups, often referred to as "rings". I agree it would be very hard to catch a solitary perpetrator abusing children and not sharing the images. However since they would be creating novel images with new hashes, Apple's system wouldn't do much to help catch them would it?
The laws regarding CSAM/CSA are not the problem; they are fine. The problem is that we are expected to give up our rights for the vague notion of 'protecting children' while the same authorities actively ignore ongoing CSA. The Sophie Long case is an excellent example where the police have no interest in investigating allegations of CSA. Why is it that resources are spent policing CSAM but not CSA? It is because it is about control and eroding our right to privacy.
I agree that our current legal and law enforcement system isn't up to speed with the 21st-century internet. And it has to be updated, because this kind of makes the internet a de facto lawless space, covering everything from stalking and harassment, through fraud, to contraband trafficking and CSAM.
I don't think full-scale surveillance is the way to go in free, democratic societies. It is the easiest one, though. Even more so if surveillance can be outsourced to private, global tech companies. It saves the pain of passing laws.
Talking about laws, those, along with legal proceedings, should be brought up to speed. If investigators, based on existing and potential new laws, convince a court to issue a warrant to surveil or search any of my devices, fine. Because then I have legal options to react to that. Having that surveillance incorporated into some opaque EULA from a company in the US, a company that can now enforce its standards on stuff I do, is nothing I would have thought would be acceptable. Not that I am shocked it is; I just wonder why that stuff still surprises me when it happens.
Going one step further, if FAANG manages to block right-to-repair laws it would be easy to block running non-approved OSes on their devices. Which would then be possible to enforce by law. Welcome to the Stasi's wet dream.
All of the 'bad things' you mention are already very illegal. Changing the laws in this case will only lead to tyranny. I cannot emphasize this enough, you cannot fix societal ills by simply making bad things illegal. Right to repair is of course crucial to free society.
By laws I mean the laws governing police work. And bringing police up to speed. Obviously CP and other things are already very much illegal; these laws are just hardly enforced online. That has to change.
Stalking and harassment aren't, at least over here. Victims are constantly left out in the cold. Same goes for fraud; most cases are not prosecuted. Especially if these cases cross state, and in the EU, national borders. Because it becomes inconvenient, police don't really bother. And when they do, the fraud is already done. The stalking went on for years. And nothing really improved.
Hell, do I miss the old internet. The one without social media.
International fraud is a really interesting problem. I've proposed mandatory international money transfer insurance that would pay out in case of court-decided fraud. It would make corrupt countries that look the other way on fraud within their borders crack down in order to preserve their international market access.
I have hands-on experience with what I'd call at least attempted fraud, with crypto. Back when Sweden thought about state-backed crypto, a lot of ads showed up where you could invest in that. I almost did; the call centers used Austrian numbers. Not sure if there was even any coin behind that. I reported it to the police and got a letter after a couple of months that the investigation had led nowhere and was dropped; apparently the Austrian authorities didn't find anything on the reported number.
A couple of hours online found that:
- the company behind it operated out of the UK
- the call center was not in Austria but had used local numbers for a while
- the company was known for that shady business but never even got a slap on the wrist
I decided to never count on authorities for that kind of fraud. Or report it, because that's just a waste of time, unless you lost a shitload of money.
There's a lot of dumb fraud. Being international of course makes everything harder. Usually the amount of investigation is related to how many people were scammed. People need to learn how to do due diligence, because the definition of fraud can get vague at times. I think your example is a good case of due diligence; it couldn't hurt to blog about their fraud though.
People did write about it; that's how I found out so much so quickly. I didn't invest, but came reasonably close. One could call it due diligence, but I can see how easy it is to fall for things like that. I got a lot more sceptical of these ads, even more so than before.
It’s not just thinking of the victims when they are kids - if they aren’t killed after the material is made, then they have issues for the rest of their lives, with a real cost to society.
We’re talking about a life-long impact from being abused in that stuff…