Does it matter? Unless they're going to totally change the technology, I don't see how they can do anything but buy time until it's reverse engineered. After all, the code runs locally.
If Apple wants to defend this they should try to explain how the system will work even if generating adversarial images is trivial.
Apple has outlined[1] multiple levels of protection in place for this:
1. You have to reach a threshold of matches before your account is flagged.
2. Once the threshold is reached, the matched images are checked against a different perceptual hash algorithm on Apple servers. This means an adversarial image would have to trigger a collision on two distinct hashing algorithms.
3. If both hash algorithms show a match, then “visual derivatives” (low-res versions) of the images are inspected by Apple to confirm they are CSAM.
Only after these three criteria are met is your account disabled and referred to NCMEC (the National Center for Missing & Exploited Children). NCMEC will then do their own review of the flagged images and refer to law enforcement if necessary.
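To make the sequencing concrete, here is a minimal sketch of that three-stage gate. All the names here (FlaggedImage, shouldReferAccount) are invented for illustration, plain booleans stand in for what Apple describes as cryptographic safety vouchers, and the threshold itself is unpublished, so it stays a parameter:

```swift
import Foundation

// Illustrative sketch of the three-stage pipeline described above.
// Names are assumptions; the real system uses cryptographic safety
// vouchers rather than plain booleans, and the threshold is unpublished.
struct FlaggedImage {
    let onDeviceHashMatch: Bool // stage 1: on-device match against the known database
    let serverHashMatch: Bool   // stage 2: independent perceptual hash, run server-side
    let reviewerConfirmed: Bool // stage 3: human review of the low-res visual derivative
}

func shouldReferAccount(_ images: [FlaggedImage], threshold: Int) -> Bool {
    // Stage 1: nothing happens until on-device matches cross the threshold.
    guard images.filter({ $0.onDeviceHashMatch }).count >= threshold else {
        return false
    }
    // Stages 2 and 3: each match must also collide on a second, distinct
    // hash algorithm AND be confirmed by a human looking at the derivative.
    let confirmed = images.filter {
        $0.onDeviceHashMatch && $0.serverHashMatch && $0.reviewerConfirmed
    }
    return confirmed.count >= threshold
}
```

The point of the structure: an adversarial image that fools one perceptual hash still has to collide on a second, different algorithm and then survive a human looking at it.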
I don’t believe Apple has said whether or not they send the images themselves in the initial referral to NCMEC, but law enforcement could easily get a warrant for them. iCloud Photos are encrypted at rest, but Apple holds the keys.
(Many have speculated that this CSAM local scanning feature is a precursor to Apple introducing full end-to-end encryption for all of iCloud. We’ll see.)
You are correct — most of the iCloud data is not end-to-end encrypted. Apple discusses which data is end-to-end encrypted at https://support.apple.com/en-us/HT202303
> As far as this is concerned, seems like if you don’t use iMessage or iCloud you’re safe for now.
Yes, this is correct. The Messages feature only applies to children under 18 who are in an iCloud Family, and the photo library feature only applies if you are using iCloud Photos.
Oh come on, do you really think that's their big plan? Announcing the scanning software in public and then abusing it? If they wanted to do illegal spying, they would do it properly, and without a second Snowden you would never hear about it.
I don't think it has anything to do with age. It has everything to do with you adding the phone to your family in Settings and declaring that it belongs to a child. You control the definition of “child.”
I could imagine an abusive partner enabling this to make sure their partner isn’t sexting other people. Given the pushback against AirTags, I’m surprised people aren’t more concerned.
Anyone 13 or older can remove themselves from a family sharing group. The only exception is if screen time is enabled and enforced for their device.
Frankly, if you have an abusive partner with physical control over you and a willingness to do this, the fact that Apple supports this technology is the least of your problems.
I’m not sure I am misunderstanding. This is another feature that allows someone with access to another person’s phone to enable stalkerware-like features.
- Matching against a known set of CSAM (Child Sexual Abuse Material) hashes occurs on-device (as opposed to the on-server matching done by many other providers)
- Multiple matches (unspecified threshold) are required to trigger a manual review of matched photos and potential account suspension (a toy sketch of this matching step follows below)
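As a toy illustration of the on-device matching step in the bullets above: a perceptual hash maps visually similar images to nearby bit strings, so "match" means a small Hamming distance rather than exact equality. This is a generic perceptual-hash comparison with invented names, not Apple's NeuralHash, whose protocol uses private set intersection so the device never learns the match result:

```swift
import Foundation

// Generic perceptual-hash matching against a known set -- NOT NeuralHash.
// All parameter names and the distance cutoff are illustrative assumptions.
func countMatches(photoHashes: [UInt64],
                  knownHashes: [UInt64],
                  maxHammingDistance: Int) -> Int {
    return photoHashes.filter { photo in
        knownHashes.contains { known in
            // XOR the hashes; the popcount of the result is the number of
            // differing bits (the Hamming distance between the two hashes).
            (photo ^ known).nonzeroBitCount <= maxHammingDistance
        }
    }.count
}

// Usage: flag for review only once the count crosses the threshold, e.g.
// countMatches(photoHashes: libraryHashes, knownHashes: databaseHashes,
//              maxHammingDistance: 4) >= threshold
```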
You should reread the first section of TFA, titled "Communication safety in Messages." This goes beyond the scope of CSAM: they're scanning all Messages photos sent to or from a minor's account for any possible nudity, not just CSAM hash-matching.
It sure seems like this is two different techniques/technologies at work. There is a CSAM detector using a known database, and then, it looks like, there is a separate model in Messages for detecting pornographic content without a known database.
It doesn't sound like it can "detect pornographic content" since the difference between pornography and nudity is not going to be reducible to a coefficient matrix.
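To spell out the contrast between the two mechanisms (all names and the 0.9 cutoff here are invented for illustration, not Apple's code): a hash lookup is a deterministic membership test against a database, while a classifier outputs a confidence score, and any cutoff on that score can encode "probably nudity" but not the context that separates pornography from, say, a medical photo.

```swift
import Foundation

// Illustrative contrast between the two mechanisms discussed above.
// Names and the 0.9 cutoff are assumptions for this sketch.
enum PhotoCheck {
    case hashLookup(found: Bool)         // iCloud Photos: known-database match
    case nudityClassifier(score: Double) // Messages: learned model, 0...1 score
}

func action(for check: PhotoCheck) -> String {
    switch check {
    case .hashLookup(let found):
        // Deterministic: the image either is or is not in the database.
        return found ? "report candidate" : "no action"
    case .nudityClassifier(let score):
        // Probabilistic: the cutoff is tuned, and the score says nothing
        // about intent or context -- only how "nude" the pixels look.
        return score > 0.9 ? "blur photo and warn" : "no action"
    }
}
```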
The page you linked to is for the Primetime Emmys, which are the biggest awards that make it into the main TV broadcast (Prime Video has 4 nominations there). But there are also the Creative Arts Emmys, which include many more awards, often in more technical categories (where Prime has an additional 14 nominations).
There are some features for group messaging on iMessage that aren’t available in Group MMS. If you add a non-iMessage user to your group, the group downgrades to using Group MMS, which does still work for basic messaging, but the group loses all of its iMessage-exclusive features.
Because the M1 chip has two RAM options in laptops today: 8GB and 16GB. There are two chip spots on the package, and either one spot is filled or both are. Apple is just putting those same two options into the iPad Pro, and since they already have a high-end 16GB configuration, they may as well include it in the high-end 1TB and 2TB SKUs.
I used to do culling in Photo Mechanic, then light edits in Lightroom and heavier edits in Photoshop. Now I use Lightroom on iPad for 95% of everything. Not a pro, just an enthusiast - and I don't take nearly as many photos now due to COVID.