

Vaccines.gov seems to be kept up to date for people in the US.


Apple has said this is not the final version of the hashing algorithm they will be using: https://www.vice.com/en/article/wx5yzq/apple-defends-its-ant...


Does it matter? Unless they're going to totally change the technology I don't see how they can do anything but buy time until it's reverse engineered. After all, the code runs locally.

If Apple wants to defend this they should try to explain how the system will work even if generating adversarial images is trivial.


Apple has outlined[1] multiple levels of protection in place for this:

1. You have to reach a threshold of matches before your account is flagged.

2. Once the threshold is reached, the matched images are checked against a different perceptual hash algorithm on Apple servers. This means an adversarial image would have to trigger a collision on two distinct hashing algorithms.

3. If both hash algorithms show a match, then the “visual derivatives” (low-res versions) of the images are inspected by Apple to confirm they are CSAM.

Only after these three criteria are met is your account disabled and referred to NCMEC. NCMEC will then do their own review of the flagged images and refer to law enforcement if necessary.

[1]: https://www.apple.com/child-safety/pdf/Security_Threat_Model...
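
Roughly, the escalation logic looks something like the sketch below. The names (FlaggedImage, shouldReferToNCMEC) are made up for illustration, not Apple's actual code.

    struct FlaggedImage {
        let onDeviceHashMatch: Bool       // 1. NeuralHash match produced on-device
        let serverHashMatch: Bool         // 2. independent perceptual hash checked server-side
        let reviewerConfirmedCSAM: Bool   // 3. Apple's human review of the low-res "visual derivative"
    }

    /// Referral only happens once enough images pass all three checks
    /// to cross the (unspecified) match threshold.
    func shouldReferToNCMEC(_ images: [FlaggedImage], threshold: Int) -> Bool {
        let confirmed = images.filter {
            $0.onDeviceHashMatch && $0.serverHashMatch && $0.reviewerConfirmedCSAM
        }
        return confirmed.count >= threshold
    }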


I do want to note that decrypting the low-res images would have to happen before step 2.


Doesn't disabling the account kind of also defeat the whole purpose?

I mean assuming the purpose is to catch child abusers and not merely to use this particular boogeyman to introduce a back door for later use.


Will the high-resolution images be collected and used as evidence? Or just the visual derivatives? That's not clear.


Currently, most likely.

I don’t believe Apple has said whether or not they send them in their initial referral to NCMEC, but law enforcement could easily get a warrant for them. iCloud Photos are encrypted at rest, but Apple has the keys.

(Many have speculated that this CSAM local scanning feature is a precursor to Apple introducing full end-to-end encryption for all of iCloud. We’ll see.)


You are correct — most of the iCloud data is not end-to-end encrypted. Apple discusses which data is end-to-end encrypted at https://support.apple.com/en-us/HT202303


> As far as this is concerned, seems like if you don’t use iMessage or iCloud you’re safe for now.

Yes, this is correct. The Messages feature only applies to children under 18 who are in an iCloud Family, and the photo library feature only applies if you are using iCloud Photos.


Ha ha. They have fully functional spying software installed on the phone, and the government will stop at these restrictions?


Oh come on, you really think that's their big plan? Announce the scanning software in public and then abuse it? If they wanted to do illegal spying, they would do it properly. And without a second Snowden, you would never hear about it.


I’m fairly certain the age is different per region and hopefully tied to the age of consent (in this particular case).


I don't think it has anything to do with age. It has everything to do with you adding the phone to your family under settings and declaring that it belongs to a child. You control the definition of child.


I could imagine an abusive partner enabling this to make sure their partner isn’t sexting other people. Given the pushback against AirTags, I’m surprised people aren’t more concerned.


Anyone 13 or older can remove themselves from a family sharing group. The only exception is if screen time is enabled and enforced for their device.

Frankly, if you have an abusive partner with physical control over you and a willingness to do this, the fact that Apple supports this technology is the least of your problems.


Except this would require the consent of the abused partner when creating the account to set an age under 13.

You can’t set this on other accounts in your family remotely.


You’re misunderstanding what this is if this is an actual concern of yours.


I’m not sure I’m misunderstanding. This is another feature that lets someone with access to another person’s phone enable stalkerware-like monitoring.


Would artificially inflating every child’s age to 18+ eliminate the iMessage problem?


Ending of fourth paragraph:

> This feature can be turned on or off by parents.


A summary of the photo scanning system:

- Only applies to photos uploaded to iCloud

- Matching against a known set of CSAM (Child Sexual Abuse Material) hashes occurs on-device (as opposed to the on-server matching done by many other providers)

- Multiple matches (unspecified threshold) are required to trigger a manual review of matched photos and potential account suspension
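
As a rough sketch of the on-device half, assuming a toy perceptualHash stand-in (the real system uses NeuralHash and a blinded hash database, not SHA-256 or a plain Set):

    import CryptoKit
    import Foundation

    // Toy stand-in for a perceptual hash; the real NeuralHash tolerates
    // resizing/recompression, unlike a cryptographic hash.
    func perceptualHash(_ photo: Data) -> Data {
        Data(SHA256.hash(data: photo))
    }

    // Hypothetical local copy of the known CSAM hash set (blinded in the
    // real design; a plain set here purely for illustration).
    let knownHashes: Set<Data> = []

    // Matching runs on-device for photos pending iCloud upload; manual
    // review is only triggered once the (unspecified) threshold is crossed.
    func exceedsReviewThreshold(_ photosPendingUpload: [Data], threshold: Int) -> Bool {
        let matches = photosPendingUpload
            .map(perceptualHash)
            .filter(knownHashes.contains)
        return matches.count >= threshold
    }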


You should reread the first section of TFA, titled "Communication safety in Messages." This goes beyond the scope of CSAM: they're scanning all Messages photos sent to or from a minor's account for any possible nudity, not just CSAM hash-matching.


It sure seems like these are two different techniques/technologies at work. There is a CSAM detector using a known hash database, and then, it looks like, there is a separate model in Messages for detecting pornographic content without a known database.


It doesn't sound like it can "detect pornographic content" since the difference between pornography and nudity is not going to be reducible to a coefficient matrix.


Yeah but that just blurs the image and notifies the parents. Pretty low impact for false positives.


Sorry for the confusion — I was referring to just the CSAM hash feature that uploads results to iCloud.

There is also scanning for nudity in the Messages app, but those scans happen on-device and the photos stay on-device even if nudity is detected.


Apple has been scanning iCloud for child abuse since (at least) Jan 2020 [1] [2]

[1] https://www.telegraph.co.uk/technology/2020/01/08/apple-scan... [2] https://web.archive.org/web/20200110193302/https://www.apple...


Read the article again. The on-device scanning generates the hashes on the local device.


The page you linked to is for the Primetime Emmys, which are the biggest awards that make it into the main TV broadcast (Prime Video has 4 nominations there). But there are also the Creative Arts Emmys, which include many more awards, often in more technical categories (where Prime has an additional 14 nominations).

https://en.m.wikipedia.org/wiki/73rd_Primetime_Creative_Arts...


Title should have a (2020)


There are some features for group messaging on iMessage that aren’t available in Group MMS. If you add a non-iMessage user to your group, the group downgrades to using Group MMS, which does still work for basic messaging, but the group loses all of its iMessage-exclusive features.
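
Conceptually, the fallback behaves something like this sketch (the names and types are my own illustration, not the actual Messages implementation):

    enum GroupTransport {
        case iMessage   // reactions, inline replies, group naming, etc.
        case groupMMS   // basic text and media only
    }

    struct Participant {
        let handle: String
        let usesIMessage: Bool
    }

    // If any member can't use iMessage, the whole group falls back to Group
    // MMS and the iMessage-only features are lost for everyone.
    func transport(for participants: [Participant]) -> GroupTransport {
        participants.allSatisfy(\.usesIMessage) ? .iMessage : .groupMMS
    }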


I think the really interesting question here is why does an iPad Pro have 16 GB of RAM at all?

The previous generation of iPad Pros maxed out at 6 GB, so this is quite a jump.

Adding Thunderbolt support is also a bit of a mystery.

I suspect this will all become clear at WWDC in June when Apple announces iPadOS 15.


Because the M1 chip has two RAM options in laptops today: 8 GB and 16 GB. There are two memory package spots on the processor, and either one spot is filled or both are. Apple is just putting those same two options into the iPad Pro, and since they already have a high-end SKU at 16 GB, they may as well include it in the high-end 1 TB and 2 TB iPad SKUs.


High-speed external storage and full XDR monitor support are a godsend for people like me who use their iPads to do all their photography work.


What is your workflow? Do you do your culling -> light edit -> retouch in different apps?


I used to do culling in Photo Mechanic, then light edits in Lightroom, and heavier edits in Photoshop. Now I almost exclusively use Lightroom on iPad for 95% of everything. Not a pro, just an enthusiast, and I don't take nearly as many photos now due to COVID.

