
But results about matches don't stay on the phone, which I think is clearly a violation of the statement (unless you are interpreting it in an extremely literal way).


It only gets sent to Apple if you've turned on iCloud Photos syncing, i.e., if you're already sending the photos to Apple.

That means the alternative would be server-side scanning: you send the photos to Apple and Apple scans them there. Either way the photo goes to Apple and metadata about CSAM matches gets generated. It's just a matter of where the scanning happens.

I'm also uneasy about it happening on the phone. But honestly, because the processing happens on-device, the photo itself can be encrypted before it ever reaches Apple's servers.
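To make that claim concrete, here is a minimal sketch of the on-device flow being described. The helper name prepare_upload is invented for illustration, and SHA-256 plus Fernet stand in for the real pipeline (Apple's actual system uses the NeuralHash perceptual hash and its safety-voucher scheme, not these primitives):

    import hashlib
    from cryptography.fernet import Fernet  # pip install cryptography

    def prepare_upload(photo: bytes, device_key: bytes):
        # 1. Scan on-device: derive the match hash locally (SHA-256 here
        #    as a stand-in for a perceptual hash like NeuralHash).
        photo_hash = hashlib.sha256(photo).digest()

        # 2. Because the scan already happened, the photo itself can be
        #    encrypted before upload; the server never needs the plaintext.
        ciphertext = Fernet(device_key).encrypt(photo)

        # 3. Ship the ciphertext plus hash-derived match data (see the
        #    voucher sketch further down) instead of the raw photo.
        return ciphertext, photo_hash

    key = Fernet.generate_key()
    blob, h = prepare_upload(b"...photo bytes...", key)

With server-side scanning, step 2 is impossible: the server has to see the plaintext to scan it. That's the trade-off the parent is pointing at.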

I’m basically working under the assumption that scanning for CSAM is legally required.


> I’m basically working under the assumption that scanning for CSAM is legally required.

It is explicitly not legally required in the US [1]. Providers are required to report "apparent CSAM" that they find on their own, but they are not compelled to search their servers or private devices for its presence.

And this is the case for a very good reason: if it were mandated by US law, prosecutions would be subject to much stronger Fourth Amendment review under the "state action doctrine" (i.e., the companies would be searching your files without probable cause as compelled agents of the government). The current arrangement evades this review under the very thin fig leaf that US providers are doing the searching on their own.

[1] https://crsreports.congress.gov/product/pdf/LSB/LSB10713


FOSTA/SESTA and other laws push back on that: a neutral host (website, hotel) can be held responsible for crimes committed on their property if the government decides they are generally aware of them. Apple doesn't want to be an accessory. So even if they can't be required to scan, they can be punished for not scanning if something illegal turns up.


IANAL and certainly don't want to defend those laws, but I believe FOSTA/SESTA ban providers from operating services with the intent to promote or facilitate various crimes. In other words, the provider has to knowingly distribute the material. I'm pretty sure that Apple encrypting its photo backup service would not meet that standard; and even if it did, and the only way to comply were enforced CSAM scanning, many CSAM prosecutions based on that scanning would probably be tossed out (per the state-action issue above).


As far as I know, it's not legally required, at least in the US, though I wouldn't be surprised if behind-the-scenes suggestions from the government were the inspiration for this. I gather the EU is in the process of trying to mandate something like it.

Which would be unfortunate. At that point, you won't be able to maintain digital privacy from the government without de facto becoming a criminal.

CSAM is, I think, simply the initial justification for these systems, since it's widely reviled. But the system itself is not CSAM-specific, and the temptation to expand its scope will likely be irresistible.

If your goal was to become an authoritarian tyrant, you would be very happy to have this in place. :-)


> But results about matches don't stay on the phone

Results about matches aren’t ever on the phone in the first place. Only the server can determine if there are any matches.
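To illustrate why, here is a toy sketch of that property in plain Python. This is emphatically not Apple's real protocol (which uses NeuralHash, elliptic-curve blinding / private set intersection, and threshold secret sharing); the kdf, make_voucher, and server_check helpers and all hash values are invented for illustration. What it does show is the shape of the guarantee: the device uploads a voucher per photo without ever learning whether it matched, and only the server, which holds the known-hash list, can detect a match and decrypt the associated metadata.

    import hashlib

    def kdf(h: bytes, n: int) -> bytes:
        # Toy keystream derived from a hash (stand-in for a real KDF + AEAD).
        out, ctr = b"", 0
        while len(out) < n:
            out += hashlib.sha256(h + ctr.to_bytes(4, "big")).digest()
            ctr += 1
        return out[:n]

    def make_voucher(photo_hash: bytes, payload: bytes):
        # Device side: a blind index plus the payload encrypted under a key
        # derived from the photo hash. The device never sees the bad-hash
        # list, so it cannot tell whether this voucher will match.
        index = hashlib.sha256(b"idx" + photo_hash).digest()
        ct = bytes(a ^ b for a, b in zip(payload, kdf(photo_hash, len(payload))))
        return index, ct

    def server_check(index, ct, bad_hashes):
        # Server side: only a hash already on the list reproduces the index,
        # and only that hash yields the decryption key.
        for h in bad_hashes:
            if hashlib.sha256(b"idx" + h).digest() == index:
                return bytes(a ^ b for a, b in zip(ct, kdf(h, len(ct))))
        return None  # no match: the ciphertext stays opaque to the server

    bad = {hashlib.sha256(b"known-bad-image").digest()}
    v = make_voucher(hashlib.sha256(b"vacation-photo").digest(), b"match metadata")
    print(server_check(*v, bad))  # None: server learns nothing about this photo
    v = make_voucher(hashlib.sha256(b"known-bad-image").digest(), b"match metadata")
    print(server_check(*v, bad))  # b'match metadata': detected server-side only

The point of the construction is that the match decision and the decryption key both live on the server side; at no point does the phone hold a "result" that could fail to stay on it.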



