
I wonder if they will push for client-side CSAM scanning again, since photos are now covered by end-to-end encryption under this announcement. As a consumer, it feels like two different teams, with two different ideas of which kind of consumer privacy should be protected, are pulling Apple in opposite directions.

Apple, the company that pushed client-side scanning and keeps expanding its ad platform, is now the same company releasing strengthened cloud data protection. Deduplication becomes impossible at any real scale, and as a safety measure Apple even turns off web access to iCloud when E2E cloud protection is turned on for the first time.
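
On the deduplication point, here is a minimal sketch of why per-user E2EE keys break content-based dedup. This is my own illustration, not Apple's actual scheme, and it uses the Python `cryptography` package; the fixed-nonce "provider key" case is deliberately simplified and would be insecure in practice.

    # Illustration only -- not Apple's design. With a provider-managed key the
    # same photo maps to the same stored object, so the server can dedupe by
    # hash. With per-user keys and random nonces, identical photos produce
    # unrelated ciphertexts and dedup is no longer possible server-side.
    import hashlib
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    photo = b"the exact same photo bytes, uploaded by two different users"

    def stored_hash(key: bytes, nonce: bytes) -> str:
        ciphertext = AESGCM(key).encrypt(nonce, photo, None)
        return hashlib.sha256(ciphertext).hexdigest()

    # Provider-managed key: identical plaintexts -> identical stored objects.
    provider_key = AESGCM.generate_key(bit_length=256)
    print(stored_hash(provider_key, bytes(12)) == stored_hash(provider_key, bytes(12)))  # True

    # Per-user E2EE keys and random nonces: identical plaintexts -> unrelated ciphertexts.
    print(stored_hash(AESGCM.generate_key(bit_length=256), os.urandom(12)) ==
          stored_hash(AESGCM.generate_key(bit_length=256), os.urandom(12)))              # False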

Apple has stated it will cache thumbnails using standard protections when sharing files, and sharing with "anyone with a link" will expose the unencrypted data to Apple's servers. I wonder if CSAM scanning could take place for those files only.



According to The Washington Post [0], "In a second victory for privacy advocates, Apple said it was dropping a plan to scan user photos for child sex abuse images. The company had paused that plan shortly after its announcement last year, as security experts argued that it would intrude on user’s device privacy and be subject to abuse."

[0]: https://www.washingtonpost.com/technology/2022/12/07/icloud-...


Thank you for the link; I had not come across that news. It seems like Apple is still scanning photos for nudity when they are sent to phones belonging to minors.

"When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. Similar protections are available if a child attempts to send photos that contain nudity. In both cases, children are given the option to message someone they trust for help if they choose.

Messages analyzes image attachments and determines if a photo contains nudity, while maintaining the end-to-end encryption of the messages. The feature is designed so that no indication of the detection of nudity ever leaves the device. Apple does not get access to the messages, and no notifications are sent to the parent or anyone else."

https://www.apple.com/child-safety/
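
For what it's worth, the property that quote describes (detection and blurring happen entirely on-device, and the result never leaves the phone) looks roughly like the sketch below. The classifier is a made-up stand-in, not Apple's implementation.

    # Sketch of the on-device flow described in the quote above. `nudity_score`
    # is a hypothetical placeholder for whatever local ML model Messages uses.
    from dataclasses import dataclass

    @dataclass
    class IncomingPhoto:
        pixels: bytes
        blurred: bool = False
        warning_shown: bool = False

    def nudity_score(pixels: bytes) -> float:
        """Hypothetical on-device classifier; returns a probability in [0, 1]."""
        return 0.0  # placeholder

    def handle_incoming(photo: IncomingPhoto, recipient_is_child: bool) -> IncomingPhoto:
        if recipient_is_child and nudity_score(photo.pixels) > 0.5:
            photo.blurred = True        # blur locally
            photo.warning_shown = True  # warn the child, offer resources
        # Note what is *not* here: no upload, no notification to Apple or a parent.
        # The detection result stays on the device, so the end-to-end encryption
        # of the message content itself is untouched.
        return photo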


It was client-side scanning only for stuff that was going to their servers, right?


Yes, and it was likely directly related to subsequently offering E2EE backups. Not "two different teams with two different visions".


They are offering E2EE despite not currently having any plans for client-side scanning of content. I have to imagine it's different teams, because I want to give the encryption team the benefit of the doubt.

I can't imagine people working on E2EE at Apple would be okay with client-side scanning. The reasoning isn't important; it's an easy slippery slope once implemented. I imagine the encryption team has to constantly push for consumer privacy in a climate where privacy is challenged and compromised for ad companies and governments. I would be absolutely shocked if there wasn't a large amount of internal pushback when the old CSAM detection plan was first announced publicly.



