Ok…so I’m aware there is a feature, “check for sensitive media,” that parents can turn on, and AI can send an alert to you if it seems like your kid might be texting nude pics…it only works with iMessage since Apple doesn’t have access to photos in other apps. No human sees the photos. But that isn’t the same as what you’re saying, and I don’t know if what you’re saying is accurate.
https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/
This is what I’m talking about.
And the issue with that parental control is: say you’re a gay kid in Iran who sends nudes to your boyfriend, which Apple then reports to your ultra-conservative parents. That’s not going to end well for you.
Apple Kills Its Plan to Scan Your Photos for CSAM
That headline literally says they’re not doing that. It was a well-meaning initiative that they rightfully backed down on when called out.
I am one of the first to typically assume malice or profit when a company does something, but I really think Apple was trying to do something good for society in a way that is otherwise as privacy-focused as they could be. They just didn’t stop to consider whether or not they should be proactive in legal matters, and when they got reamed by privacy advocates, they decided not to go forward with it.
Good on them for canceling those plans, but they only did so because of the massive public outcry. They still intended to start scanning your photos, and that is worrying.
However, I’m not denying that it’s probably still the most privacy-focused phone you can get. For now.
I mean, that’s a pretty niche case, and maybe your underage kid shouldn’t be sending nudes via iMessage anyway.
That’s a whole other discussion. It’s just one example anyway. My point still stands: this does not increase user privacy.