They’re hypocrites though. They brand themselves as privacy focused, and in some cases they actually are, but at the same time they’re also scanning your photos and messages and reporting to authorities/parents if there’s something inappropriate.
Inb4 the “no need to worry if you have nothing to hide” argument
Ok… so I’m aware there is a feature, “check for sensitive media,” that parents can turn on, and AI can send an alert to you if it seems like your kid might be texting nude pics. It only works with iMessage, since Apple doesn’t have access to photos in other apps, and no human sees the photos. But that isn’t the same as what you’re saying, and I don’t know if what you’re saying is accurate.
https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/
This is what I’m talking about.
And the issue with that parental control is: say you’re a gay kid in Iran who sends nudes to your boyfriend, which Apple then reports to your ultra-conservative parents. That’s not going to end well for you.
Apple Kills Its Plan to Scan Your Photos for CSAM
That headline literally says they’re not doing that. It was a well-meaning initiative that they rightfully backed down on when called out.
I am typically one of the first to assume malice or profit when a company does something, but I really think Apple was trying to do something good for society in a way that was otherwise as privacy-focused as they could manage. They just didn’t stop to consider whether they should be proactive in legal matters, and when they got reamed by privacy advocates, they decided not to go forward with it.
I mean, that’s a pretty niche case, and maybe your underage kid shouldn’t be sending nudes via iMessage anyway.