Call me crazy, but I don’t think corporations should be in the business of scanning everyone’s private data on behalf of “the authorities”.
Too many ways it can go very wrong.
Also, I feel like there’s probably not much of that on Apple’s servers… Wouldn’t that kind of cloud service be the last place you’d want to put illegal pictures? If I were trying to hide felony pics or data, I wouldn’t trust a large corporation’s cloud services.
All big tech analyzes our data. I’d rather they not analyze anything, but since we’ll never see that day, they can at least use their privacy-invasion power for good.
The thing is that while many companies have access to your data in various services, Apple has designed their systems such that they can’t access most user data. It can’t be both ways: your data is either private or it isn’t, and many would prefer it stay private.
As I understand it, the actual situation with iCloud and CSAM scanning is that Apple does scan iCloud Photos (the ones users choose to upload to iCloud) when it can. A few years ago they tried to design a privacy-focused version of that scanning that would let them detect and report that kind of content while preserving the user’s privacy. It was supposed to happen on device (while most companies only scan photos on their servers), before the photos were uploaded, and use hashes to compare user photos against known CSAM material. This seemed an odd thing at the time, but a while later Apple released end-to-end encryption for iCloud Photos, which means they can’t scan the uploaded photos anymore because they no longer have that access. Some theorize that the big tech companies have regular contact with various government/law enforcement/etc. agencies, and that the on-device scanning was negotiated by them as a response to Apple’s plans to add E2E encryption to iCloud Photos, among other previously less secure services.
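The hash-comparison idea above can be sketched in a few lines. This is a simplified illustration with assumed filenames (`photo.jpg`, `known_hashes.txt`); real systems like Apple’s NeuralHash or Microsoft’s PhotoDNA use *perceptual* hashes that survive resizing and re-encoding, not an exact cryptographic hash like SHA-256.

```shell
# Demo setup: a stand-in "photo" and an empty database of flagged hashes.
printf 'not a real image' > photo.jpg
: > known_hashes.txt

# Hash the local file, then look for an exact match in the database.
HASH=$(sha256sum photo.jpg | awk '{print $1}')
if grep -qxF "$HASH" known_hashes.txt; then
  RESULT="match: would be flagged for review"
else
  RESULT="no match"
fi
echo "$RESULT"
```

The point of doing this on device is that only a match result (or a cryptographic voucher derived from it) would ever need to leave the phone, rather than the photo itself.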
There’s also no way to validate that Apple’s E2EE operates as stated. They could have added a backdoor for themselves or “intelligence” agencies, and we have no way of knowing other than “trust us”. Even if the source code is ever leaked (or a backdoor exploited by hackers), it could be written with plausible deniability — in such a way that it could be interpreted as unintentional (a bug/error).
This is why you should never trust closed source code with your sensitive data, and encrypt it yourself using open source, widespread/trusted, audited tools before uploading it to someone else’s computer.
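As a concrete example of that encrypt-before-upload advice, here is a hypothetical workflow using `openssl` (filenames and passphrase are illustrative; in practice you’d use a proper passphrase manager and an audited tool such as GnuPG or age):

```shell
# Create some plaintext, then encrypt it locally before any upload.
printf 'sensitive notes' > notes.txt
openssl enc -aes-256-cbc -pbkdf2 -salt \
  -in notes.txt -out notes.txt.enc -pass pass:correct-horse

# Upload notes.txt.enc; the provider only ever holds ciphertext.

# Decrypting locally round-trips the original data.
openssl enc -d -aes-256-cbc -pbkdf2 \
  -in notes.txt.enc -out roundtrip.txt -pass pass:correct-horse
cmp -s notes.txt roundtrip.txt && echo "roundtrip ok"
```

Because the key never leaves your machine, it doesn’t matter whether the provider’s own encryption has a backdoor.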
Some nits: Apple can access many classes of data stored in iCloud by default (including any photos), even now, but you can now make almost every class end-to-end encrypted if you explicitly choose to. Previously, and by default now, it’s Apple’s policy and internal controls over the keys your data is encrypted with that protect that data, not the encryption itself (though you can opt in to the encryption itself protecting you from Apple). From what I understand, Apple is only known to actually scan iCloud mailboxes regularly, with the on-device scanning having never been implemented. Outside of nits, considering the delay between the proposed scanning and the offering of a wider E2EE program for iCloud, I myself doubt the two are actually related.
This is exactly what Apple wanted to do, and lots of people (myself included) were against it because it would involve Apple scanning data on your phone. Sure, it was only at the point of deciding to upload photos to the cloud, but it was still unacceptable to scan our phones for data that hadn’t been uploaded yet.
Suppression of political messages the company doesn’t agree with?
This title is misleading clickbait for an article advocating intrusive data scanning, which, by the way, cannot be completely automated.
Here’s a snippet of the iCloud TOS, which specifically forbids CSAM on iCloud.
You agree that you will NOT use the Service to:
a. upload, download, post, email, transmit, store, share, import or otherwise make available any Content that is unlawful, harassing, threatening, harmful, tortious, defamatory, libelous, abusive, violent, obscene, vulgar, invasive of another’s privacy…
Further down, the same TOS specifically calls out that such content may be identified or removed by Apple.
Again, not defending Apple, but I’d rather not have them or an army of underpaid contractors search through people’s pictures as a type of corporate law enforcement because “think of the children”. This is a systemic problem which can be addressed without invading EVERYONE’s privacy.
Illegal drugs are transported on the interstate. The Department of Transportation should be held accountable for not searching every vehicle.
If that is true, provide the evidence to the FBI and they’ll arrest people. That isn’t what is done, because they have no evidence and breaking encryption is the real goal.