Apple's recent and highly controversial announcement that it will soon implement a new way of scanning for Child Sexual Abuse Material (CSAM) in iCloud Photos has prompted some interesting and important discussions about privacy. Unfortunately, it has also stoked the fears of many iPhone users, who worry their devices are becoming participants in a new "surveillance state." While Apple has been going to great lengths to assuage these concerns - even to the point of admitting that it messed up on how the new feature was announced - the initiative has also shown how many folks have erroneously believed that the information they store on their iPhone was inherently secure to begin with.

It's an understandable belief, of course. After all, Apple is the company that posts strong privacy marketing on billboards for all to see, the most famous of them being the Vegas-inspired slogan, "What happens on your iPhone, stays on your iPhone." To be fair, that statement is technically true, but it carries a level of nuance that many users don't necessarily understand. Yes, what happens on your iPhone stays on your iPhone - until it leaves your iPhone to go into iCloud.

Technically speaking, Apple doesn't force anybody to use iCloud, but it most definitely encourages it: when you set up a new iPhone, almost everything in iCloud is automatically enabled - including iCloud Backups and iCloud Photo Library. Since iCloud is effectively on by default for most iPhone users, this means that there are a lot of things happening on your iPhone that are not actually staying on your iPhone.

While everything on your iPhone is securely encrypted - to the point where Apple is routinely pilloried by the FBI and lawmakers for making it too secure - this is not actually the case with much of what you store in iCloud. Apple uses end-to-end encryption (E2EE) only for a few specific areas of particularly sensitive content, such as your passwords, health data, and HomeKit data, but vast swaths of personal information, including your iCloud Backups and your iCloud Photo Library, are merely "encrypted at rest." This means that although they are stored on Apple's servers in encrypted form, this is done using a generic encryption key that Apple has access to. This type of encryption is designed to prevent disclosure as a result of data breaches, but it doesn't truly protect your privacy from Apple or anybody else who can legally compel Apple to give them access to your data.

This is ultimately why much of the controversy over Apple's new CSAM Detection features is a tempest in a teapot. Apple already has full access to every photo you store in iCloud, and has been scanning those for CSAM in the cloud for at least two years now - and probably quite a bit longer.

Update: It appears that the comments Jane Horvath made during the Chief Privacy Officer Roundtable at CES 2020 were misconstrued. Horvath was asked whether content uploaded to iCloud should be screened for CSAM, and she responded rather obliquely by saying Apple was "utilizing some technologies to help screen for child sexual abuse material." However, Apple recently clarified to Ben Lovejoy at 9to5Mac that this was in reference to scanning iCloud Mail attachments, which have always been completely unencrypted to begin with - even "at rest" on Apple's servers.

The newly announced CSAM Detection features simply move that scanning to the user's device, checking photos before they're uploaded to iCloud. In fact, the same logic applies to iCloud Backups, which aren't encrypted in any way either. If you're not using iCloud Photo Library, all the photos on your device will be included in your iCloud Backups by default (unless you turn them off - see below). To the best of our knowledge, Apple doesn't scan photos within iCloud Backups, since everything is just stored as a big blob of data and your photos aren't indexed by Apple, although it would still be possible for Apple to scan them if it really wanted to. More importantly, however, since this data isn't encrypted, Apple can and does provide it to law enforcement and other government agencies, since it has to comply with local laws. Indeed, rumour has it that it was pressure from the FBI that caused Apple to abandon plans for fully encrypted iCloud Backups.
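The difference between end-to-end encryption and "encrypted at rest" described above ultimately comes down to who holds the decryption key. The toy sketch below illustrates that distinction only - it uses a deliberately trivial XOR "cipher" and made-up key material, and it is in no way Apple's actual implementation:

```python
# Illustrative sketch (NOT real cryptography): "encrypted at rest" vs E2EE
# is a question of key custody, not of whether ciphertext exists.
import hashlib
from itertools import cycle

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    # XOR "cipher" purely for illustration - never use this for real data.
    return bytes(a ^ b for a, b in zip(data, cycle(key)))

toy_decrypt = toy_encrypt  # XOR is its own inverse

photo = b"family-photo-bytes"

# "Encrypted at rest": the provider generates and keeps the key, so it can
# decrypt (or be compelled to decrypt) your data at any time.
provider_key = hashlib.sha256(b"provider-managed master key").digest()
stored_blob = toy_encrypt(photo, provider_key)
assert toy_decrypt(stored_blob, provider_key) == photo  # provider can read it

# End-to-end encryption: the key is derived from a secret only the user
# knows (e.g. a passcode), so the provider stores ciphertext it cannot open.
user_key = hashlib.pbkdf2_hmac("sha256", b"user-passcode", b"salt", 100_000)
e2ee_blob = toy_encrypt(photo, user_key)
assert toy_decrypt(e2ee_blob, user_key) == photo       # user can recover it
assert toy_decrypt(e2ee_blob, provider_key) != photo   # provider cannot
```

Both blobs are "encrypted" on the server; only in the second case is the provider locked out - which is exactly why a breach-resistant scheme can still leave your data fully accessible to Apple and, by extension, to anyone who can legally compel Apple.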
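The general shape of the on-device checking the article describes - comparing each photo against a database of known CSAM fingerprints before upload - can be sketched as follows. This is a loose illustration only: Apple's actual system uses a perceptual "NeuralHash" with private set intersection and a match threshold, not the plain SHA-256 lookup and hypothetical data used here:

```python
# Hedged sketch of pre-upload matching against a known-hash database.
# All hashes and photo bytes here are hypothetical placeholders.
import hashlib

known_bad_hashes = {
    hashlib.sha256(b"known-flagged-image").hexdigest(),  # hypothetical entry
}

def matches_known_database(photo_bytes: bytes) -> bool:
    """Return True if the photo's fingerprint appears in the on-device database."""
    return hashlib.sha256(photo_bytes).hexdigest() in known_bad_hashes

photos = [b"vacation-photo", b"known-flagged-image", b"pet-photo"]
flagged = [p for p in photos if matches_known_database(p)]
# In Apple's actual design, a match doesn't block the upload; it attaches a
# cryptographic "safety voucher," and only after a threshold of matches can
# Apple inspect anything at all.
assert flagged == [b"known-flagged-image"]
```

The key point the article makes survives the simplification: the comparison happens on the device before the photo reaches iCloud, rather than on Apple's servers afterward.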