Hacker News

> Apple did not respond to a request for comment. “We have never built a back door or master key to any of our products, and we never will,” Apple said in February.

This must be some "technically correct" weasel words bullcrap, as without at least equivalent access there is no chance Apple would be operating in China.



Apple stores Chinese users' iCloud data, and the encryption keys to that data, in China, in a datacenter run by a state-owned firm [1].

[1] https://www.reuters.com/article/technology/apple-moves-to-st...


Basically, there is no backdoor. The front door is wide open, the government just needs to ask. Or not even that -- just take whatever they need themselves.


Even in the US, even if Apple doesn’t have a backdoor, isn’t NSA linked up into the telecom companies already?


If there's properly implemented end-to-end encryption, then the NSA cannot see anything, even with full access to the telcos.
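A toy sketch of why that holds (illustrative only: a hash-based stream cipher standing in for a real one, with made-up names): the telco in the middle only ever relays ciphertext, so full access to the link reveals nothing about the plaintext.

```python
# Toy model of end-to-end encryption. NOT a real cipher; for illustration only.
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream from key+nonce (toy stream cipher)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(16)
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ct))
    return bytes(a ^ b for a, b in zip(ct, ks))

# Alice and Bob share a key; the telco carrying the traffic does not have it.
shared_key = secrets.token_bytes(32)
msg = b"meet at noon"
wire = encrypt(shared_key, msg)          # all the telco ever sees
assert decrypt(shared_key, wire) == msg  # only the endpoints can read it
```

The point is structural, not about this particular (deliberately weak) cipher: whoever lacks the endpoint key holds only `wire`, no matter how much traffic they capture.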


But iCloud is not fully encrypted by default.


Could you elaborate?


iCloud Advanced Data Protection (the feature that TFA is referring to) is required for E2EE, and it is not enabled by default.

https://en.wikipedia.org/wiki/ICloud#Advanced_Data_Protectio...

> On December 7, 2022, Apple announced Advanced Data Protection for iCloud, an option to enable end-to-end encryption for almost all iCloud data including Backups, Notes, Photos, and more. The only data classes that are ineligible for Advanced Data Protection are Mail, Contacts, and Calendars, in order to preserve the ability to sync third-party clients with IMAP, CardDAV or CalDAV.
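A toy model of that default-vs-ADP distinction (illustrative class names and a deliberately simplistic XOR "cipher", not Apple's actual design): with standard protection the server escrows a copy of the key and can return plaintext on request; with end-to-end protection it holds only ciphertext.

```python
# Toy contrast between server-escrowed keys and end-to-end encryption.
# Illustration only; names and crypto are made up.
import hashlib
import secrets

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    # XOR against a hash-derived pad; NOT a real cipher.
    pad = hashlib.sha256(key).digest() * (len(data) // 32 + 1)
    return bytes(a ^ b for a, b in zip(data, pad))

toy_decrypt = toy_encrypt  # XOR is its own inverse

class StandardProtection:
    """Server stores the ciphertext AND a copy of the key."""
    def __init__(self):
        self.key = secrets.token_bytes(32)   # held server-side
        self.blob = None
    def upload(self, data):
        self.blob = toy_encrypt(self.key, data)
    def law_enforcement_request(self):
        # Provider has both the key and the blob, so it can hand over plaintext.
        return toy_decrypt(self.key, self.blob)

class EndToEndProtection:
    """Server stores only ciphertext; the key never leaves the device."""
    def __init__(self):
        self.blob = None
    def upload(self, device_key, data):
        self.blob = toy_encrypt(device_key, data)
    def law_enforcement_request(self):
        return self.blob  # ciphertext only; provider cannot decrypt

backup = b"photos and notes"
std = StandardProtection()
std.upload(backup)
assert std.law_enforcement_request() == backup   # plaintext handed over

device_key = secrets.token_bytes(32)
e2e = EndToEndProtection()
e2e.upload(device_key, backup)
assert e2e.law_enforcement_request() != backup   # only ciphertext available
```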


But they can store the traffic and decrypt it later, if feasible.


^ This. And remember, kids: data processing does not require a warrant.


> if the target uses iCloud backup, the encryption keys should also be provided with content return

https://s3.documentcloud.org/documents/21114562/jan-2021-fbi...


That is from before opt-in end-to-end encryption was added for iCloud backups.


Does Apple have any proof better than a whitepaper that they don't back up the keys anyway?


Several people have reverse engineered the protocol and clients. None have found any evidence that the keys are backed up anywhere as far as I know.


Which you can't verify, because everything is proprietary, so it's just their usual marketing play.


The whole point of encryption in transit is that it doesn't matter if the telecom companies aren't trusted: they still can't read the data.


All the in-transit encryption in the world won't matter if they've pwned the decrypted client device.

Every company involved, from your device's manufacturer, OS vendor, and telecom carrier to app distributors and third-party software providers, can be compelled to help make that happen.

And then there's always Cellebrite and friends.


This is not just encryption in transit or simplistic client-side encryption.

It is end-to-end encryption, where each device's key generation is handled by your phone's Secure Enclave.

This article is a decent starting point in terms of what Advanced Data Protection is:

https://support.apple.com/en-us/102651

If you want a deeper dive into the security engineering of iCloud Keychain, the second half of this Blackhat talk by Apple's head of Security Engineering & Architecture (SEAR) is really great:

Synchronizing secrets: https://youtu.be/BLGFriOKz6U?si=cY94TYo28bRj4G7y&t=1357


Does all of that matter if an attacker has access to your device and can take screenshots of your conversations, or read those conversations out of memory in their unencrypted state?


No it doesn't — that's a totally different threat model.

Advanced Data Protection is mostly concerned with protecting data from attackers on the server and in transit.

If you're interested in protections when an attacker has physical access to your device, you should read the "Encryption and Data Protection" section of Apple's Platform Security Guide.

Web: https://support.apple.com/guide/security/welcome/web

PDF: https://help.apple.com/pdf/security/en_US/apple-platform-sec...


In computer security, if an adversary has unlimited physical access to the hardware, it's considered game over.


The difference is that if the NSA has physical access to my phone, I'm probably aware of it. It makes routine fishing expeditions across broad populations much harder and more expensive, as well as easier to oppose.

If they can fish remotely and automatically, accountability goes completely out the window.


I'm aware of what E2EE is, all the encryption in the world does not matter if either end of the conversation is confiscated or pwned by adversaries.


>all the encryption in the world does not matter if either end of the conversation is confiscated or pwned by adversaries.

Yes, of course, but it's not so simple to bypass the hardware-enforced protections that exist both device-side and server-side. As far as I can tell, effort was made to design and architect everything in such a way that the protections can't be retroactively circumvented, even under legal compulsion.

Disclosure: I previously worked for Apple, but not on the design/implementation of any of this stuff and this is all my own opinions, not those of Apple.


You're thinking way too deep, when the whole OS, including the E2EE implementation, is proprietary and could be silently swapped for a targeted, backdoored variant, if it isn't backdoored by default.


A good portion of HN believes Apple would never do that so I wasn't going to bother with that angle because of the inevitable defensive posts it would generate in response.


Presumably the data is not that relevant; former NSA director Michael Hayden said: "We kill people based on metadata."


What are the chances that the NSA has a useful zero-day on the TLS encryption standard?


Probably not on the standard itself, but practically a guarantee they have attacks on the major implementations, especially OpenSSL.
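A minimal illustration of that standard-vs-implementation gap (names here are illustrative): HMAC itself is sound, but verifying a tag with `==` short-circuits on the first mismatched byte, leaking timing information to an attacker; `hmac.compare_digest` exists precisely to close that implementation hole.

```python
# The standard can be sound while an implementation leaks.
import hmac
import hashlib

key = b"server-secret"          # illustrative values
msg = b"GET /account"
good_tag = hmac.new(key, msg, hashlib.sha256).digest()

def verify_naive(tag: bytes) -> bool:
    # Short-circuits on the first differing byte: running time
    # depends on how long the matching prefix is (timing leak).
    return tag == good_tag

def verify_safe(tag: bytes) -> bool:
    # Constant-time comparison, designed for exactly this situation.
    return hmac.compare_digest(tag, good_tag)

forged = bytes(32)  # all-zero forgery attempt
assert verify_naive(good_tag) and verify_safe(good_tag)
assert not verify_naive(forged) and not verify_safe(forged)
```

Nothing in the HMAC or TLS specs is broken here; the naive comparison is purely an implementation bug, which is why attacks tend to target OpenSSL and friends rather than the standards themselves.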


What are the chances they just have everyone's private keys?


"We have never built." ok, so then who built it?


Go further: "We have never built a back door or master key"

The feature is not named "back door" or "master key". It's a feature of iMessage with various names such as "zero click".

Also note they have created features before that provide law-enforcement access to data at different stages of a pipeline.

https://en.wikipedia.org/wiki/IMessage#Security_and_privacy


> It's a feature of iMessage with various names such as "zero click".

All web results for `imessage "zero click"` are about vulnerabilities/exploits. Are you claiming some of these are intentional, or what?


It's been used by state actors for years. May as well publish the API for it. These aren't rare and special flowers that are each artisanally discovered after searching for a million hours. And it is nicely packaged in an app that is used for something simple like text messages.

"An Israeli spyware company has reportedly cut access to its clients in Italy following allegations that its product was used to target critics of the Italian government.

"The move comes after WhatsApp alleged last week that spyware made by Paragon Solutions was used to target 90 WhatsApp users in two dozen countries, including journalists and civil society members.

"Italy's government confirmed in a statement on Wednesday that seven mobile phone users in the country had been targeted by spyware on WhatsApp, calling the incident "particularly serious"."

https://www.bbc.com/news/articles/cvgmzdjw24yo


Why are you quoting an article about WhatsApp after making a claim about iMessage and what Apple builds in?

(Not that the article has any evidence supporting the same claim about WhatsApp, either. In fact, the article describes WhatsApp/Meta claiming to have "disrupted" the spying and reporting it to the Italian government.)


Is it a big deal though? Context matters. Everyone knows you don't do business in China without bending the knee to the government in all things. If you don't, you're shut down completely if you're lucky, imprisoned if not. Of course the CCP has access to something very close to 100% of devices in China.


What are you saying? I'm in China, and it's completely false. There is very lax enforcement, and where there is enforcement, there's always an opportunity to bribe.

The communist party doesn't have "access" to a billion phones: what would they even do with all that garbage. They can ask Wechat what you send your friends, sure, and that's enough to police most crimes.

We don't have the budget or organizational acumen of the NSA.


Access != "saving it all, all the time". If they determine you're an anti-CCP agent, then they'll get access to everything in your life, no warrants or requests involved.


Well, allegedly Apple could have designed, advised and tested the solution, and if a contractor builds it, then the statement is technically correct.

Apple (and many other organizations) contracts work out for liability reasons; this is not the first instance of it.

Also note the language - specifically a “back door” or “master key”. If you call it something else, literally anything else, the statement holds up.


Oops, I accidentally dropped this document on your desk marked CONFIDENTIAL. Please don't read it


> This must be some "technically correct" weasel words bullcrap

Is that even necessary? A gag order means they can't reveal backdoors, and their entire stack is so locked down that discovering them is hard and unlikely.


A gag order cannot force them to lie. A lie like that can even be illegal.

If there's a gag order, then companies say "we have a gag order," like Google and Twitter did back in the day when asked. And then they immediately started releasing transparency reports to show how many gag orders they receive, so the government couldn't say "we don't request anything."

https://transparencyreport.google.com/user-data/overview


I don't think this is true. There was the whole fiasco around a company's EULA canary that, if removed, was supposed to indicate the company had received a gag order. I don't believe it was ever actually tested in court; I believe it had to do with the FISA court system. I can't find a link, but there are definitely gray-area cases where courts have required entities to lie. https://en.m.wikipedia.org/wiki/Warrant_canary


This is absolutely a thing. For comparison, banking regulation basically makes banks extensions of the intelligence services and forces them to tip off law enforcement when "suspicious" account activity is detected. Usually banks will also close the account, but the regulations ensure the customer will never find out why, because the bank will stonewall them to ensure they don't accidentally tattle.

https://www.bitsaboutmoney.com/archive/debanking-and-debunki...

Even if no formal regulation to that effect exists for social media companies, intelligence agencies presumably have sufficient leverage to ensure similar access.


> A gag order cannot force them to lie.

Yes it can, that's the whole point.


No, where did you get that?

They can "no comment" all the way, but there's no way of legally forcing a commercial company to lie. The company can lie, of course, but that's its choice, not an order.


Maybe it's a different type of gag order? A subpoena can prohibit them from saying "we have a gag order."


Most likely the spokesperson just forgot about China. They're still a US company, after all, and US-centric thinking is the default.


In China they outsource that version of the iCloud services to a Chinese company hosted in China. The rest of the world is on Apple’s.


This. Just have a demented spokesperson.


They may not have built it, but it doesn’t mean they didn’t implement something built for them.


I mean, they just disabled Advanced Data Protection, which opened the data to normal law-enforcement requests, since it isn't E2E encrypted if you don't use Advanced Data Protection. I really don't think they needed to implement a new backdoor; they would just need to implement a procedure to fast-track UK requests.


That didn’t placate the UK government, because it didn’t just want access to British users’ data – it wanted access to any Apple user’s data from anywhere in the world.

I suspect that disabling advanced data protection in the UK was meant to let Apple say it was complying as far as it could while fighting the main order.


You still don't know who was lobbying the UK government to ask for this.


Apple didn't disable advanced data protection for those who had enabled it. Rather they removed the ability for new users to turn it on in the first place.

A future update was going to ask users themselves to disable it in order to continue backing up their phones to iCloud.


AKA "fuckin...england fucking chill, don't blow this. You're FVEY mate, why you go scaring everyone when you know I got this. Just relax, you really think TAO don't got this? The Equation Group? We got this, in there like swimwear."



