Trump is a red herring here. It's the Thiel and Altman connections that are significant.
This guy is married to one of the PayPal mafia, he's worked for Palantir, and Altman officiated his wedding.
He entered government while retaining investments in OpenAI, Anduril, SpaceX, the Boring Company and Neuralink. These are all Thiel or Musk connected companies and they stand to benefit directly from his policy decisions.[0]
Well... a 19th-century engineer could have a large multi-story brownstone with family and, more importantly, servants and house personnel. A butler, etc...
Your nose is literally a special flower. What smells good to it may not to another and vice versa. I far prefer the smell of pot smoke on the sidewalk to the smell of tobacco smoke. You youngsters missed the years of indoor workplace smoking and smoke breaks with 20 smokers surrounding the office entry door. It's just another smell to you. But for those of us who lived through the bad days of smoking, it's a toxic soup, a smoke inferno hell pit we're not thrilled about revisiting right outside of our favorite restaurant. A little bit of grass burning, no big deal. A cigarette and my meal's ruined.
Canyons can be a challenge. To paraphrase some signage along the way: down is optional. Up is not.
Going down to the river makes for a very long day. I've boated (part raft, part other) down the canyon but I've only hiked down to a spot part of the way and then back.
They're much more expensive than traditional silicon cells, they often use toxic materials (lead, cadmium, etc.), and IIRC their lifespans aren't as long. Unless you have significant space constraints, it's usually better just to get twice as many traditional panels.
Doesn't this mean that no matter how securely your phone is locked, Apple (and probably the three-letter agencies) can always unlock it by installing an appropriate update?
Not necessarily. If the secret is protected in the secure element by something only you can provide (physical presence of an RFID token, password, biometric, etc.), then it is OK.
BUT you must trust the entire Apple trusted chain to protect you.
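To illustrate why the secure element matters, here's a minimal sketch; the names and numbers are illustrative, not Apple's actual design. If the data key is derived from both the user's secret and a per-device key that never leaves the secure element, a pushed update alone can't recover the data, and any brute force has to run on the device, where the hardware can rate-limit it.

```python
import hashlib

# Hypothetical per-device key fused into the secure element; in a real
# design this value is never readable by software.
DEVICE_UID = b"per-device-key-fused-into-secure-element"

def derive_data_key(passcode: str) -> bytes:
    # Entangle the passcode with the device-bound key, so guessing
    # passcodes off-device (without DEVICE_UID) gets you nothing.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 100_000)

# Only the correct passcode reproduces the data key.
assert derive_data_key("1234") != derive_data_key("1235")
```

The design choice being sketched: the OS never holds the data key directly, so even a malicious update must still funnel guesses through the secure element.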
> If the secret is protected in the secure element against something only you can provide (physical presence of RFID, password, biometric etc) then it is ok.
But we already established unlocking is not possible, so following the argument, it's implied there is a side channel. Nothing but a secret in your brain is something only you can (willingly) provide. Certainly not biometric data, which you distribute freely at every moment. RFID can be relayed; see relay attacks on car key fobs.
If you can sidestep the password to install malware or a backdoor, that inherently compromises security.
If the data you care about is encrypted with a token locked behind your passcode input, and it isn't trivially brute-forceable (i.e., not a 4-digit numeric-only thing), then not easily, no.
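As a back-of-envelope on why passcode length and rate limiting both matter (the attempt rates below are assumed for illustration, not Apple's actual lockout schedule):

```python
def worst_case_hours(digits: int, seconds_per_attempt: float) -> float:
    """Worst-case time to try every numeric passcode of the given length."""
    return (10 ** digits) * seconds_per_attempt / 3600

# Unthrottled guessing, e.g. if key derivation could run on a fast host:
print(f"{worst_case_hours(4, 0.001):.4f} h")  # 4 digits: instant
print(f"{worst_case_hours(6, 0.001):.2f} h")  # 6 digits: well under an hour

# With a secure element enforcing an assumed 5 s per attempt:
print(f"{worst_case_hours(4, 5):.1f} h")      # 4 digits: about 14 h
print(f"{worst_case_hours(6, 5):.0f} h")      # 6 digits: roughly 58 days
```

The point being: a short numeric code only holds up if something like a secure element enforces the per-attempt delay; the code itself carries almost no entropy.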
Could they produce an update that is bespoke and stops encrypting the next time you unlock, push it to your phone before seizing it, wait for some phone home to tell them it worked, and then grab it?
Perhaps, but the barrier to making Apple do that is much higher than "give us the key you already have", and only works if it's a long planned thing, not a "we got this random phone, unlock it for us".
(It's also something of a mutually-assured destruction scenario - if you ever compel Apple to do that, and it's used in a scenario where it's visibly the case that 'the iPhone was backdoored' is the only way you could have gotten that data, it's game over for people trusting Apple devices to not do that, including in your own organization, even if you somehow found a legal way to compel them to not be permitted to do it for any other organization.)
> Perhaps, but the barrier to making Apple do that is much higher than "give us the key you already have", and only works if it's a long planned thing, not a "we got this random phone, unlock it for us".
The attack scenario would be e.g. the airport security check, where you have to part with your device for a moment. That's a common way for law enforcement and intelligence to get a backdoor onto a device; it happens all the time. You wouldn't be able to attribute it to Apple collaborating with agencies or to some zero-day exploit. For starters, you likely wouldn't be aware of the attack at all. If you came home to a shut-down phone, would you send your $1000 device to some security researcher thinking it's conceivably compromised, or just connect it to a charger?
If you can manually install anything on a locked phone, that significantly increases the attack surface. You wouldn't have to get around the individual device key to unlock it; you'd only have to subvert the code-verification process. The latter is an attractive target, since any exploit or leaked/stolen/shared key is potentially usable on many devices.
Part of the reason e.g. Cellebrite is obsessive about not telling people many specifics about their product capabilities outside of NDA is that Apple is quite serious about trying to fix these things, and "we can crack every iPhone before the 14" probably tells them a fair bit about what might have a flaw.
Tools like that lose a lot of value if anyone paying enough attention can infer they exist, even indirectly, like if all the TSA agents you know suddenly switch to Android phones, or some of them tell you not to bring iPhones through security and won't tell you why, or a thousand other vectors for rumors to start.
All it takes is enough rumors for people to say it's enough to not trust any more, and suddenly you've lost a lot of the value of a secret information source.
So if you have a tool like that, where most people don't think it's readily available, the way you probably use it is very sparingly, to keep it that way.
There is a difference between targeted software-supply-chain attacks and weakening encryption for everyone by introducing a master key. Apple could be required by US law to cooperate, and it might never become public either. But as I said, Apple doesn't have to know, or "know". This feature inherently compromises security. Unlike device encryption, OS update security depends on a single signing key held by Apple (or rather, by a handful of DevOps people), which could be stolen, leaked or shared.
Would you bet that the NSA can't sign iOS updates?
> So if you have a tool like that, where most people don't think it's readily available, the way you probably use it is very sparingly, to keep it that way.
Of course. This is reserved for targeted attacks against journalists and other enemies of the state.
> All it takes is enough rumors for people to say it's enough to not trust any more, and suddenly you've lost a lot of the value of a secret information source.
None of those articles are inconsistent with the claim that Apple cares about security, though?
"We can be legally compelled to give up data we have" and "we thought letting people have custom kernel modules was a bigger threat" are not particularly incompatible with "we design things so we don't have keys to your data we can be compelled to give up" and valuing people's security. (I am not a fan of the latter, to be clear, but there are reasonable reasons you could argue for it.)
But yes, I would probably, at the moment, bet that if the NSA can sign a custom iOS build for consumer hardware, Apple doesn't know how, both because that's a very hard secret to keep, and because you'd see a massive uptick in people avoiding Apple devices in governments of interest to US intelligence if even a rumor of that got out.
> None of those articles are inconsistent with the claim that Apple cares about security, though?
You are moving the goalpost.
> "We can be legally compelled to give up data we have" and "we thought letting people have custom kernel modules was a bigger threat" are not particularly incompatible with "we design things so we don't have keys to your data we can be compelled to give up" and valuing people's security. (I am not a fan of the latter, to be clear, but there are reasonable reasons you could argue for it.)
They do have the signing keys your iPhone will gladly accept to circumvent encryption, which is the argument.
I'm not the one moving the goalpost; my argument was that Apple's incentives weigh against permitting even the appearance that they might allow that kind of compromise. Your argument, with that wall of articles, appeared to be that Apple has a history of making decisions inconsistent with that, which I disputed. If that wasn't your intended argument, you might wish to be more explicit than a wall of links and "As if Apple users would care...".
> They do have the signing keys your iPhone will gladly accept to circumvent encryption, which is the argument.
Yes, and my argument is that the plumbing for this (either multiple release signing keys, one of which is never seen in the wild, or some way to keep a second "iOS 13.1.5" or whatever with different build information from showing up in various telemetry) is very difficult to have built without far too many people learning of it and spreading rumors, and even that rumor would be a problem.
So the most plausible thing, to me, would be that if such a capability exists, it's a "nuclear option" for whoever holds it to only use in a circumstance where it's so important they don't mind potentially never being able to use it again, whether that's because it's an exploit chain that will be fixed or because it's been coerced out of the target company and they will probably be compelled to fix it if it gets out.
The Phoenix contract predates the more recent efforts to switch to FOSS.
But also, Canada loves to burn money on American suppliers. It's probably why the recent interest in _Buy Canadian_ has the American administration annoyed.
Phoenix was a literal trap laid by the Conservative government just before leaving office, knowing it would be a shit show for the Liberals in the coming years.
Did you not get the memo?