Yea, I also would want to question the conclusions in the article. Was the issue that they couldn't unlock the iPhone, or that they had no reason to pursue the thread? To my understanding, the Apple ecosystem means that everything is synced together. If they already got into her laptop, wouldn't all of the iMessages, call history, and iCloud material already be synced there? What would be the gain of going after the phone, other than to make the case slightly more watertight?
Is there an implication here that they could get into an iPhone with lower security settings enabled? There's Advanced Data Protection, which E2EEs more of your data in iCloud. There's the FaceID unlock state, which US law enforcement can compel you to unlock; but penta-click the power button and you go into PIN unlock state, which they cannot compel you to unlock.
My understanding of Lockdown Mode was that it babyifies the device to reduce the attack surface against unknown zero-days. Does the government saying that Lockdown Mode barred them from entering imply that they've got an unknown zero-day that would work in the PIN-unlock state, but not Lockdown Mode?
> Natanson said she does not use biometrics for her devices, but after investigators told her to try, “when she applied her index finger to the fingerprint reader, the laptop unlocked.”
I want to say that was generous of her, but one thing that's weird: if I didn't want someone getting into my laptop and they tried to force me to unlock it with my fingerprint, I definitely wouldn't use the right finger on the first try. Hopefully, Apple locks it out and forces a password if you use the wrong finger "accidentally" a couple of times.
There appear to be relatively few possibilities.
* The reporter lied.
* The reporter forgot.
* Apple devices share fingerprint matching details and another device had her details (this is supposed to be impossible, and I have no reason to believe it isn't).
* The government hacked the computer such that it would unlock this way (probably impossible as well).
* The fingerprint security is much worse than years of evidence suggests.
Mainly it was buried at the very end of the article, and I thought it worth mentioning here in case people missed it.
My opinion is that she set it up, it didn't work at first, she didn't use it, forgot that it existed, and here we are.
> Apple devices share fingerprint matching details and another device had her details
I looked into this quite seriously for Windows ThinkPads: unless Apple does it differently, you cannot share fingerprints. They're stored in a local chip and never move.
The reporter lying or forgetting seems to be the clear answer, there's really no reason to believe it's not one of those. And the distinction between the two isn't really important from a technical perspective.
Fingerprint security being poor is also unlikely, because that would only apply if a different finger had been registered.
She has to have set it up before; there is no other way to divine a fingerprint. The only alternative would be a faulty fingerprint sensor, but that should default to denying entry.
The fingerprint sensor does not make access control decisions, so the fault would have to be somewhere else (e.g. the software code branch structure that decides what to do with the response from the secure enclave).
Could be a parallel construction type thing. They already have access but they need to document a legal action by which they could have acquired it so it doesn't get thrown out of court.
I think this is pretty unlikely here but it's within the realm of possibility.
I think they mean if they already have her fingerprint from somewhere else, and a secret backdoor into the laptop. Then they could login, setup biometrics and pretend they had first access when she unlocked it. All without revealing their backdoor.
It seems unfortunate that enhanced protection against physically attached devices requires enabling a mode that is much broader, and sounds like it has a noticeable impact on device functionality.
I never attach my iPhone to anything that's not a power source. I would totally enable an "enhanced protection for external accessories" mode. But I'm not going to enable a general "Lockdown mode" that Apple tells me means my "device won’t function like it typically does"
There is a setting as of iOS 26 under "Privacy & Security > Wired Accessories" in which you can make data connections always prompt for access. Not that there haven't been bypasses for this before, but perhaps still of interest to you.
GrapheneOS does this by default - only power delivery when locked. Also it's a hardware block, not software. Seems to be completely immune to these USB exploit tools.
It also has various options to adjust the behaviour, from no blocks at all, to not even being able to charge the phone (or use the phone to charge something else) -- even when unlocked. Changing the mode of operation requires the device PIN, just as changing the device PIN does.
Note that it behaves subtly differently from how you described if the phone was connected to something before being locked. In that case data access will remain, even though the phone is now locked, until the device is disconnected.
Can a hacked phone (such as one that was not in Lockdown Mode at one point in time) persist in a hacked state?
Obviously, the theoretical answer is yes, given an advanced-enough exploit. But let's say Apple is unaware of a specific rootkit. If each OS update is a wave, is the installed exploit more like a rowboat or a frigate? Will it likely be defeated accidentally by minor OS changes, or is it likely to endure?
This answer is actionable. If exploits are rowboats, installing developer OS betas might be security-enhancing: the exploit might break before the exploiters have a chance to update it.
Forget OS updates. The biggest obstacle to exploit persistence: a good old hard system reboot.
Modern iOS has an incredibly tight, secure chain-of-trust bootloader. If you shut your device down to a known-off state (using the hardware key sequence), then on power-on you can be 99.999% certain only Apple-signed code will run, all the way from SecureROM to iOS userland. The exception is if the SecureROM is somehow compromised and exploited (but that requires hardware access at boot time, so I don't buy it).
So, on a fresh boot, you are almost definitely running authentic Apple code. The easiest path to a form of persistence is reusing whatever vector initially pwned you (malicious attachment, website, etc) and being clever in placing it somewhere iOS will attempt to read it again on boot (and so automatically get pwned again).
But honestly, exploiting modern iOS is already difficult enough (exploits go for tens of millions of USD); persistence is an order of magnitude more difficult.
You should read up on iOS internals before commenting stuff like this. Your answer is wrong: rootkits have been dead on most OSes for years, and especially on iOS. Not every OS is like Linux, where security comes second.
Even a cursory glance would show it's literally impossible on iOS.
Re: reboots – TFA states that recent iPhones reboot every 3 days when inactive for the same reasons. Of course, now that we know that it's linked to inactivity, black hatters will know how to avoid it...
If they're not investigating her she doesn't have any 5th-amendment protection and can be compelled to testify on anything of relevance, including how to unlock her devices.
Don't be idiots. The FBI may say that whether or not they can get in:
1. If they can get in, now people - including high-value targets like journalists - will use bad security.
2. If the FBI (or another agency) has an unknown capability, the FBI must say they can't get in or reveal their capabilities to all adversaries, including to even higher-profile targets such as counter-intelligence targets. Saying nothing also risks revealing the capability.
3. Similarly if Apple helped them, Apple might insist that is not revealed. The same applies to any third party with the capability. (Also, less significantly, saying they can't get in puts more pressure on Apple and on creating backdoors, even if HN readers will see it the other way.)
Also, the target might think they are safe, which could be a tactical advantage. It also may exclude recovered data from rules of handling evidence, even if it's unusable in court. And at best they haven't got in yet - there may be an exploit to this OS version someday, and the FBI can try again then.
It sounds like almost all of our devices have security by annoyance as default. Where are the promises of E2E encryption and all the privacy measures? When I turned on lockdown mode on my iPhone, there were a few notifications where the random spam calls I get were attempting a FaceTime exploit. How come we have to wait until someone can prove ICE can't get into our devices?
I trust 404 media more than most sources, but I can’t help but reflexively read every story prominently showcasing the FBI’s supposed surveillance gaps as attempted watering hole attacks. The NSA almost certainly has hardware backdoors in Apple silicon, as disclosed a couple of years ago by the excellent researchers at Kaspersky. That being the case, Lockdown Mode is not even in play.
Even a parallel construction has limited uses, since you can't use the same excuse every time. The NSA probably doesn't trust the FBI to come up with something plausible.
I use the Cryptomator app for this, it works as advertised. I keep ~60 GiB of personal files in there that would be an easy button to steal my identity and savings. I'm just hoping it doesn't include an NSA back door.
Even if I had the skills to confirm the code is secure, how could I know that this is the code running on my phone, without also having the skills to build and deploy it from source?
Every time something like this happens I assume it is a covert marketing campaign.
If the government wants to get in they’re going to get in. They can also hold you in contempt until you do.
Don’t get me wrong, it’s a good thing that law enforcement can’t easily access this on their own. It just feels like the government is working with Apple here to help move some phones.
Better to be held in contempt than to give up constitutional rights under pressure. Most functioning democracies have and defend the right to a free press, protect press sources, and can't make you incriminate yourself.
Anyway, it's a good thing to be skeptical about claims that iphones can't be hacked by government agencies, as long as it doesn't mean you're driven to dodgier parties (as those are guaranteed honeypots).
"Government propaganda to help one of the richest companies in the history of the world sell 0.000000001% more phones this quarter" is quite frankly just idiotic.
You only said half the sentence anyway. The full sentence is: "If the government wants to get in they're going to get in, unless they want to utilize the courts in any way, in which case they have to do things the right way."
If this reporter was a terrorist in Yemen they would have just hacked her phone and/or blown up her apartment. Or even if they simply wanted to knock off her source they probably could have hacked it or gotten the information in some other illicit fashion. But that's not what is happening here.
Remember... they can make you use Touch ID... they can't make you give them your password.
https://x.com/runasand/status/2017659019251343763?s=20
The FBI was able to access Washington Post reporter Hannah Natanson's Signal messages because she used Signal on her work laptop. The laptop accepted Touch ID for authentication, meaning the agents were allowed to require her to unlock it.
They can hold you in contempt for 18 months for not giving up your password: https://arstechnica.com/tech-policy/2020/02/man-who-refused-....
Being held in contempt at least means you got a day in court first. A judge telling me to give up my password is different than a dozen armed, masked secret police telling me to.
That's a very unusual and narrow exception involving "foregone conclusion doctrine", an important fact missed by Ars Technica but elaborated on by AP: https://apnews.com/general-news-49da3a1e71f74e1c98012611aedc...
A link that doesn't directly support the unscrupulous-trillionaire-owned website: https://xcancel.com/runasand/status/2017659019251343763?s=20
Good reminder to also set up something that does this automatically for you:
https://news.ycombinator.com/item?id=46526010
I actually think it is fitting to read about a government agency weaponized by an unscrupulous billionaire going after journalists working for an unscrupulous billionaire on an unscrupulous trillionaire owned platform.
He probably hates me for my religion, but I think he has been a net positive for the US and the world?
Even if his total contribution is positive, his current contribution is quite bad. And most of that bad has been tied directly to X.
I can at least still voice opposition to the Israeli genocide there. I am good for now.
How many people do you think see those tweets, how many minds do you think you have changed, and at what mental cost to yourself?
I see other's tweets. I don't think most are being shadowbanned. I am doing fine myself and pretty productive actually.
What's the point of these questions? Seems like, "what's the point of dissent if the cards are stacked against you?"
He was begging to go party with someone that spent time in prison for child exploitation.
That in itself should make you hate the dude.
Yup. Hate him as person. But he is still net positive with his scientific/engineering contributions, is he not?
Wasn't Edison an asshole?
Dunno, I'd rather have unabused kids than the technological breakthroughs he has contributed to. Anyone being giddy to meet with a convicted pedo is very sus in my books, and deserves no respect, regardless of their prior contributions.
Children were exploited, and we're doing this net positive analysis on whether he should face the scorn. I'm not having a go at you - it's just frustrating to see very little happening after so much has been exposed, and I think part of it comes from this mindset - 'oh he's a good guy, this is a mistake/misstep' while people that were exploited as children can't even get their justice.
It's sickening.
Maybe. I don't think we yet have a good understanding of how many deaths he will have caused as a result of DOGE so abruptly cutting off assistance to so many vulnerable people around the world, but I've heard estimates hover around 600,000.
Assuming that number turns out to be close to reality, how do you weigh so many unnecessary deaths against VTOL rockets and electric cars?
Perhaps a practitioner of Effective Altruism could better answer that question.
I have seen, FIRST-HAND, corruption around USAID-style "assistance" back home. I fully support that work of his.
How so?
nasa is fucked up. spacex is US’s only shot.
Remember that our rights aren't laws of nature. They have to be fought for to be respected by the government.
Is the knowledge of which finger to use protected as much as a passcode? Law enforcement might have the authority to physically hold the owner's finger to the device, but it seems that the owner has the right to refuse to disclose which finger is the right one. If law enforcement doesn't guess correctly in a few tries, the device could lock itself and require the passcode.
Another reason to use my dog's nose instead of a fingerprint.
There's only ten possible guesses, and most people use their thumb and/or index finger, leaving four much likelier guesses.
Also, IANAL, but I'm pretty sure that if law enforcement has a warrant to seize property from you, they're not obligated to do so immediately the instant they see you - they could have someone follow you and watch to see how you unlock your phone before seizing it.
I really wish Apple would offer a PIN option on macOS, for precisely this reason. Either that, or an option to automatically disable Touch ID after a short amount of time (e.g. an hour, or if my phone doesn't connect to the laptop).
You can set up a separate account with a long password on macOS and remove your user account from the set of accounts that can unlock FileVault. Then you can change your account to use a short password. You can also change various settings for how long the Mac has to sleep before FileVault must be unlocked again.
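For anyone wanting to try this, a rough sketch of the setup (the account names are placeholders, and this is from memory, so verify against the man pages before relying on it):

```shell
# Create a separate admin account with a long password whose only job
# is unlocking FileVault ("-password -" prompts instead of echoing it):
sudo sysadminctl -addUser fvunlock -fullName "FileVault Unlock" -password - -admin

# List the accounts currently able to unlock FileVault at boot:
sudo fdesetup list

# Drop your daily account from that list; it can then use a short
# password at the login screen without weakening disk encryption:
sudo fdesetup remove -user youruser
```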
I didn’t understand how a user that cannot unlock FileVault helps. Can you please elaborate on this setup? Thanks.
As another alternative, rather than using Touch ID you can set up a YubiKey or similar hardware key for login to macOS. Then your login does indeed become a PIN with three tries before lockout. That plus a complex password is pretty convenient but not biometric. It's what I've done for a long time on my desktop devices.
On my Macbook Pro, I usually need to use both touch and a password but that might be only when some hours have passed between log ins.
A 0.1 chance per guess is already very good odds for an attacker, and roughly 0.1 * n over n tries is even better. Also, most people enroll two fingers for Touch ID, which brings the real-world number close to a half.
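The arithmetic is easy to check; a quick sketch, assuming 10 equally likely fingers, no repeated guesses, and a 3-try lockout (all assumptions on my part):

```shell
# Chance that at least one of k distinct blind guesses hits one of m
# enrolled fingers out of 10: 1 - C(10-m, k) / C(10, k).
# With m = 2 enrolled fingers and k = 3 tries before lockout:
awk 'BEGIN {
  m = 2; k = 3; miss = 1
  for (i = 0; i < k; i++) miss *= (10 - m - i) / (10 - i)
  printf "P(hit within %d tries) = %.3f\n", k, 1 - miss
}'
# prints: P(hit within 3 tries) = 0.533
```

So with two enrolled fingers and three guesses, the attacker really does sit just over a coin flip.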
I previously commented a solution to another problem, but it assists here too:
https://news.ycombinator.com/item?id=44746992
This command will make your MacBook hibernate when lid is closed or the laptop sleeps, so RAM is written to disk and the system powers down. The downside is that it does increase the amount of time it takes to resume.
A nice side benefit, though, is that a fingerprint is not accepted on first unlock; I believe secrets are still encrypted at this stage, similar to a cold boot. A fingerprint still unlocks from the screensaver normally, as long as the system does not sleep (and therefore hibernate).
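I can't inline the linked comment, but the standard way to get this behavior is `pmset` (a sketch from memory; check `man pmset` before running, and note that evicting the FileVault key is only meaningful with a hibernate mode that powers the machine down):

```shell
# hibernatemode 25: on sleep, write RAM to disk and power off entirely
sudo pmset -a hibernatemode 25

# Also evict the FileVault key on standby, so resume requires the full
# password rather than a fingerprint:
sudo pmset -a destroyfvkeyonstandby 1

# Verify the current settings:
pmset -g | grep -Ei 'hibernatemode|destroyfvkey'
```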
"Allowed to require" - a very mildly constructed phrase, which could cover torture or abuse of force...
https://xkcd.com/538/
Reminder that you can press the iPhone power button five times to require passcode for the next unlock.
Or squeeze the power and volume buttons for a couple of seconds. It’s good to practice both these gestures so that they become reflex, rather than trying to remember them when they’re needed.
Sadly, neither of those works on Android. Pressing the power button activates the emergency-call screen with a countdown to call emergency services, and power + volume either just takes a screenshot or toggles vibrate/haptics, depending on which volume button you press.
On Pixel phones, Power + Volume Up retrieves a menu where you can select "Lockdown".
Not on my Pixel phone, that just sets it to vibrate instead of ring. Holding down the power button retrieves a menu where you can select "Lockdown".
Oh wow, just going into the "should I shutdown" menu also goes into pre-boot lock state? I didn't know that.
It doesn't reenter a BFU state, but it requires a passcode for the next unlock.
It's close enough, because (most of) the encryption keys are wiped from memory every time the device is locked, and this action makes the secure enclave require PIN authentication to release them again.
Did you know that on most models of iPhone, saying "Hey Siri, whose iPhone is this?" will disable biometric authentication until the passcode is entered?
Serious question. If I am re-entering the US after traveling abroad, can customs legally ask me to turn the phone back on and/or seize my phone? I am a US citizen.
Out of habit, I keep my phone off during the flight and turn it on after clearing customs.
My understanding is that they can hold you for a couple of days without charges for your insubordination, but as a citizen they have to let you back into the country or officially arrest you, try to get an actual warrant, etc.
If you are a US citizen, you legally cannot be denied re-entry into the country for any reason, including not unlocking your phone. They can make it really annoying and detain you for a while, though.
Alternately, hold the power button and either volume button together for a few seconds.
This is the third person advocating button squeezing, as a reminder: IF a gun is on you the jig is up, you can be shot for resisting or reaching for a potential weapon. Wireless detonators do exist, don't f around please.
In case anyone is wondering: In newer versions of MacOS, the user must log out to require a password. Locking screen no longer requires password if Touch ID is enabled.
Settings -> lock screen -> “Require password after screen saver begins or display is turned off”
Is that actually true? I'm fairly confident my work Mac requires a password if it's idle more than a few days (typically over the weekend).
Shift+Option+Command+Q is your fastest route there, but unsaved work will block.
As if the government is not above breaking the law and using rubber-hose decryption. The current administration's Justice Department has been caught lying left and right.
I find it so frustrating that Lockdown Mode is so all-or-nothing.
I want some of the lockdown stuff (No facetime and message attachments from strangers, no link previews, no device connections), but like half of the other ones I don't want.
Why can't I just toggle an iMessage setting for "no link preview, no attachments", or a general setting for "no automatic device connection to untrusted computers while locked"? Why can't I turn off "random dickpicks from strangers on iMessage" without also turning off my browser's javascript JIT and a bunch of other random crap?
Sure, leave the "Lockdown mode" toggle so people who just want "give me all the security" can get it, but split out individual options too.
Just to go through the features I don't want:
* Lockdown Mode disables javascript JIT in the browser - I want fast javascript, I use some websites and apps that cannot function without it, and non-JIT js drains battery more
* Shared photo albums - I'm okay viewing shared photo albums from friends, but lockdown mode prevents you from even viewing them
* Configuration profiles - I need this to install custom fonts
Apple's refusal to split out more granular options here hurts my security.
The profiles language may be confusing -- what you can't do is change them while in Lockdown mode.
Family albums work with lockdown mode. You can also disable web restrictions per app and website.
>* Lockdown Mode disables javascript JIT in the browser - I want fast javascript, I use some websites and apps that cannot function without it, and non-JIT js drains battery more
This feature has the benefit of teaching users (correctly) that browsing the internet on a phone has always been a terrible idea.
I think that ship has sailed.
"Lockdown Mode is a sometimes overlooked feature of Apple devices that broadly make[sic] them harder to hack."
Funny to see disabling "features" itself described as "feature"
Why not call it a "setting"
Most iPhone users do not change default settings. That's why Google pays Apple billions of dollars for a default setting that sends data about users to Google
"Lockdown Mode" is not a default setting
The phrase "sometimes overlooked" is an understatement. It's not a default setting and almost no one uses it
If it is true Lockdown Mode makes iPhones "harder to hack", as the journalist contends, then it is also true that Apple's default settings make iPhones "easier to hack"
Sadly, they still got to her Signal on her desktop – her sources might still be compromised. It's inherent to desktop applications, but it's unfortunate that more people don't know that Signal for Desktop is much, much less secure against adversaries who have your laptop.
> I'm sad that a lot more people don't know that Signal for Desktop is much, much less secure against adversaries with your laptop
Educate us. What makes it less secure?
My assumption is that the key in the desktop version is not always stored in the secure enclave (it definitely supports plaintext storage). In theory that makes it possible to extract the key for the message database, and a different, malicious program could read it. But this is moot anyway if the FBI can browse through the chats; key extraction isn't what failed here.
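To make that concrete, here is a minimal sketch of why filesystem-level key storage is weak. The path and field name are assumptions for illustration (older Signal Desktop builds reportedly kept the database key in a plaintext JSON config; newer builds encrypt it with the OS keychain): any process running as the user can simply read the file.

```python
import json
import tempfile
from pathlib import Path

def read_db_key(profile_dir: str):
    """Read the message-database key from a plaintext config file.

    Illustrative only: the file name and key field are assumptions,
    not Signal's actual current layout.
    """
    config = Path(profile_dir) / "config.json"
    if not config.exists():
        return None
    # No secure enclave involved: anyone who can read this file
    # (any user-level process, or anyone holding the unlocked laptop)
    # gets the key.
    return json.loads(config.read_text()).get("key")

# Demo against a fake profile directory
demo_dir = tempfile.mkdtemp()
(Path(demo_dir) / "config.json").write_text(json.dumps({"key": "ab" * 32}))
recovered = read_db_key(demo_dir)
print(recovered)  # the full database key, no credentials required
```

A hardware-backed store (Keychain, TPM, secure enclave) would at least force the attacker to run code on the unlocked machine instead of just copying a file off the disk.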
I would have thought reporters with confidential sources at that level would already exercise basic security hygiene. Hopefully, this incident is a wake-up call for the rest.
If people don't have Signal set to delete sensitive messages quickly, then they may as well just be texting.
That's a strong statement. Also imho it's important that we use Signal for normal stuff like discussing where to get coffee tomorrow - no need for disappearing messages there.
Not if you're using Signal for life-and-death secure messaging; in that scenario it's table stakes.
I'm weird, i even have disappearing messages for my coffee chats. It's kind of refreshing not having any history.
I'm an inbox zero person... I keep even my personal notes to disappear after 2 days. For conversations 1 day.
Yea, I also would want to question the conclusions in the article. Was the issue that they couldn't unlock the iPhone, or that they had no reason to pursue the thread? To my understanding, the Apple ecosystem means that everything is synced together. If they already got into her laptop, wouldn't all of the iMessages, call history, and iCloud material already be synced there? What would be the gain of going after the phone, other than to make the case slightly more watertight?
Did she have Bitlocker or FileVault or other disk encryption that was breeched? (Or they took the system booted as TLAs seek to do?)
There was a story here the other day, bitlocker keys stored in your Microsoft account will be handed over.
breached
Is there an implication here that they could get into an iPhone with lower security settings enabled? There's Advanced Data Protection, which E2EEs more of your data in iCloud. There's the FaceID unlock state, which US law enforcement can compel you to unlock; but penta-click the power button and you go into PIN unlock state, which they cannot compel you to unlock.
My understanding of Lockdown Mode was that it babyifies the device to reduce the attack surface against unknown zero-days. Does the government saying that Lockdown Mode barred them from entering imply that they've got an unknown zero-day that would work in the PIN-unlock state, but not Lockdown Mode?
Yes
> Natanson said she does not use biometrics for her devices, but after investigators told her to try, “when she applied her index finger to the fingerprint reader, the laptop unlocked.”
Curious.
Probably enabled it at some point and forgot. Perhaps even during setup when the computer was new.
My recollection is the computers do by default ask the user to set up biometrics
I want to say that is generous of her, but one thing that is weird: if I didn't want someone to get into my laptop and they tried to force me to unlock it with my fingerprint, I definitely wouldn't use the correct finger on the first try. Hopefully, Apple locks it out and forces a password if you use the wrong finger “accidentally” a couple of times.
Correct. That’s why my Touch ID isn’t configured to use the obvious finger.
Very much so, because the question is... did she set it up in the past?
How did it know the print even?
Why is this curious?
There appear to be relatively few possibilities.
* The reporter lied.
* The reporter forgot.
* Apple devices share fingerprint matching details and another device had her details (this is supposed to be impossible, and I have no reason to believe it isn't).
* The government hacked the computer such that it would unlock this way (probably impossible as well).
* The fingerprint security is much worse than years of evidence suggests.
Mainly it was buried at the very end of the article, and I thought it worth mentioning here in case people missed it.
My opinion is that she set it up, it didn't work at first, she didn't use it, forgot that it existed, and here we are.
> Apple devices share fingerprint matching details and another device had her details
I looked into it quite seriously for Windows ThinkPads; unless Apple does it differently, you cannot share fingerprints: they live in a local chip and never leave it.
The reporter lying or forgetting seems to be the clear answer, there's really no reason to believe it's not one of those. And the distinction between the two isn't really important from a technical perspective.
Fingerprint security being poor is also unlikely, because that would only apply if a different finger had been registered.
She has to have set it up before. There is no way to divine a fingerprint any other way. I guess the only other way would be a faulty fingerprint sensor but that should default to a non-entry.
> faulty fingerprint sensor
The fingerprint sensor does not make access control decisions, so the fault would have to be somewhere else (e.g. the software code branch structure that decides what to do with the response from the secure enclave).
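As a conceptual sketch (not Apple's actual code), the division of labor looks roughly like this: the sensor produces a reading, the secure enclave returns a yes/no verdict, and ordinary software merely branches on that verdict. A faulty sensor can only feed the enclave bad readings; the unlock decision itself lives in the branch.

```python
from dataclasses import dataclass

@dataclass
class EnclaveVerdict:
    matched: bool

def secure_enclave_match(reading: bytes, enrolled: set) -> EnclaveVerdict:
    # Stand-in for the secure enclave: it alone compares the reading
    # against enrolled templates and returns only a yes/no verdict.
    return EnclaveVerdict(matched=reading in enrolled)

def handle_unlock(reading: bytes, enrolled: set) -> str:
    verdict = secure_enclave_match(reading, enrolled)
    # A bug would have to live in this branch for a bad reading to
    # unlock the device; the sensor never reaches the decision directly.
    return "unlocked" if verdict.matched else "locked"

enrolled = {b"index-finger-template"}
print(handle_unlock(b"index-finger-template", enrolled))  # unlocked
print(handle_unlock(b"garbage-reading", enrolled))        # locked
```

So "faulty sensor unlocks the laptop" would actually require a fault in the software that interprets the enclave's response, not in the sensor itself.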
Could be a parallel construction type thing. They already have access but they need to document a legal action by which they could have acquired it so it doesn't get thrown out of court.
I think this is pretty unlikely here but it's within the realm of possibility.
Seems like it would be hard to fake. The way she tells it, she put her finger on the pad and the OS unlocked the account. Sounds very difficult to stage.
I think they mean: if they already had her fingerprint from somewhere else, plus a secret backdoor into the laptop, they could log in, set up biometrics, and pretend they had first access when she unlocked it. All without revealing their backdoor.
https://archive.is/1ILVS
It seems unfortunate that enhanced protection against physically attached devices requires enabling a mode that is much broader, and sounds like it has a noticeable impact on device functionality.
I never attach my iPhone to anything that's not a power source. I would totally enable an "enhanced protection for external accessories" mode. But I'm not going to enable a general "Lockdown mode" that Apple tells me means my "device won’t function like it typically does"
There is a setting as of iOS 26 under "Privacy & Security > Wired Accessories" in which you can make data connections always prompt for access. Not that there haven't been bypasses for this before, but perhaps still of interest to you.
> I would totally enable an "enhanced protection for external accessories" mode.
Anyone has been able to do this for over a decade now, and it's fairly straightforward:
- 2014: https://www.zdziarski.com/blog/?p=2589
- recent: https://reincubate.com/support/how-to/pair-lock-supervise-ip...
This goes beyond the "wired accessories" toggle.
GrapheneOS does this by default - only power delivery when locked. Also it's a hardware block, not software. Seems to be completely immune to these USB exploit tools.
It also has various options to adjust the behaviour, from no blocks at all, to not even being able to charge the phone (or use the phone to charge something else) -- even when unlocked. Changing the mode of operation requires the device PIN, just as changing the device PIN does.
Note that it behaves subtly differently from how you described if the phone was connected to something before being locked. In that case data access remains, even though the phone is now locked, until the device is disconnected.
> it has a noticeable impact on device functionality.
The lack of optional granularity on security settings is super frustrating because it leads to many users just opting out of any heightened security.
It isn’t. Settings > Privacy & Security > Wired Accessories
Set to ask for new accessories or always ask.
I have to warn you, it does get annoying when you plug in your power-only cable and it still nags you with the question. But it does work as intended!
You might want to check that charger. I have the same option set to ask every time and it never appears for chargers.
Computer security is generally inversely proportional to convenience. Best opsec is generally to have multiple devices.
> I never attach my iPhone to anything that's not a power source.
It's "attached" to the wifi and to the cell network. Pretty much the same thing.
Can't they just use Pegasus or Cellebrite???
Can a hacked phone (such as one that was not in Lockdown Mode at one point in time) persist in a hacked state?
Obviously, the theoretical answer is yes, given an advanced-enough exploit. But let's say Apple is unaware of a specific rootkit. If each OS update is a wave, is the installed exploit more like a rowboat or a frigate? Will it likely be defeated accidentally by minor OS changes, or is it likely to endure?
This answer is actionable. If exploits are rowboats, installing developer OS betas might be security-enhancing: the exploit might break before the exploiters have a chance to update it.
Forget OS updates. The biggest obstacle to exploit persistence: a good old hard system reboot.
Modern iOS has an incredibly tight secure chain-of-trust bootloader. If you shut your device down to a known-off state (using the hardware key sequence), then on power-on you can be 99.999% certain only Apple-signed code will run all the way from secureROM to iOS userland. The exception is a compromised secureROM, and exploiting that requires hardware access at boot time, so I don't buy it.
So, on a fresh boot, you are almost definitely running authentic Apple code. The easiest path to a form of persistence is reusing whatever vector initially pwned you (malicious attachment, website, etc) and being clever in placing it somewhere iOS will attempt to read it again on boot (and so automatically get pwned again).
But honestly, exploiting modern iOS is already difficult enough (exploits go for tens of millions of USD); persistence is an order of magnitude more difficult.
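The chain-of-trust idea can be sketched in a few lines. This is a simplification with a made-up key: real iOS uses asymmetric signatures rooted in an immutable boot ROM, not a shared HMAC secret, but the structure is the same: each stage verifies the next before handing off, and the chain halts on any mismatch.

```python
import hashlib
import hmac

ROOT_KEY = b"stand-in-for-the-vendor-signing-key"  # illustrative only

def sign(blob: bytes) -> bytes:
    return hmac.new(ROOT_KEY, blob, hashlib.sha256).digest()

def boot_chain(stages):
    """Run stages in order; each must verify before it executes."""
    booted = []
    for name, blob, sig in stages:
        if not hmac.compare_digest(sign(blob), sig):
            return booted, f"halt: {name} failed verification"
        booted.append(name)
    return booted, "booted to userland"

good = [(n, n.encode(), sign(n.encode()))
        for n in ("secureROM", "iBoot", "kernel", "userland")]
print(boot_chain(good))

# An implant persisted on disk can't produce a valid signature for its
# modified stage, so a hard reboot through the verified chain refuses
# to run it.
tampered = list(good)
tampered[2] = ("kernel", b"persisted-implant", good[2][2])
print(boot_chain(tampered))  # halts at the kernel stage
```

This is why the comment above frames a hard reboot as the biggest obstacle to persistence: any on-disk modification has to survive re-verification from the ROM up.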
It's why I keep my old iPhone XR on 15.x for jailbreaking reasons. I purchased a new phone specifically for the later versions and online banking.
Apple bought out all the jail breakers as Denuvo did for the game crackers.
> Apple bought out all the jail breakers as Denuvo did for the game crackers
Do you have sources for these statements?
Secure boot and verified system partition is supposed to help with that. It's for the same reason jailbreaks don't persist across reboots these days.
You should read up on iOS internals before commenting stuff like this. Your answer is wrong; rootkits have been dead on most OSes for years, and ESPECIALLY on iOS. Not every OS is like Linux, where security comes second.
Even a cursory glance, with even a basic understanding, would show it's practically impossible on iOS.
Re: reboots – TFA states that recent iPhones reboot every 3 days when inactive for the same reasons. Of course, now that we know that it's linked to inactivity, black hatters will know how to avoid it...
What is she investigated for?
They're not actually investigating her, they're investigating a source who leaked classified materials to her.
If they're not investigating her she doesn't have any 5th-amendment protection and can be compelled to testify on anything of relevance, including how to unlock her devices.
Did the individual store the classified material in the bathroom at his beach-side resort?
I guess they got a 404
We need a Lockdown mode for MacBooks as well!
Looks like it’s a feature: https://support.apple.com/en-us/105120
To save a click:
* Lockdown Mode needs to be turned on separately for your iPhone, iPad, and Mac.
* When you turn on Lockdown Mode for your iPhone, it's automatically turned on for your paired Apple Watch.
* When you turn on Lockdown Mode for one of your devices, you get prompts to turn it on for your other supported Apple devices.
A little too late for the 1,000 people hacked by Pegasus.
Don't be idiots. The FBI may claim they can't get in whether or not that's true:
1. If they can get in, then claiming they can't means people - including high-value targets like journalists - will keep relying on security that is actually broken.
2. If the FBI (or another agency) has an unknown capability, the FBI must say they can't get in or reveal their capabilities to all adversaries, including to even higher-profile targets such as counter-intelligence targets. Saying nothing also risks revealing the capability.
3. Similarly if Apple helped them, Apple might insist that is not revealed. The same applies to any third party with the capability. (Also, less significantly, saying they can't get in puts more pressure on Apple and on creating backdoors, even if HN readers will see it the other way.)
Also, the target might think they are safe, which could be a tactical advantage. It also may exclude recovered data from rules of handling evidence, even if it's unusable in court. And at best they haven't got in yet - there may be an exploit to this OS version someday, and the FBI can try again then.
It sounds like almost all of our devices have security by annoyance as default. Where are the promises of E2E encryption and all the privacy measures? When I turned on lockdown mode on my iPhone, there were a few notifications where the random spam calls I get were attempting a FaceTime exploit. How come we have to wait until someone can prove ICE can't get into our devices?
Previously, direct link to the court doc:
FBI unable to extract data from iPhone 13 in Lockdown Mode in high profile case [pdf]
https://storage.courtlistener.com/recap/gov.uscourts.vaed.58...
(https://news.ycombinator.com/item?id=46843967)
I trust 404 media more than most sources, but I can’t help but reflexively read every story prominently showcasing the FBI’s supposed surveillance gaps as attempted watering hole attacks. The NSA almost certainly has hardware backdoors in Apple silicon, as disclosed a couple of years ago by the excellent researchers at Kaspersky. That being the case, Lockdown Mode is not even in play.
The NSA is not going to tip its hand about any backdoors it had built into the hardware for something as small as this.
It depends on whether parallel construction can be used to provide deniability.
Even a parallel construction has limited uses, since you can't use the same excuse every time. The NSA probably doesn't trust the FBI to come up with something plausible.
Samsung phones have the Secure Folder, which can have a different, more secure password and stays encrypted even while the phone is on.
Secure Folder uses, or is in the process of migrating to, Android's native Private Space feature, which is available on all Android 15 phones.
I use the Cryptomator app for this, and it works as advertised. I keep ~60 GiB of personal files in there that would otherwise be an easy path to stealing my identity and savings. I'm just hoping it doesn't include an NSA back door.
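The core idea behind a vault like this can be sketched with the standard library. This is not Cryptomator's actual format, just the principle: the encryption key exists only as a slow derivation from your passphrase plus a random salt, so the ciphertext on disk is useless without the passphrase.

```python
import hashlib
import os

def derive_vault_key(passphrase: str, salt: bytes) -> bytes:
    # A deliberately slow KDF so offline brute force is expensive.
    # (Cryptomator itself uses scrypt; PBKDF2 keeps this sketch to
    # the standard library.)
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt,
                               iterations=200_000, dklen=32)

salt = os.urandom(16)  # stored alongside the vault, not secret
k1 = derive_vault_key("correct horse battery staple", salt)
k2 = derive_vault_key("correct horse battery staple", salt)
k3 = derive_vault_key("wrong guess", salt)

assert k1 == k2  # same passphrase + salt -> same key, every time
assert k1 != k3  # wrong passphrase -> a useless key
print(k1.hex()[:16])  # the key itself never touches disk
```

Whether you trust the rest of the pipeline (the cipher, the implementation, the build you actually installed) is a separate question, as the replies below get into.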
You can check the GitHub: https://github.com/cryptomator/ios
Even if I had the skills to confirm the code is secure, how could I know that this is the code running on my phone, without also having the skills to build and deploy it from source?
Also, you need to make sure that the installation process does not insert a backdoor into the code you built from source.
For now! They’ll get something from the open market, like the last time Apple refused to decrypt (or unlock?) a phone for them.
Yeah, this is low-stakes stuff; Pegasus has historically broken Apple phones easily. Bezos's nudes and Khashoggi know. (Not really Khashoggi, he's dead.)
They just need to ask Apple to unlock it. And they can't really refuse under US law
They can refuse, and they have refused. See San Bernardino and the concept of "compelled work".
Every time something like this happens I assume it is a covert marketing campaign.
If the government wants to get in, they’re going to get in. They can also hold you in contempt until you unlock it.
Don’t get me wrong, it’s a good thing that law enforcement cant easily access this on their own. Just feels like the government is working with Apple here to help move some phones.
Better to be held in contempt than to give up constitutional rights under pressure - most functioning democracies defend the right to a free press and the protection of press sources, and can't make you incriminate yourself.
Anyway, it's a good thing to be skeptical about claims that iphones can't be hacked by government agencies, as long as it doesn't mean you're driven to dodgier parties (as those are guaranteed honeypots).
"Government propaganda to help one of the richest companies in the history of the world sell 0.000000001% more phones this quarter" is quite frankly just idiotic.
You only said half the sentence anyway. The full sentence is: "If the government wants to get in they're going to get in, unless they want to utilize the courts in any way, in which case they have to do things the right way."
If this reporter was a terrorist in Yemen they would have just hacked her phone and/or blown up her apartment. Or even if they simply wanted to knock off her source they probably could have hacked it or gotten the information in some other illicit fashion. But that's not what is happening here.