
You can always count on someone coming along and defending the multi-trillion dollar corporation that just so happens to take a screenshot of your screen every few seconds (among many, many - too many other things)




A big demographic of HN users is people who want to be the multi-trillion-dollar corporation, so it's not too surprising. In this case, though, I think they are right. And I'm a big-time Microsoft hater.

The defenders of Microsoft are right?

How?

There is no point locking your laptop with a passphrase if that passphrase is thrown around.

Sure, maybe some thief can't get access, but they probably can if they can convince Microsoft to hand over the key.

Microsoft should not have the key; that's part of the whole point of FDE: nobody can access your drive except you.

The cost of this is that if you lose your key: you also lose the data.

We have trained users about this for a decade; there have been countless dialogues explaining it. And even if users were dumber than they are (they're not, despite what we're told: they just have fatigue from overstimulation due to shitty UX everywhere), it would still be a bad default.


Just to be clear: BitLocker is NOT encrypting with your login password! I could be a little fuzzy on the details, but I believe it works like this: your TPM (Trusted Platform Module) is able to decrypt your laptop, but will only do so if there is a fully signed and trusted boot chain. So if somebody gains access to your laptop and attempts to boot into anything other than Windows, it will ask for the BitLocker key because the TPM won't play ball.
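The "TPM won't play ball" idea can be sketched in a few lines. This is a hypothetical Python toy, not the real TPM interface: the "TPM" releases a sealed key only when the presented boot-chain measurements match the ones the key was sealed against.

```python
import hashlib
import secrets

class ToyTPM:
    """Toy model of TPM key sealing: the key is released only when the
    presented boot-chain measurements hash to the sealed-in value."""

    def __init__(self):
        self._sealed = {}  # measurement digest -> sealed key

    def seal(self, key: bytes, boot_chain: list[bytes]) -> None:
        digest = hashlib.sha256(b"".join(boot_chain)).digest()
        self._sealed[digest] = key

    def unseal(self, boot_chain: list[bytes]) -> bytes:
        digest = hashlib.sha256(b"".join(boot_chain)).digest()
        if digest not in self._sealed:
            # Measurements changed (different OS, modified bootloader, ...):
            # refuse, forcing fallback to the recovery key.
            raise PermissionError("boot chain not trusted; recovery key required")
        return self._sealed[digest]

tpm = ToyTPM()
disk_key = secrets.token_bytes(32)
trusted_chain = [b"firmware-v1", b"bootmgr-signed", b"winload-signed"]
tpm.seal(disk_key, trusted_chain)

assert tpm.unseal(trusted_chain) == disk_key  # normal Windows boot: key released
try:
    tpm.unseal([b"firmware-v1", b"grub"])     # booting something else: refused
except PermissionError:
    pass
```

The real mechanism measures each boot stage into TPM PCRs and seals against those; the toy just shows why changing any stage of the chain drops you to the recovery-key screen.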

The important bit here is that ~*nobody* who is using Windows cares about encryption or even knows what it is! This is all on by default, which is a good thing, but it also means that yes, of course Microsoft has to store the keys, because otherwise a regular user will mess around with their BIOS one day and accidentally lock themselves permanently out of their computer.

If you want regular FDE without giving Microsoft the key you can go ahead and do it fairly easily! But realistically if the people in these cases were using Linux or something instead the police wouldn't have needed an encryption key because they would never have encrypted their laptop in the first place.


> nobody who is using Windows cares about encryption or even knows what it is!

Right, so the solution is to silently upload their encryption keys to Microsoft's servers without telling them? If users don't understand encryption, they certainly don't understand they've just handed their keys to a third party subject to government data requests.

> otherwise a regular user will happen to mess around with their bios one day and accidentally lock themselves permanently out of their computer.

This is such transparent fear-mongering. How often does this actually happen versus how often are cloud providers breached or served with legal requests? You're solving a hypothetical edge case by creating an actual security vulnerability.

Encryption by default and cloud key escrow are separate decisions. You can have one without the other. The fact that Microsoft chose both doesn't make the second one necessary, it makes it convenient for Microsoft.

> If you want regular FDE without giving Microsoft the key you can go ahead and do it fairly easily!

Then why isn't that the default with cloud backup as opt-in? Oh right, because then Microsoft wouldn't have everyone's keys.


> Right, so the solution is to silently upload their encryption keys to Microsoft's servers without telling them? If users don't understand encryption, they certainly don't understand they've just handed their keys to a third party subject to government data requests.

What exactly are you hoping Windows does here? Anyone who knows anything about BitLocker knows Microsoft has the keys (that's where you get the key when you need it, which I have needed many times because I dual boot!). Microsoft could put a big screen on install saying 'we have your encryption keys!' Would this change literally anything? They would need to also explain what that means and what BitLocker is. And then, after all of that, the only people who are going to decide 'actually, I want to set up FDE myself' are the technical people who already knew all of this! This is just a non-issue.

> This is such transparent fear-mongering. How often does this actually happen versus how often are cloud providers breached or served with legal requests? You're solving a hypothetical edge case by creating an actual security vulnerability.

This is not fear-mongering at all! The nice thing about BitLocker is that you don't need to put in your key 99% of the time (and in fact 99% of Windows users, who are not technical, don't even know they have BitLocker). But occasionally you do need to put it in. Once or twice I've booted to the BitLocker screen and I don't even know why. Maybe my TPM got wiped somehow? Maybe my computer shut down in a really weird way? But it happens often enough that the recovery key is clearly necessary! That big CrowdStrike screwup a year ago: one of the ways to fix it required having your BitLocker key!

> Encryption by default and cloud key escrow are separate decisions. You can have one without the other. The fact that Microsoft chose both doesn't make the second one necessary, it makes it convenient for Microsoft.

Again, this is not true for a product like Windows where 99% of users are not technical. Remember, BitLocker does not require your key on startup the vast majority of the time! However, there is a chance that you will need the key at some point, or you will be locked out of your data permanently. How should Microsoft give the user the key? Should they say on install 'hey, write this down and don't lose it!'? Any solution relying on the user is obviously a recipe for disaster. But again, let me remind you that encryption by default is important because you don't want any random laptop thief to get access to your Chrome account! So yes, I think Microsoft made the best and only choice here.


BitLocker encrypts data on a disk using what it calls a Full Volume Encryption Key (FVEK).[1][2] The FVEK is encrypted with a separate key called the Volume Master Key (VMK), and the VMK-encrypted FVEK is stored in one to three (for redundancy) metadata blocks on the disk.[1][2] The VMK is in turn encrypted one or more times with keys which are derived/stored using methods identified by a VolumeKeyProtectorID.[2][3] These methods include what I think are now the defaults for modern Windows installations: 3 "Numerical password" (a 128-bit recovery key formatted with checksums) and 4 "TPM And PIN". Previously, most Windows installations (without TPMs forced to be used) would probably have been using just 8 "Passphrase" instead of 4 "TPM And PIN". Unless things have changed recently, in mode 4 "TPM And PIN" the TPM stores a partial key, the PIN supplied by the user is the other partial key, and both partial keys are combined to produce the key used to decrypt the VMK.
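The hierarchy above can be sketched as a toy in Python. To be clear, this is an illustration of the structure only, assuming nothing beyond what's described above: the "encryption" is a toy XOR keystream (real BitLocker uses AES), and the protector names are just labels. The point is that one wrapped FVEK plus several independently wrapped copies of the VMK means any single protector can unlock the volume.

```python
import hashlib
import secrets

def toy_wrap(key: bytes, data: bytes) -> bytes:
    """Toy XOR 'encryption' for illustration only; real BitLocker uses AES."""
    stream = hashlib.sha256(key).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

toy_unwrap = toy_wrap  # XOR with the same keystream is its own inverse

# One FVEK encrypts the volume; one VMK wraps the FVEK.
fvek = secrets.token_bytes(32)
vmk = secrets.token_bytes(32)
wrapped_fvek = toy_wrap(vmk, fvek)  # stored in the on-disk metadata blocks

# The VMK itself is wrapped once per key protector, so any one of them
# suffices to unlock the volume.
protectors = {
    "TPMAndPIN": secrets.token_bytes(32),          # key split between TPM and PIN
    "NumericalPassword": secrets.token_bytes(32),  # 48-digit recovery key
}
wrapped_vmks = {name: toy_wrap(k, vmk) for name, k in protectors.items()}

# Unlocking via the recovery key alone recovers the VMK, then the FVEK.
recovered_vmk = toy_unwrap(protectors["NumericalPassword"],
                           wrapped_vmks["NumericalPassword"])
assert toy_unwrap(recovered_vmk, wrapped_fvek) == fvek
```

This structure is also why escrowing only the recovery-key protector (and why deleting that one protector) is meaningful: each protector is an independent door to the same VMK.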

Seemingly, once you've installed Windows and given Microsoft your BitLocker keys in escrow, you could then use Remove-BitLockerKeyProtector to delete the copy of the VMK which is protected with mode 3 "Numerical password" (the recovery key).[4] It appears that the escrow process (possibly the same as used by BackupToAAD-BitLockerKeyProtector) might only send the numerical key, rather than the VMK itself.[5][6] I couldn't find from a quick Internet search anyone who has reverse engineered fveskybackup.dll to confirm this is the case, though. If Microsoft are sending the VMK _and_ the numerical key, then they have everything needed to decrypt a disk. If Microsoft are only sending the numerical key, and all numerical-key-protected VMKs are later securely erased from the disk, the numerical key they hold in escrow wouldn't be useful later on.
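For the curious, the "Numerical password" format itself is simple, as documented by reverse-engineered tooling (e.g. the libbde project): 8 groups of 6 digits, where each group must be a multiple of 11 whose quotient fits in 16 bits, yielding 128 bits total. A hedged Python sketch (the little-endian byte order per block is an assumption from those third-party write-ups, not something I've verified against Microsoft's implementation):

```python
import struct

def recovery_password_to_key(password: str) -> bytes:
    """Convert a 48-digit BitLocker-style recovery password (8 groups of
    6 digits) into its 128-bit key. Each group must be divisible by 11
    with a quotient that fits in 16 bits; the divisibility check is what
    lets the recovery screen reject a mistyped group immediately."""
    groups = password.replace(" ", "").split("-")
    if len(groups) != 8 or not all(len(g) == 6 and g.isdigit() for g in groups):
        raise ValueError("expected 8 groups of 6 digits")
    key = b""
    for g in groups:
        n = int(g)
        if n % 11 != 0 or n // 11 > 0xFFFF:
            raise ValueError(f"invalid group: {g}")
        key += struct.pack("<H", n // 11)  # 16-bit value per group (assumed LE)
    return key  # 16 bytes = 128 bits

# A syntactically valid, made-up password: every group is 11 * v for a 16-bit v.
demo = "-".join(f"{11 * v:06d}" for v in [1, 2, 3, 65535, 42, 7, 9, 100])
assert len(recovery_password_to_key(demo)) == 16
```

This also makes concrete what "holding the numerical key in escrow" means: it's just these 128 bits, which are only useful while a matching mode-3 protector still exists on the disk.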

Someone did, however, ask the same question I first had: what if I had, for example, a billion BitLocker recovery keys I wanted to ensure were backed up for my protection, safety and peace of mind? This curious person already knew the limit was 200 recovery keys per device. They found that re-encryption would fail once this limit had been reached, then realised Microsoft had fixed this bug by adding a mechanism to automatically delete stale recovery keys in escrow, then reverse engineered fveskybackup.dll and an undocumented Microsoft Graph API call used to delete (or "delete") escrowed BitLocker recovery keys in batches of 16.[7]

It also appears you might only be able to encrypt 10000 disks per day, or change your mind on your disk's BitLocker recovery keys 10000 times per day.[8] That might sound like a lot, particularly for an individual, but the API also perhaps applies a limit of 150 disks being encrypted every 15 minutes for an entire organisation/tenancy. It doesn't look like anyone has written up an investigation into the limits that might apply to personal Microsoft accounts, or whether limits differ if the MS-Organization-Access certificate is presented, or what happens to a Windows installation if a limit is encountered (does it skip BitLocker and continue the installation with it disabled?).

[1] https://learn.microsoft.com/en-us/purview/office-365-bitlock...

[2] https://itm4n.github.io/tpm-based-bitlocker/

[3] https://learn.microsoft.com/en-us/windows/win32/secprov/getk...

[4] https://learn.microsoft.com/en-us/powershell/module/bitlocke...

[5] https://learn.microsoft.com/en-us/graph/api/bitlockerrecover...

[6] https://learn.microsoft.com/en-us/powershell/module/bitlocke...

[7] https://patchmypc.com/blog/bitlocker-recovery-key-cleanup/

[8] https://learn.microsoft.com/en-us/graph/throttling-limits#in...


The vast, vast majority of Windows users don't know their laptops are encrypted, don't understand encryption, and don't know what bitlocker is. If their keys weren't stored in the cloud, these users could easily lose access to their data without understanding how or why. So for these users, which again is probably >99% of all windows users, storing their keys in the cloud makes sense and is a reasonable default. Not doing it would cause far more problems than it solves.

And just to be clear: the passphrase they log in to Windows with is not the key, and Microsoft is not storing their plaintext passphrase in the cloud.

The only thing I would really fault Microsoft for here is making it overly difficult to disable the cloud storage for users who do understand all the implications.


> The vast, vast majority of Windows users don't know their laptops are encrypted, don't understand encryption, and don't know what bitlocker is.

Mate, if 99% of users don't understand encryption, they also don't understand that Microsoft now has their keys. You can't simultaneously argue that users are too thick to manage keys but savvy enough to consent to uploading them.

> If their keys weren't stored in the cloud, these users could easily lose access to their data without understanding how or why.

As opposed to losing access when Microsoft gets breached, or when law enforcement requests their keys, or when Microsoft decides to lock them out? You've traded one risk for several others, except now users have zero control.

The solution to "users might lock themselves out" is better UX for local key backup, not "upload everyone's keys to our servers by default and bury the opt-out". One is a design problem, the other is a business decision masquerading as user protection.

> The only thing I would really fault Microsoft for here is making it overly difficult to disable the cloud storage for users who do understand all the implications.

That's not a bug, it's the entire point. If it were easy to disable, people who understand the implications would disable it. Can't have that, can we?


This happens everywhere. There is a reason there are memes about people defending multi-billion dollar corporations.

Sorry to interrupt the daily rage session with some neutral facts about how Windows and the law work.

> that just so happens to take a screenshot of your screen every few seconds

Recall is off by default. You have to go turn it on if you want it.


It only became off by default after those "daily rage sessions" created sufficient public pressure to turn it off.

Microsoft also happens to own LinkedIn which conveniently "forgets" all of my privacy settings every time I decide to review them (about once a year) and discover that they had been toggled back to the privacy-invasive value without my knowledge. This has happened several times over the years.


> It only became off by default after those "daily rage sessions" created sufficient public pressure to turn them off.

99% of the daily rage sessions happened before it was even released


Preventive care is better.

Daily rage is exactly what technology-affine people need to direct at Microslop, while helping their loved ones and ideally businesses transition away from vendor lock-in onto free software.

https://en.wikipedia.org/wiki/Room_641A ... Then, years later, everyone acts like Snowden had some big reveal.

There is the old passwords-for-chocolate study: https://blog.tmb.co.uk/passwords-for-chocolate

Do users care? I would posit that the bulk of them do not, because they just don't see how it applies to them, till they run into some type of problem.


Are you referring to Microsoft Recall? My understanding is that it is opt-in and only stored locally.

Stored locally.. until it's uploaded by OneDrive or Windows Backup?

1) for now

2) according to Microsoft

So, trust is not zero. It's deeply negative.


AI enshittification is irrelevant here. Why is someone pointing out that sensible secure defaults are a good thing suddenly defending the entire company?

Uploading your encryption keys to someone else's machine is not a sensible default.

It generally is, because in the vast majority of cases users will not keep a local copy and will lose their data.

Most (though not all) users are looking for encryption to protect their data from a thief who steals their laptop and who could extract their passwords, banking info, etc. Not from the government using a warrant in a criminal investigation.

If you're one of the subset of people worried about the government, you're generally not using default options.


For laptops sure, but then those are not reasons for it to be default on desktops too. Are most Windows users on laptops? I highly doubt that. So it is not a sensible default.

Most pc users are using laptops, yes. Above 60%.

Even offices usually give people laptops over desktops so that they can bring it to meetings.


Then don't enable encryption? Basically I cannot rescue the files on my own disk but the police can?

> Basically I cannot rescue the files on my own disk but the police can?

I think you're misunderstanding. You can rescue the files on your own disk when you place the key in your MS account.

There's no scenario where you can't but the police can.


If I happen to know that my key is there.

You'd have to be quite daft not to. The BitLocker lockout screen has a QR code and a link telling you to go fetch your recovery key.

> It generally is, because in the vast majority of cases users will not keep a local copy and will lose their data.

What's the equivalent of thinking users are this stupid?

I seem to recall that the banks repeatedly tell me not to share my PIN number with anyone, including (and especially) bank staff.

I'm told not to share images of my house keys on the internet, let alone handing them to the government or whathaveyou.

Yet for some unknown reason everyone should send their disk encryption keys to one of the largest companies in the world (largely outside of legal jurisdiction), because they themselves can't be trusted.

Bear in mind that with a(ny) TPM chip, you don't need to remember anything.

Come off it mate. You're having a laugh aren't you?


> What's the equivalent of thinking users are this stupid?

What's the equivalent of thinking security aficionados are clueless?

Security advice is dumb and detached from life, and puts an undue burden on people that's not like anything else in life.

Sharing passwords is a feature, or rather a workaround because this industry doesn't recognize the concept of temporary delegation of authority, even though it's the basics of everyday life and work. That's what you do when you e.g. send your kid on a grocery run with your credit card.

Asking users to keep their 2FA recovery keys or disk encryption keys safe on their own - that's beyond ridiculous. Nothing else in life works that way. Not your government ID, not your bank account, not your password, not even the nuclear launch codes. Everything people are used to is fixable; there's always a recovery path for losing access to accounts or data. It may take time and might involve paying a notary or a court case, but there is always a way. But not so with encryption keys to your shitposts and vacation pictures in the cloud.

Why would you expect people to follow security advice correctly? It's detached from reality, dumb, and as Bitcoin showed, even having millions of dollars on the line doesn't make regular people capable of being responsible with encryption keys.


Your credit card analogy is doing a lot of heavy lifting here, but it's carrying the wrong cargo. Sending your kid to the shops with your card is temporary delegation, not permanent key escrow to a third party you don't control. It's the difference between lending someone your house key for the weekend and posting a copy to the council "just in case you lose yours". And you know that you've done it: you have personally weighed the risks, and if something happens with your card/key in that window, you can hold them to account (granted, keys can be copied).

> Nothing else in life works that way. Not your government ID, not your bank account, not your password, not even the nuclear launch codes.

Brilliant examples of why you're wrong:

Government IDs have recovery because the government is the trusted authority that verified you exist in the first place. Microsoft didn't issue your birth certificate.

Nuclear launch codes are literally designed around not giving any single entity complete access, hence the two-person rule and multiple independent key holders. You've just argued for my position.

Banks can reset your PIN because they're heavily regulated entities with legal obligations and actual consequences for breaching trust. Microsoft's legal department is larger than most countries' regulators.

> even having millions of dollars on the line doesn't make regular people capable of being responsible with encryption keys.

Right, so the solution is clearly to hand those keys to a corporation that's subject to government data requests, has been breached multiple times, and whose interests fundamentally don't align with yours? The problem with Bitcoin isn't that keys are hard - it's that the UX is atrocious. The solution is better tooling, not surveillance capitalism with extra steps.

You're not arguing for usability. You're arguing that we should trust a massive corporation more than we trust ourselves, whilst simultaneously claiming users are too thick to keep a recovery key in a drawer. Pick a lane.


Let's be serious for a second and consider what's more useful based on the likelihood of these things actually happening.

You're saying it's likely that a laptop thief is also capable of stealing the recovery key from Microsoft's servers?

So it would therefore be better that users lost all their data if:

- an update bungles the TPM trust

- their laptop dies and they extract the hard drive

- they try to install another OS alongside but fuck up the TPM trust along the way

- they have to replace a mainboard

- they want to upgrade their PC?

I know for a fact which has happened to me more often.


You've listed five scenarios where local recovery would help and concluded that cloud escrow is therefore necessary. The thing is every single one of those scenarios is solved by a local backup of your recovery key, not by uploading it to Microsoft's servers.

The question isn't "cloud escrow vs nothing". It's "cloud escrow vs local backup". One protects you from hardware failure. The other protects you from hardware failure whilst also making you vulnerable to data breaches, government requests, and corporate policy changes you have zero control over.

You've solved a technical problem by creating a political one. Great.


> Sending your kid to the shops with your card is temporary delegation, not permanent key escrow to a third party you don't control. It's the difference between lending someone your house key for the weekend and posting a copy to the council "just in case you lose yours".

Okay, then take sharing your PINs with your spouse. Or for that matter, account passwords or phone unlock patterns. It's a perfectly normal thing that many people (including myself) do, because it enables ad-hoc delegation. "Honey, can you copy those photos to my laptop and send them to godparents?", asks my wife as she hands me her phone and runs to help our daughter with something - implicitly trusting me with access to her phone, thumbdrive, Windows account, e-mail account, and WhatsApp/Messenger accounts.

These kinds of ad-hoc requests happen for us regularly, in both directions, without giving it much of a thought[0]. It's common between couples, variants of it are also common within families (e.g. grandparents delegating most computer stuff to their adult kids on an ad-hoc basis), and variants of it also happen regularly in workplaces[1], despite the whole corporate and legal bureaucracy trying its best to prevent it[2].

> Government IDs have recovery because the government is the trusted authority that verified you exist in the first place. Microsoft didn't issue your birth certificate.

But Microsoft issued your copy of Windows and Bitlocker and is the one responsible for your data getting encrypted. It's obvious for people to seek recourse with them. This is how it works in every industry other than tech, which is why I'm a supporter of governments actually regulating in requirements for tech companies to offer proper customer support, and stop with the "screw up managing 2FA recovery keys, lose your account forever" bullshit.

> Banks can reset your PIN because they're heavily regulated entities with legal obligations and actual consequences for breaching trust.

As it should be. As it works everywhere, except tech, and especially except in the minds of security aficionados.

> Nuclear launch codes are literally designed around not giving any single entity complete access, hence the two-person rule and multiple independent key holders.

Point being, if enough right people want the nukes to be launched, the nukes will be launched. This is about the highest degree of responsibility on the planet, and relevant systems do not have the property of "lose the encryption key we told you 5 years ago to write down, and it's mathematically proven that no one can ever access the system anymore". It would be stupid to demand that.

That's the difference between infosec industry and real life: in real life, there is always a way to recover. Infosec is trying to normalize data and access being fundamentally unrecoverable after even a slightest fuckup, which is a degree of risk individuals and society have not internalized yet, and are not equipped to handle.

> Right, so the solution is clearly to hand those keys to a corporation that's subject to government data requests, has been breached multiple times, and whose interests fundamentally don't align with yours?

Yes. For normal people, Microsoft is not a threat actor here. Nor is the government. Microsoft is offering a feature that keeps your data safe from thieves and stalkers (and arguably even organized crime), but that doesn't require you to suddenly treat your laptop with more care than you treat your government ID. They can do this, because for users of this feature, Microsoft is a trusted party.

Ultimately, that's what security aficionados and cryptocurrency people don't get: the world runs on trust. Trust is a feature.

--

[0] - Though less and less of that because everyone and their dog now wants to require 2FA for everything. Instead of getting the hint that passwords are not meant to identify a specific individual, they're doubling down and tying every other operation to a mobile phone, so delegating desktop operations often requires handing over your phone as well, defeating the whole point. This is precisely what I mean by the industry not recognizing or supporting the concept of delegation of authority.

[1] - The infamous practice of writing passwords on post-it notes isn't just because of onerous password requirements, it's also a way to facilitate temporary delegation of authority. "Can you do X for me? Password is on a post-it in the top drawer."

[2] - GDPR or not, I still heard from doctors I know personally that sharing passwords to access patient data is common, and so is bringing some of it back home on a thumb drive, to do some work after hours. On the one hand, this creates some privacy risks for patient (and legal risk for hospitals) - but on the other hand, these doctors don't do it because they hate GDPR or their patients. They do it because it's the only way they can actually do their jobs effectively. If rules were actually enforced to prevent it, people would die. This is what I mean when I say that security advice is often dumb and out of touch with reality, and ignored for very good reasons.


Your entire argument rests on conflating "trust" with "blind dependency on a third party subject to legal compulsion".

> Okay, then take sharing your PINs with your spouse.

Sharing with your spouse is consensual, temporary, and revocable. You know you've done it, you trust that specific person, and you can change it later. Uploading your keys to Microsoft is none of these things.

> But Microsoft issued your copy of Windows and Bitlocker and is the one responsible for your data getting encrypted.

Microsoft sold you software. They didn't verify your identity, they're not a regulated financial institution, and they have no duty of care beyond their terms of service. The fact that they encrypted your drive doesn't make them a trustworthy custodian of the keys any more than your locksmith is entitled to copies of your house keys.

> For normal people, Microsoft is not a threat actor here. Nor is the government.

"Normal people" includes journalists, lawyers, activists, abuse survivors, and anyone else Microsoft might be legally compelled to surveil. Your threat model is "thieves and stalkers". Mine includes the state. Both are valid, but only one of us is forcing our model on everyone by default.

> the world runs on trust. Trust is a feature.

Trust in the wrong entity is a vulnerability. You're arguing we should trust a corporation with a legal department larger than most countries' regulators, one that's repeatedly been breached and is subject to government data requests in every jurisdiction it operates.

Your doctors-breaking-GDPR example is particularly telling: you've observed that bad UX causes people to route around security, and concluded that security is the problem rather than the UX. The solution to "delegation is hard" isn't "give up and trust corporations". It's "build better delegation mechanisms". One is an engineering problem. The other is surrender dressed as pragmatism.


So what happens if your motherboard gets fried and you don’t have backups of your recovery key or your data? TPMs do fail on occasion. A bank PIN you can call and reset, they can already verify your identity through other means.

> So what happens if your motherboard gets fried and you don't have backups of your recovery key or your data?

If you don't have backups of your data, you've already lost regardless of where your recovery key lives. That's not an encryption problem, that's a "you didn't do backups" problem, which, I'll agree is a common issue. I wonder if the largest software company on the planet (with an operating system in practically every home) can help with making that better. Seems like Apple can, weird.

> TPMs do fail on occasion.

So do Microsoft's servers. Except Microsoft's servers are a target worth attacking, whereas your TPM isn't. When was the last time you heard about a targeted nation-state attack on someone's motherboard TPM versus a data breach at a cloud provider?

> A bank PIN you can call and reset, they can already verify your identity through other means.

Banks can do that because they're regulated financial institutions with actual legal obligations and consequences for getting it wrong. They also verified your identity when you opened the account, using government ID and proof of address.

Microsoft is not your bank, not your government, and has no such obligations. When they hand your keys to law enforcement, which they're legally compelled to do, you don't get a phone call asking if that's alright.

The solution to TPM failure is a local backup of your recovery key, stored securely. Not uploading it to someone else's computer and hoping for the best.


> I wonder if the largest software company on the planet (with an operating system in practically every home) can help with making that better. Seems like Apple can, weird.

If you're talking about Time Machine, Windows has had backup options built in since NT.


If this is the case; then it leans even more into my point.



This is ridiculous.

There are a lot of people here criticising MSFT for implementing a perfectly reasonable encryption scheme.

This isn’t some secret backdoor, but a huge security improvement for end-users. This mechanism is what allows FDE to be on by default, just like (unencrypted) iCloud backups do for Apple users.

Calling bs on people trying to paint this as something it’s not is not “whiteknighting”.


Yes, because object level facts matter, and it's intellectually dishonest to ignore the facts and go straight into analyzing which side is the most righteous, like:

>Microsoft is an evil corporation, so we must take all bad stories about them at face value. You're not some corpo bootlicker, now, are you? Now, in unrelated news, I heard Pfizer, another evil corporation with a dodgy history[1] is insisting their vaccines are safe...

[1] https://en.wikipedia.org/wiki/Pfizer#Legal_issues


Microsoft doesn't take the screenshot; their operating system does if Recall is enabled, and although the screenshots themselves are stored in an insecure format and location, Microsoft doesn't get them by default.

Is that last part even still true? When I played around with it, they asked me to store a recovery passphrase off-device in case Windows Hello breaks.


