In Qubes you use VMs to separate your banking environment from the one where you pull npm dependencies and the one where you open untrusted PDFs.
Networking also happens in its own VM, and you can have multiple VMs dedicated to networking.
Running the MirageOS firewall gives a much lower memory footprint, and an attack surface orders of magnitude smaller (compared to a VM running a full Linux distribution purely for networking).
Did you get a genuine key? I never had one fail on me.
The immobilizer is the single best piece of technology for preventing car theft. Create a backdoor for bypassing it and you'll end up like Hyundai/Kia, which sold cars without immobilizers in recent years and have turned into a joke in the minds of potential customers.
It does not require a battery in most cases and is separate from the KeeLoq system that controls your car's doors.
The Stellantis systems I’ve worked on have a nice feature: there is a battery for proximity use, so you can keep the key in your pocket and press the button to start the car; as long as the key is within range of the four or five proximity sensors, you are fine.
When that battery dies, you can press the key fob directly against the start button, and it uses a receiver-powered RFID transponder at close range to start and run the vehicle.
Most people don’t know this, so when that battery dies they panic and suffer.
This technique of pressing the dead key to the starter button works for quite a lot of brands, not just Stellantis vehicles. Always worth trying if you are in a "keyless" car with a dead key fob battery.
In my experience virtually everything made in the last 15 years will either support this RFID backup or have a spare physical key hidden inside the keyless fob.
Lots of them will even let you press the dead key against some part of the exterior to unlock the doors too.
> The messages are mine, not theirs, and yet they refuse to allow me to handle them how I deem fit.
"They refuse to allow me" meaning "they don't add the features I want for free to the app they provide for free, so I complain".
The messages are yours, of course. But don't forget that you use their work for free. If you're not happy, go use the free work of someone else, I guess?
They are somewhat correct though, Signal has written code explicitly to prevent iOS users from including Signal data in Apple’s encrypted local and/or cloud backups.
Allowing encrypted backups would have cost Signal nothing, but they spent time and money to prevent it for iOS users.
They even wrote code specifically to exclude their data from those backups.
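For context, the standard mechanism for opting data out of iOS backups is a URL resource flag, and an exclusion like the one described would look roughly like this. This is a hedged sketch, not Signal's actual code; the function name `excludeFromBackup` is illustrative:

```swift
import Foundation

// iOS lets an app exclude any file or directory from both iCloud and
// local (Finder/iTunes) backups by setting a URL resource value.
// Applying this to the app's data container effectively opts all of
// its data out of the user's backups.
func excludeFromBackup(_ url: URL) throws {
    var url = url  // setResourceValues(_:) is mutating on URL
    var values = URLResourceValues()
    values.isExcludedFromBackup = true
    try url.setResourceValues(&values)
}
```

Those few lines are the whole trick: once the flag is set on the container, the system backup simply skips it.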
Lots of people have requested justification in related GitHub issues, but Signal has not given a clear answer. If there were a security problem with the encryption process, I believe a CVE or similar would have been in order, because it would affect millions of users.
We are unfortunately rehashing the same arguments from GitHub. Nothing prevents Signal from distrusting Apple by default.
But nothing (except some internal reason they refuse to elaborate on) prevents them from letting users actively choose to trust Apple either.
It's the user's data, after all; the user should be able to control and access it. Sensible defaults make sense, but the outright refusal to explain why they prevent it is very odd. I have decent "IT hygiene": I keep my operating system updated with patches, I don't download pirated/cracked software, I have hardware-backed encryption on my storage devices, I have a good password for my local account, and I encrypt my local iPhone backups.
Why should I not be allowed to include my Signal chats in those local backups? Signal has never answered that question, which is very strange.
Same as I said above: you are asking for a new feature. Their default is those 20 lines that "protect" the files. If they want to offer you a way to still enable it, someone has to do it. Someone has to work on the UX of it, maybe there is a need to explain to the users why it is less secure when this feature is enabled, and then there is work to do with the criticisms that will come next time someone shoots themselves in the foot because of this feature (because "Signal shouldn't have allowed that in the first place").
I know, you will say "it's not much". But everybody asks for their "small feature", and projects generally can't do everything that everybody asks them to do (and usually for free).
I find it totally valid if they choose that they won't offer features to lower their security, and instead they will work on features having sufficiently good security. Which in this case is the secure backup.
I think we have vastly different definitions of what is a "new" feature. This is not about adding a new feature, but removing an old bug.
> If they want to offer you a way to still enable it, someone has to do it.
They could just use the iOS system settings to let users enable/disable backups. Zero code needed. Zero maintainability problems. Zero UX work. Zero unexpected data loss for customers. The setting for this, for all sane apps, is at Manage Storage > Backups > [Device Name] > [App Name].
> I know, you will say "it's not much". But everybody asks for their "small feature"
It's less than anything, it's removing a "feature", which should make things easier to maintain.
Signal _added_ the "feature" of disabling the default iOS behaviour that user data can be backed up securely. This caused, in many users' lives, a bug of unexpected data loss. Signal caused that bug and that data loss by introducing this "feature".
Again, fixing this bug would not require a new feature to be added, but rather an unwanted bug to be removed by removing code needed to maintain it.
> I find it totally valid if they choose that they won't offer features to lower their security, and instead they will work on features having sufficiently good security. Which in this case is the secure backup.
Not a single argument has been given why this would be more secure than the locally encrypted backup you can do yourself in iOS. In fact, it would be reasonable to suggest that any newly introduced system claimed to be secure should be treated as insecure until tested.
I understand that you are frustrated. And I understand that if you were to write Signal, you would do it differently.
Still, those 20 lines don't look like a bug to me. And Signal does not benefit from pissing you off. I was just trying to say that maybe, just maybe, there is a valid reason behind this.
The bug is not in the detailed implementation of the code logic per se; the bug is that it causes unexpected data loss, because iOS users expect all their data to be included when they back up their device.
As an example, a piece of code sending authentication credentials in plain text across the internet might in isolation be considered free of bugs. But it should never do that to begin with, it should have been designed/architected quite a bit differently.
You are free to carry water for Signal while they repeatedly refuse to even explain why they consider this a valid approach to handling users' data.
"I consider it a bug because I really want this feature" does not change the fact that it is a feature.
> As an example, a piece of code sending authentication credentials in plain text across the internet might in isolation be considered free of bugs.
This is not a good example. It's almost certainly a security issue. Unless you have a threat model where you absolutely don't give a shit about it, but we're not in 2010 anymore. Let me try to make another one:
As an example, a messaging app sending encrypted but not end-to-end encrypted messages through a server may be considered free of bugs. Adding end-to-end encryption to it would be a new feature, and it may well be out of scope for that particular app (ever heard of Telegram?).
Today I learned that some people consider unexpected data loss a feature, and that removing such a "feature" is in fact the same as adding a new feature.
It's newspeak, now in the software world. A first for everything, I suppose.
> I was saying that maybe, Signal did not want to push their users to trust the Apple backup by default.
The gap in understanding here is that Signal already trusts iOS by providing an app. It trusts it even more by providing notifications (with sender and content) that go through Apple’s systems. It integrates with CallKit to work with the Phone app. Putting iCloud alone in a separate bucket doesn’t make sense. They could’ve done this same backup with a 64-character recovery key and stored the data in iCloud. Signal made an intentional choice not to allow backups on iOS.
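The recovery-key design described above is not exotic: derive an encryption key from a user-held recovery string and encrypt the backup blob with it before handing it to any cloud storage. A hypothetical sketch using Apple's CryptoKit; none of these names come from Signal's actual implementation, and a real design would also need a salt and a hardening KDF:

```swift
import CryptoKit
import Foundation

// Hypothetical sketch: turn a user-held recovery string into a
// symmetric key via HKDF, then encrypt the backup with AES-GCM.
// Whoever stores the ciphertext (iCloud or anyone else) learns
// nothing without the recovery key.
func backupKey(fromRecoveryKey recovery: String) -> SymmetricKey {
    let ikm = SymmetricKey(data: Data(recovery.utf8))
    return HKDF<SHA256>.deriveKey(
        inputKeyMaterial: ikm,
        info: Data("backup-encryption".utf8),
        outputByteCount: 32)
}

func encryptBackup(_ plaintext: Data, with key: SymmetricKey) throws -> Data {
    // .combined packs nonce + ciphertext + tag into one blob
    let box = try AES.GCM.seal(plaintext, using: key)
    return box.combined!
}
```

The point is only that end-to-end encrypted backups on top of iCloud are a straightforward construction, not a contradiction of Signal's threat model.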
One can only hope that the point about supporting other backup endpoints/storage gets implemented sooner rather than having to wait several more years.
> They could’ve done this same backup with a 64 character recovery key and
Again: they could have, but it would have taken time and resources. The complaint here is not that Signal doesn't want to allow backups: they are just announcing a secure backup feature.
The complaint is that Signal did not do it earlier, and instead decided to prevent what they considered an insufficient solution.
> Putting iCloud alone in a separate bucket doesn’t make sense.
Of course it makes sense. What you say is akin to saying "end-to-end encryption makes no sense, because if you have to trust iOS anyway, you may as well trust the server".
Because I trust Android and run Signal there does not mean that I want it to auto-upload my messages to Google Drive. I don't see what makes it so hard to understand.
> One can only hope that the point about supporting other backup endpoints/storage gets implemented sooner rather than having to wait several more years.
Yes, I hope that too. On top of hoping, one could donate, to slightly contribute to paying the developers that work on it.
Their first cut at "working on it" is to require that we pay Signal to store our backups for us (45 days of media and 100MiB total is not a useful free tier; I have more than 1 GiB of messages/media spanning years), when that's an entirely unnecessary restriction.
I don't know what you do for a living but it's very common when writing and releasing software to do it in phases. Earlier phases have a restricted feature set and feedback from the field/customers/users experiencing earlier phases informs choices in later phases.
Unless you have direct insight into their dev process, your claim that the restriction is "entirely unnecessary" seems overly strong.
Remember when he published the actual algorithm for Twitter, and everyone could look at the code that was reportedly boosting the woke and canceling MAGA, and it turned out not to have any kind of censorship regime or boosting, except that it boosted his account? That was a great time.
Have you reread the Twitter Files since they were debunked and defanged?
It's amazing how bad information, fluffed up by the angry news media, gets thoroughly rooted in people's minds, then gets quietly defanged and run into the ground to the point that it's a completely innocuous gossip chain.
See, currently, the Biden impeachment and the Russian propaganda laundering.
People really want to believe salacious stories that cut against popular perception, to the point that they just keep riding that wave of mental backflips to suit their increasingly deranged phantasms.
And also it wasn't "the algorithm" at all but just some boilerplate python you use to feed stuff into the actual model, literally "let's put an ML model here" 101 kinda stuff. It was such a dumb stunt.
Eton group, and a prep school in an even more anachronistic vein. Our Greek textbooks had doodles and inkblots from bored kids who had died a century before I was born.
I think 90% of the blame lies with the British public school system and 10% with a very low-quality pen; or maybe pen manufacturing was worse a few decades ago.
Caning ended at least a decade before I was at school in England. When we were 7 or so, the teacher showed us how to write nicely with a fountain pen. Boys flicking ink happened very rarely, and I don't remember any intentional damage.
(Foreigners: "public school" in Britain means those extremely expensive schools that look like Hogwarts but have teachers more deranged than Snape and children more malicious than Malfoy. I'm amused by the anecdote about the textbook, as that's a plot point in Harry Potter.)
That last part, if one reasons it through, would indicate that the restrictions are utterly moronic. At the very least they need to be relaxed so that technology readily available to end consumers worldwide isn't restricted.
ITAR blocks export, not import. It has nothing to do with TSMC building fabs here. $50B in incentives might have something to do with that.
And like sister comment says, ITAR is not about industrial or economic policy. It’s about maintaining a qualitative edge in weaponry. How is there a qualitative edge when you can buy the restricted components freely from China?
Just noticed the other comment on GP saying they’re restricted through a different list (EAR). It serves the same purpose so I’m leaving my response as is.
Unikernels don't work for him; there are many of us who are very thankful for them.