Lots of interesting details in this article, including:
- Apple unwittingly tried to hire David Wang, the creator of the exploit
- Wang instead went on in 2017 to co-found Corellium, a company specializing in providing "virtual" iPhones for security testing.
- Apple sued Corellium in 2019 for copyright violation. The discovery process turned up Wang and his work on the San Bernardino exploit.
This is how the article describes the exploit:
> Azimuth specialized in finding significant vulnerabilities. Dowd, a former IBM X-Force researcher whom one peer called “the Mozart of exploit design,” had found one in open-source code from Mozilla that Apple used to permit accessories to be plugged into an iPhone’s lightning port, according to the person...
> Using the flaw Dowd found, Wang, based in Portland, created an exploit that enabled initial access to the phone — a foot in the door. Then he hitched it to another exploit that permitted greater maneuverability, according to the people. And then he linked that to a final exploit that another Azimuth researcher had already created for iPhones, giving him full control over the phone’s core processor — the brains of the device. From there, he wrote software that rapidly tried all combinations of the passcode, bypassing other features, such as the one that erased data after 10 incorrect tries.
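That last step is conceptually simple once the erase-after-10-tries and escalating-delay protections are out of the way: a 4-digit passcode is a keyspace of only 10,000 values. Here's a minimal sketch in C of what exhausting it looks like, with a hypothetical `try_passcode` stub simulating the on-device check (the real exploit's interface is not public):

```c
#include <stdio.h>
#include <string.h>
#include <stdbool.h>

/* Hypothetical stand-in for the on-device passcode probe; the real
 * exploit's interface is not public. Here it just simulates a device
 * whose passcode is "1234". */
static bool try_passcode(const char *code) {
    return strcmp(code, "1234") == 0;
}

int main(void) {
    char code[5];
    /* A 4-digit passcode has only 10,000 combinations. With the
     * 10-try erase and escalating delays disabled, the whole space
     * can be walked in order. */
    for (int i = 0; i < 10000; i++) {
        snprintf(code, sizeof code, "%04d", i);
        if (try_passcode(code)) {
            printf("passcode found: %s\n", code);
            return 0;
        }
    }
    printf("passcode not found\n");
    return 1;
}
```

With no delay between attempts, even a naive loop like this finishes in seconds; the hard part was never the search, it was disabling the protections around it.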
Also, since 2015 there's been speculation that the FBI's real motive for taking action against Apple was to set a legal precedent to make it easier to compel tech companies to unlock devices/accounts in the future. This appears to be confirmed.
Things I'm for: targeted, investigative police work on a specific crime, where it's highly likely the warrant issued will turn up specific evidence, and where the crime is violent in nature.
Things I'm against: Warrantless surveillance by the FBI, CIA, NSA, Google, Facebook, your cell carrier, and friends
Apple is suing Corellium, a maker of iPhone virtualization and pentesting software, because their tool can be used to develop exploits that bypass Apple’s security?
It's incredibly worrying that Apple is using the legal system to ban virtualization and pentesting software.
I am surprised and saddened that Apple is going down that route.
There is no downside for Apple. The legal costs are insignificant for them. They do not bear any extra punitive cost even if the lawsuit is meritless.
Meanwhile, the PR damage or liability from security exploits running wild is massive. Corporate actions have no ethics or morals - it is simple bean counting for Apple. Nobody who takes part in this can lose their job, no matter the outcome.
There are huge downsides for the community, though. What's next, someone suing the maintainers of nmap?
If Apple wins the case it would be terrible for the infosec community: every business that got pwned by a 5-year-old WordPress exploit would be able to sue Rapid7, which would quite likely make Metasploit too much of a liability to maintain.
If jailbreaking an iPhone is legal, which also strips away Apple's "protections" via an "exploit" in Apple's hardware/software, then why wouldn't this be legal as well? It seems a case could be made.
Apple insists jailbreaking is illegal, and in their initial complaint against Corellium they seriously tried to argue that its use by pwn20wnd to help him develop unc0ver (a jailbreak) was an "unlawful end" :/.
> Apple is suing Corellium, a maker of iPhone virtualization and pentesting software, because their tool can be used to develop exploits that bypass Apple’s security?
Does Apple have legal standing here or are they just using their warchest ($) to intimidate others into not following Corellium's example?
themolecularman is specifically asking for their grounds. I think the "copyright violation" claim is just a way to start the suit, not that Corellium actually violated Apple's copyright in any way.
1. iOS is copyrighted and only available under license
2. iOS is only licensed for use on bare-metal Apple hardware
3. Therefore, virtualisation is copyright infringement
4. Corellium makes iPhone virtualisation software
Of course, many hacker types would argue if you've got a license to something there's nothing morally wrong with format-shifting it to your heart's content, regardless of what the license or the letter of the law may say. Apple is probably relying on the courts taking a less enlightened view.
I would think that research, including security research, is fair use, with virtualization being just a research device/tool in this case. It's akin to making an otherwise illegal copy in a different, lab-specific format: say you have a copyrighted picture and you make a blow-up or X-ray photo of it for research.
Very similar to how running macOS on a Hackintosh is a copyright violation. You are only granted a license to run the OS on Apple hardware. Running their OS in a virtualized environment not on Apple hardware is the same thing.
Especially so because legally forcing someone to stop using some software/exploit won't stop governments or hackers from exploiting the vulnerability.
They are attacking this from every angle. Lawyers will lawyer, the techies will techie. If they just change the tech to fix it, it becomes a cat-and-mouse game (it already is). If they kick their butts in court and force a $100m payout, it will teach these guys a good lesson and force the next ones to rethink it.
Note that the promotion of this FBI-vs-Apple narrative benefits Apple.
The FBI didn't need to unlock the phone, most likely. All iPhones in their default configuration back up the ~entire contents of the phone to Apple each night, with Apple keys. Apple can decrypt this without the phone, the user, or the passcode at any time, invisible to the user.
Apple preserves this vulnerability for the FBI, at the FBI's request.
Apple turns over this data without a warrant frequently (over 30,000 users in 2019) according to Apple's own transparency reports. They also turn it over in response to warrants, as they have plainly stated that they did in this case.
You don't need access via the front door if you have it via the back door.
As someone who works closely in the field of Apple security issues, I strongly agree with your first statement: the narrative on this topic is often weirdly pro-Apple, as this device effectively already had a backdoor that anyone could have used to get into it with nothing more than Apple's firmware signing key (an issue which was only addressed in later versions of this device). Apple claims the key wouldn't have been useful to the FBI without their expertise to help, but I can attest that that is BS, as forensics people in the jailbreak community had maintained alternative ssh-only firmwares providing exactly the required PIN-code brute-forcing tooling for years.

But the rest of your comment (the chosen defense of your thesis) is sadly much weaker than it could be, because the FBI actually covered why iCloud Backup wasn't helpful in their court filings: if nothing else (there is a bunch of explanation of why a "forced" backup wasn't possible that is maybe "extraneous"), the device hadn't been backed up for months before the incident (with automatic backups turned off), and there is data they wanted (if it had been there) which is not part of a backup (specifically various caches).
I was speaking in general more than about this specific device. That's the mechanism of this narrative: even the FBI couldn't get into the San Bernardino shooter's phone, so you're safe and secure if you buy an Apple device. This is false, due to iCloud Backup being on by default (and its insecurity being specifically and intentionally preserved by Apple, at the behest of the FBI, willingly and without legal compulsion).
Whether or not they got into the specific phone (or backup data) possessed by the San Bernardino shooter is irrelevant: that prosecution has almost nothing to do with the contents of their telephone(s).
The narrative that is being pushed in the media, as a result of all of this, is that Apple devices are safe and secure from government snooping, even with a warrant. We know this to be false, and more people should know this to be false.
It's even false without a warrant, as all iCloud Backup data is subject to FISA/PRISM at any time, and this comprises the vast majority of the data stored on all iPhones in the world, no probable cause needed.
The whole thing is an explicit manipulation of the media, undertaken by Apple and the FBI in concert, to protect Apple's brand image. It appears to be working.
"Mozilla spokeswoman Ellen Canale said the company has no knowledge of any bug that was connected to the exploit."
Firefox OS phones didn't have Lightning ports, so it's not clear what code from Mozilla Apple would be using for that. It seems possible there was a mix-up with some other MPL-licensed software, or with the "NS" and "NSS" naming that both Mozilla and Apple use.
"Netscape Security Services" (NSS) is one of the Mozilla libraries that's (somewhat) widely used by other software, and would be a candidate, but I've never heard about Apple using it.
I don't think this is true. John McAfee did not have this capability (he's a notorious liar), and it's unlikely that anyone else did either. No one is willing to say so in public, anyway.
I guess that goes to how valuable that information will be in 12, 18, 24, or 60 months. Even more embarrassing for someone like the FBI, beating their chests about the valuable information they absolutely have to have access to, only to find out there was nothing of value on the device. The FBI definitely has egg on its face over the San Bernardino example.
> Even more embarrassing for someone like the FBI, beating their chests about the valuable information they absolutely have to have access to, only to find out there was nothing of value on the device.
That claim by the FBI was obviously made in bad faith to begin with for the San Bernardino shooting. Of course the shooter didn't store some kind of incriminating communications or plans on his work phone; there was never any rational explanation for why he would have done that. The shooters destroyed their personal phones entirely but didn't even bother to reset the work phone. Add to this the fact that a law enforcement screw-up is what originally locked investigators out of a recent iCloud backup to begin with, and that they already had access to everything else in iCloud before they reset the password. It was painfully obvious that they were pushing hard behind "But TERRORISM!!!" to either set a precedent or get a signed build of iOS that bypassed any protection against brute-force attacks.
Or the simple hammer/rubber hose backdoor. I think encryption is only helpful over short windows. Meaning, delete anything you don't want found because it will get cracked at rest.
I read it for free this morning in a news aggregator before it was here on HN.
Also, if this is the result of investigation by WaPo reporters, it would be better to post a link to WaPo than to some other news agency reporting on a WaPo report.
It is the original reporting for this story. Paywalls are explicitly OK (https://news.ycombinator.com/newsfaq.html), and in any case uBlock Origin's default configuration seems to be enough for WaPo's paywall.