The “Top-Secret iPod”

A custom iPod was built without a paper trail, “under Steve Jobs’ nose,” made possible because Apple did not yet sign iOS at boot. That throws the debate over code signing into pretty sharp relief. Of course personnel could compromise source code from the inside, and there’s the danger of Apple simply authorizing bad actors, even if only under duress of coercion, but I find it pretty hard to have any patience for arguments against Apple requiring signed code in light of this harrowingly pernicious object lesson for anyone with eyes to see.


Yes, OMG, it’s horrible… someone could re-purpose a general purpose CPU in new and exciting ways… the freaking horror!


The framing of this as innovation by third parties, rather than protection of users from hostile parties such as state actors and surveillance capitalism/authoritarianism, is as wrong-headed as Leo’s recent cheerleading for TikTok, which I was relieved to hear Amy Webb disabuse him of on TWiT #784.

Great read! It’s killing me not knowing what they were building. I’m skeptical of David’s Geiger counter theory.

I really don’t think code signing would have made any difference to these guys considering they were working with the org that does the signing!

I wouldn’t have thought so either, except the author says exactly that toward the tail end of the piece, and specifies at the beginning that they were always kept outside Apple’s firewall and denied source code. Maybe they’re mistaken, but it does highlight the radical security rift that the absence of signing represents. The thing is, we need that caliber of authentication to live in open standards, accessible to users, not just under proprietary control!
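For what it’s worth, the primitive at issue is ordinary public-key signature verification, and it already lives in open standards and open-source libraries. Here’s a minimal sketch of the boot-time gate being discussed, in Python with the open-source `cryptography` package; all the names here are illustrative, not Apple’s actual boot chain:

```python
# Minimal sketch of a verify-before-execute boot gate.
# Illustrative names only -- this is NOT Apple's actual scheme.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Whoever controls the platform generates a keypair once; the public
# half is baked into the boot ROM/firmware as the trust anchor.
signing_key = Ed25519PrivateKey.generate()
trusted_pubkey = signing_key.public_key()

# Build time: sign the OS image.
os_image = b"...kernel and ramdisk bytes..."  # placeholder payload
signature = signing_key.sign(os_image)

# Boot time: refuse to run anything that fails verification.
def boot(image: bytes, sig: bytes) -> None:
    try:
        trusted_pubkey.verify(sig, image)
    except InvalidSignature:
        raise SystemExit("refusing to boot: image not signed by trusted key")
    print("signature OK, handing off to image")

boot(os_image, signature)
```

Nothing about the verification step itself is proprietary; what’s proprietary in the iOS case is who holds the private key.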

Yes, I really think that all CPUs should only run signed code… that way we could get rid of the scourge to earth that is Linux. /sarcasm

@PHolder, I hope what I said above, that we need this caliber of authentication, convinces you that Linux itself would be stronger for independently implementing the same degree of it. Librem does a decent job, from what I gather: should they be disallowed from enforcing the same, or its functional equivalent, on their platform?

I don’t think code signing is anything but a monopolist’s dream. Look at the places where it is most successfully used to control and constrain users: console gaming, Apple iOS (and watch and TV, all variations on a theme), Caterpillar tractors, Volkswagen emissions controls (allegedly), video card firmware and SoC video chip firmware.

On that last one: the SoC video drivers for the Mali-T860MP4 in my RockPro64 are a closed binary blob, so I am unable to get a fully working Linux loaded on it, and the blob keeps a number of distros far away.

Each of us must maintain a “monopoly” on our authorizations and the identities through which those flow: the point is democratization of the protocols, not keeping them out of proprietary hands. The examples of greatest success will remain proprietary until and unless the protocols are made available to the free and open.

This would violate v3 of the GPL: in essence, anyone should be able to see all parts of the “product,” and anyone should be able to build it and issue a replacement version.

The license EXPLICITLY calls for being able to replace code with your own: The GNU General Public License v3.0 - GNU Project - Free Software Foundation

I’ll quote the relevant part of the page:

Yes, @PHolder: what I advocate is that users’ privacy and freedoms “from” be on equal footing with our freedoms “to,” hardware being, pragmatically, the premier locus of those freedoms. Librem, for example, being open hardware, must extend its open-source code-signing verification facilities to all vendors, but those vendors must sign.

Open source is about people, not vendors. A person would need to be able to sign a specially crafted version of their own unique build of an OS and load it into their signing-enforced hardware. Since any individual could need to request such an event, any request for signing, from anyone, would need to be honoured, including from the makers of malware or worse… thus fully voiding any advantage to having anything signed. Checkmate.

Am I to understand you are against signing even for hardware? How, then, to secure it? In a scenario of local fabrication it would indeed be the user who crafts both hardware and software, as you’ve described, eliminating the conflict between their own and any vendor’s authorization. My point being: ultimately, the power you fear, that of a signature being denied, is the power the user conserves unto themselves against others unwilling to identify and vouch for themselves similarly. As you say, it is about people: it must be trust that induces the user to install, yet what they are installing must have been correctly identified for that trust not to be misplaced, and as I see it, signing facilitates the verification of a good’s source.
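To make that concrete, here’s a sketch of the model I mean, continuing the illustrative Python from upthread (not any shipping product’s scheme): the device’s trust anchor is whatever public key its owner enrolls, so the owner signs their own builds locally and never has to petition a vendor:

```python
# Sketch: owner-enrolled trust anchor. The device checks signatures,
# but the OWNER decides which public keys it trusts. Illustrative only.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

class Device:
    def __init__(self) -> None:
        self.enrolled: list[Ed25519PublicKey] = []  # empty from the factory

    def enroll(self, pubkey: Ed25519PublicKey) -> None:
        # In real hardware this would require physical presence
        # (e.g. a jumper or button), so a remote attacker can't do it.
        self.enrolled.append(pubkey)

    def boot(self, image: bytes, sig: bytes) -> bool:
        for key in self.enrolled:
            try:
                key.verify(sig, image)
                return True  # signed by a key the owner chose to trust
            except InvalidSignature:
                continue
        return False  # unsigned, or signed by a key the owner never enrolled

# The owner mints their own key and signs their own build; no vendor involved.
owner_key = Ed25519PrivateKey.generate()
device = Device()
device.enroll(owner_key.public_key())

my_build = b"...owner's homebrew OS image..."  # placeholder payload
assert device.boot(my_build, owner_key.sign(my_build))
assert not device.boot(my_build, Ed25519PrivateKey.generate().sign(my_build))
```

Note that under this model your checkmate dissolves: no central authority has to honour signing requests from anyone, because the signing never leaves the owner’s machine, and malware gains nothing unless the owner knowingly enrolls its key.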

There has yet to be a hardware security/DRM scheme that has not been broken (at least minimally). The consoles do a pretty good job, but in the end even that DRM falls. Even a large number of Apple devices have recently had their bootloaders broken.

Not all of these are totally current, but they show the trend:



So near as I can tell, it’s a fool’s errand to try to lock down an OS… it leads you to a false sense of security while making it impossible for the little guy to do fun experimenting (aka homebrew).

I don’t see code signing as digital rights management in the way you cite, because signing and authentication need not be employed for corporate proprietism (FWIW, vendors like Apple and Librem must be pressured to grant operability to free software, but again, that free software protects users if it participates in the signing process). I don’t blame you for seeing this through the lens of corporate control, and I agree with you against most of what you oppose, but I think the more important lenses, both ultimately and pragmatically, are those of security and user protection. My rubric is to view signing as a subset of the liberties encryption must afford users, undergirded by authentication of identity and verification of goods, be it in the context of hardware or software, proprietary or otherwise. Casting it in more trenchant terms, to conclude it’s all foolhardy handcuffs, IMO spells disaster for the majority of users, while destabilizing and weakening the ecosystem and its community by curtailing liberties such safeguards would better have preserved and even facilitated.

Locking it down can only have one effect: allowing less choice. As soon as it gets locked down, someone wants to be in control inside the walled garden and will try to decide who gets in and who doesn’t. As an example: there is a program that can make Windows 7 act like a legal version even though it might not be. Microsoft’s anti-malware will flag and delete this program without warning, because they view it as a threat to their monopoly.

I would encourage you to be very wary of what you wish for… frequently it doesn’t look like what you thought you were dreaming of. For example some were dreaming of a “secure and friendly” phone… and they got an Apple walled garden instead.

What I’m trying to get across to you is that there will always be a walled garden if there is to be security. The issue is putting the user in control of it, not championing the abandonment of security as ostensible user freedom. You seem to be assuming corporate control of the means of producing hardware. Be it a for-profit corporation like Apple or Librem, or a fab-lab, the principle that the user should maintain command of their machine is facilitated by signing, so long as the user and not the corporation determines what to sign. In the medium term, the project of empowering users in this way is far from accomplished, so on balance Apple is, in my view, better for its users than abandoning them to exposure to actors who bear them ill will or could bring grievous and irreparable harms upon them. Apple users would do well to shepherd the progress of open-source and free infrastructure, as you note.

Repeat this with me until you understand. It’s a “law” like “Murphy’s law”…

Signing anything implies a division between “good” (signed) and “not known to be good” (unsigned). The average user is simply unskilled at deciding what belongs on which side of that division… if this were not true, then there would be no ransomware.
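That division can be stated precisely; here’s a sketch in the same illustrative Python as upthread:

```python
# Sketch of the division described above: verification sorts artifacts into
# "known good" and "not known to be good" -- it can never prove "bad".
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

def classify(artifact: bytes, sig: bytes | None,
             trusted: list[Ed25519PublicKey]) -> str:
    """Return which side of the signed/unsigned divide an artifact lands on."""
    if sig is not None:
        for key in trusted:
            try:
                key.verify(sig, artifact)
                return "known good"  # signed by a trusted key
            except InvalidSignature:
                continue  # try the next trusted key
    return "not known to be good"  # unsigned, or signer not trusted
```

Everything in that second bucket lands on the user’s desk as a judgment call, which is exactly where ransomware wins.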

Chrysler started digitally signing their automobile engine control units back in the late 2000s or early 2010s. I’d bought a 2011 Challenger R/T and was not amused to learn that there was no way for me to modify the fuel mapping because they’d made this decision. I even went so far as to consider legal options out of principle (sadly, my wallet is smaller than my principles). I sold the car and haven’t bought another Chrysler since.

My point is, there are scenarios where applying this kind of security can only result in harm to consumers for the benefit of a corporation. My ECU is not a connected device, it is not vulnerable to passive exploitation, and it’s improbable that any bad actor would target my car specifically for any reason. That’s a level of improbability I’m comfortable with. I’d go so far as to say that in the majority of instances where this security tactic would be used on consumer products, it would provide zero benefit to consumers. Maybe someone who feels they are a high-value target, perhaps a political figure, warrants such protection and should have the option, but not me or you.

I agree that the standards should be developed so they can be applied where appropriate, but before that discussion can even begin, we need requisite protections for consumer rights absolutely nailed down.

You’re stuck on its current incarnation. In a context of consumer empowerment, which is in any case required, whether through government mandating that signing not be denied to certain parties or through trust, signing provides authentication and, with it, a measure of security you yourself note consumers need. My ultimate point is that signing, in principle, is the bare minimum users deserve in order to know that the machine or device they’d run their software on is at least as secure as that software. As a price paid to a corporation for a product, yes, it’s a bad thing, but going without, under those conditions, is worse, even though we must move away from a price/product/corporate paradigm to a free and open(-source) one, and in the meantime defend user prerogative without throwing the security baby out with the libertine bathwater.

I fully agree with @newman that, sociologically, the cart is put before the horse in granting corporations signing power until and unless consumer protections are firmly in place, but the catch-22 is that until they are, the lack of signing leaves everyone even more wide open to the surveillance-capital state and worse, as illustrated by this story of the secret iPod. That is why, as a pragmatic matter and in select circumstances for select parties, I find it worthwhile to proffer the benefit of signing. That doesn’t make me answerable to abuses of it, and the argumentation around this should be careful to preserve the power of signing for private, secure use by private individuals. Obsession with the rife abuse by corporations undermines user empowerment if allowed to dominate the discourse completely.