Your computer isn’t secure. Those of you reading this from your fortified Plan 9 Tor Box can stop reading here, but for the rest of you, it’s simply true. Your computer is riddled with security vulnerabilities, and so is your phone. If an attacker wants access to your machine, or if you download even one piece of software that either is or is carrying malware (see: any download from cnet.com or its ilk), you’re in an enormous amount of trouble.

I wrote an article a few weeks ago discussing this issue, and why it makes keeping large quantities of Bitcoin so dangerous. Today, we're going to spend some time digging into one possible solution to this problem, and what still needs to be done to make it truly viable. The proposed solution is straightforward: instead of trusting a big, unwieldy, complicated operating system with networking capabilities, which can be compromised remotely or via an app install, what if we made dedicated hardware devices, with no networking, which can sign Bitcoin transactions locally, without ever disclosing the key to the computer they're connected to? No app installs and a minimal code base with a carefully structured interface mean you can make security on the device near-perfect, which means that no breach of your computer can possibly disclose your keys to your attackers, a wonderful security guarantee that you'll get hardly anywhere else.
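To make the boundary concrete, here's a minimal sketch of the signing model described above: the private key lives only inside the device object, the host hands in a raw transaction, and only a signature comes back out. This is illustrative, not any real wallet's protocol; HMAC-SHA256 stands in for the secp256k1 ECDSA signing a real Bitcoin device would perform, and the class and seed are hypothetical.

```python
import hashlib
import hmac

class HardwareWallet:
    """Toy model of a hardware wallet. The private key never leaves
    this object. HMAC-SHA256 stands in for real ECDSA signing."""

    def __init__(self, seed: bytes):
        # Key material is derived and stored only inside the "device".
        self._private_key = hashlib.sha256(seed).digest()

    def sign(self, tx_bytes: bytes) -> bytes:
        # The host sends the unsigned transaction in; only the
        # signature comes back out. The key itself is never exposed.
        return hmac.new(self._private_key, tx_bytes, hashlib.sha256).digest()

# The host computer only ever sees the transaction and the signature.
device = HardwareWallet(seed=b"example seed phrase")
tx = b"pay 0.1 BTC to 1ExampleAddress"
signature = device.sign(tx)
print(signature.hex())
```

The point of the sketch is the interface, not the crypto: even if the host is fully compromised, the worst an attacker can do through this API is ask for signatures on transactions the user can see and reject on the device's screen.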

So what's the problem? Well, this is where we get into the nitty-gritty practicality of the thing. Right now, one of the most popular hardware wallets is Trezor, a $119 device available from www.bitcointrezor.com. The source code is open, meaning that you can go through it to make sure you trust it (or read analyses from much cleverer people who have done exactly that). So, again, what's the problem? Well, unfortunately, the firmware running on the Trezor isn't the only piece of software that matters. The bootloader (the small piece of code that loads and sets up the rest of the software to run) is, as users have discovered, closed-source. The bootloader, by necessity, has access to all of the information in the software, including the private key, and, because it's closed source, could be doing pretty much anything with it, which is a huge security oversight.
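For readers unfamiliar with why a bootloader matters so much, here's a rough sketch of its role: it verifies and launches the firmware, and until it hands over control it has unrestricted access to device memory, key storage included. The scheme below is purely illustrative; real devices verify firmware with asymmetric (public-key) signatures, not the symmetric HMAC used here, and the names are hypothetical.

```python
import hashlib
import hmac

# Hypothetical vendor key. Real bootloaders embed only a public key
# and verify an asymmetric signature made with the vendor's private key.
VENDOR_KEY = b"vendor signing key"

def sign_firmware(image: bytes) -> bytes:
    # Performed by the vendor at release time.
    return hmac.new(VENDOR_KEY, image, hashlib.sha256).digest()

def bootloader_load(image: bytes, signature: bytes) -> bool:
    # Before handing control (and all device memory, including key
    # storage) to the firmware, the bootloader checks its signature.
    expected = hmac.new(VENDOR_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

firmware = b"trusted firmware build"
sig = sign_firmware(firmware)
print(bootloader_load(firmware, sig))              # True
print(bootloader_load(b"tampered firmware", sig))  # False
```

Because the bootloader runs first and gates everything else, a malicious or buggy bootloader can undermine every guarantee the open-source firmware provides, which is exactly why its being closed-source is the sticking point.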

When asked to rectify this, a Trezor employee responded,

“There is no security reason why bootloader should stay closed, but we were quite hesitant to open it because that’s the last piece of mosaic that our competition is missing from making a perfect TREZOR clone.”

This is an understandable motive; any hardware company lives in fear of cheap imitators. However, it's also wrong to ask users to entrust their money to code that hasn't been through exhaustive review by the open-source community. The entire purpose of hardware wallets is defeated if you can't inspect all of the software running on them. The future of hardware wallets is either companies willing to take the risk of copycats and open-source everything, relying on their reputation to provide value to users, or entirely open-source projects, in which hardware companies develop only the hardware, pulling all relevant code and drivers from well-vetted open-source repositories.

These are not styles of business that the tech industry is accustomed to, but they are the only ones that are trustworthy and valuable in the long run.