Webpack & Preact-CLI Vulnerability

Some users susceptible to undetectable man-in-the-middle attacks over HTTPS on public WiFi

🕵️ How it was discovered

After live streaming Addy Osmani’s great Google I/O 2017 talk on advances in Progressive Web App tooling and technologies, I was very excited to take Preact-CLI for a spin.

Addy Osmani introduces Preact-CLI at Google I/O 2017

Part of this tool involves serving the app over HTTP/2. Although the standard doesn’t strictly require encryption, representatives from all major browser vendors have stated the intention to only implement HTTP/2 for TLS connections. Effectively, this means you’ll only see it work over https:// URLs in practice.

Because TLS is required, and because (unlike with service workers) localhost gets no special treatment for development purposes, a certificate and its corresponding private key must be used to encrypt HTTP responses.
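For context, generating a certificate like this is a one-liner with openssl. This is a minimal sketch, and the filenames are arbitrary; the resulting certificate is self-signed, valid only for localhost, and is not a CA:

```shell
# Generate a throwaway self-signed cert + private key for local development.
# -nodes leaves the key unencrypted so a dev server can read it without a passphrase.
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout dev-key.pem -out dev-cert.pem \
  -days 30 -subj "/CN=localhost"
```

The browser will still warn about a self-signed cert, but crucially the private key exists only on the developer's own machine.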

Once I started Preact-CLI and saw a bunch of security warnings, I started to take a deeper look at what was going on.

☠️ What’s the problem?

Preact-CLI, perhaps in an effort to save developers from having to generate their own certificate, included a certificate (and corresponding private key) in the project itself. Upon notifying Jason Miller about this problem, he brought to my attention that this same pattern had been followed in webpack-dev-server.

This would be a serious security issue in and of itself. However, upon closer inspection of the certificates in question, they turned out to be certificate authority (CA) certificates (and corresponding private keys). I’m not sure why the choice was made to generate this type of cert, but the end result is that, instead of the vulnerability applying to a particular domain, it applies across all domains.

🤷‍ Who is affected?

Users who trust the certificates built into preact-cli and webpack-dev-server

⚠️ What could someone do with this, and how easily?

As a result of this vulnerability, an attacker could very easily and reliably eavesdrop on, and tamper with, HTTPS traffic across ALL DOMAINS, undetected. Essentially, HTTPS is completely compromised, and no data is secret or safe anymore.

To do this, they would need to perform the following sequence of actions:

1. Use the private key for one of these certificate authorities to sign forged certificates for one or more domains (e.g., *.google.com, *.paypal.com, etc.). For those who trust the CA certificates in question, any certificate that the CA signs will also be trusted.

A certificate signed by a trusted certificate authority is also trusted

2. Audit WiFi traffic to determine public WiFi network names that targets have joined before and will automatically connect to again. Pick a network name (e.g., Airport Free WiFi)

WiFi devices broadcast networks they’ve joined before. We can collect these in order to stage our attack

3. Create a WiFi network with this network name, and bridge it with a network that has an internet connection. There’s a high likelihood that the target’s devices will automatically join this network without any action being required on their part (this is the default behavior of most WiFi-enabled devices)

WiFi clients will join networks they’ve joined before, with no interaction required on the part of the target

4. Stage a man-in-the-middle attack, using the generated certificates to provide (apparently) valid HTTPS responses. Software such as SSLstrip or SSLsplit makes this trivial.

No action on the part of the target is required. Once we reach step 4, all HTTPS traffic, across all domains, can be observed and/or tampered with.
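To make step 1 concrete, here is roughly what minting a trusted certificate for an arbitrary domain looks like with openssl. This assumes the leaked CA material has been saved as ca-cert.pem and ca-key.pem; all filenames and the domain are hypothetical:

```shell
# Create a fresh key and a certificate signing request (CSR) for the spoofed domain
openssl req -newkey rsa:2048 -nodes -keyout forged-key.pem \
  -out forged.csr -subj "/CN=*.example.com"

# Sign the CSR with the leaked CA private key. The resulting certificate
# chains to that CA, so any client trusting the CA will trust this cert.
openssl x509 -req -in forged.csr \
  -CA ca-cert.pem -CAkey ca-key.pem -CAcreateserial \
  -out forged-cert.pem -days 30
```

Nothing here requires unusual tooling or expertise, which is exactly what makes a leaked CA key so dangerous.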

🛡 What defenses do affected users have against this?

The defense against this kind of issue would be HPKP (also known as public key pinning), but for reasons I’ll go into in a moment, it wouldn’t help in this specific case. The idea behind HPKP is that an HTTP response header can inform browsers as to what the expected certificate should look like:

Public-Key-Pins: pin-sha256="<pin-value>";
                 max-age=<expire-time>;
                 includeSubDomains;
                 report-uri="<uri>"

The field that’s most interesting here is pin-value, which can be one or more public key fingerprints. Modern browsers can use this information to validate whether a certificate is the right one, going far beyond the typical stringency (any certificate for the right domain, signed by any trusted authority).
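The pin value itself is the base64-encoded SHA-256 hash of the certificate's DER-encoded public key. Assuming a certificate saved as cert.pem (a hypothetical filename), it can be computed with a short openssl pipeline:

```shell
# pin-sha256 = base64( SHA-256( DER-encoded Subject Public Key Info ) )
openssl x509 -in cert.pem -pubkey -noout \
  | openssl pkey -pubin -outform der \
  | openssl dgst -sha256 -binary \
  | openssl enc -base64
```

Because the pin covers the public key rather than the whole certificate, the same pin remains valid if the cert is re-issued with the same key pair.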

The important idea here is that the browser remembers these fingerprints for the prescribed amount of time, so it’s not something that can be simply defeated by rewriting headers as part of the MITM attack. Of course, browsers have to get this information in the first place (similar to HSTS headers), so users would only be protected for return visits to HPKP-enabled domains they’ve already seen before.

HPKP is not widely used. A survey conducted in August 2016 found that, of the top 1 million sites (by Alexa rank), only 375 implemented HPKP headers, with another 76 using it in “report only” mode.

In this case, however, because the certificate authorities in question are user-installed trust anchors rather than built-in ones, HPKP validation won’t save us.

From MDN’s article on HTTP Public Key Pinning (my emphasis):

Firefox and Chrome disable pin validation for pinned hosts whose validated certificate chain terminates at a user-defined trust anchor (rather than a built-in trust anchor). This means that for users who imported custom root certificates all pinning violations are ignored.

⚒ If I’m affected, how can I fix it?

I’ve made a DETECTION PAGE you can visit to find out whether you trust any of the bad certs in question.

Detection utility

If it turns out that you’re affected, you’ll be presented with instructions for how to “un-trust” a certificate. You should visit the detection page again after restarting, to ensure you’re no longer affected.

👍 What’s a better way to handle certs in tools?

Both projects in question have solved this problem in a similar way: certificates are generated on a per-machine, as-needed basis. This ensures that while individual developers may all share the same tool, each has their own certificate, with a private key under their own control, where nobody else can get at it.

webpack-dev-server took the approach of handling cert generation themselves, while Preact-CLI builds on an approach under discussion in the ember-cli project, where a standalone library called devcert handles certificate management.

The advantage of an approach like this is that each project that has a need for TLS doesn’t need to handle cert generation on its own, and managing certificates (i.e., regenerating them when they expire) can be done outside the context of any one tool.
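The generate-on-demand pattern both projects settled on can be sketched in a few lines of shell. The directory and filenames here are made up for illustration (a real tool like devcert uses a per-user configuration directory):

```shell
# Generate a per-machine dev certificate only if one doesn't already exist.
# The private key is created locally and never ships with the tool.
CERT_DIR="./.local-dev-certs"
if [ ! -s "${CERT_DIR}/dev-cert.pem" ]; then
  mkdir -p "${CERT_DIR}"
  openssl req -x509 -newkey rsa:2048 -nodes \
    -keyout "${CERT_DIR}/dev-key.pem" -out "${CERT_DIR}/dev-cert.pem" \
    -days 90 -subj "/CN=localhost"
fi
```

Expiry handling falls out naturally: when the cert ages out, delete it and the next run regenerates a fresh one.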

⚖️ What are the takeaways we should remember?

Never distribute certificate private keys in software projects, not even for development or internal purposes.

It’s sometimes difficult to truly understand how a private key could be used by a malicious party, but the potential for damage is usually far more severe than is initially apparent.

Be extremely skeptical when asked to trust certificates, and flat-out paranoid about trusting a certificate authority. Essentially, anyone with a private key to a CA you trust can render HTTPS completely useless for you, across all domains. The nature of this attack makes it almost impossible to detect until it’s too late.

😱 Holy shit, this is scary.

If you’re surprised at how something seemingly benign can be used to do so much damage, you are not alone. Security is one of those things that’s not as directly connected to business value, compared to features, performance, usability, bug fixes, speed to market, etc.

As a group, web developers have far less fluency in security than engineers in other areas (ops, DBAs, infrastructure engineers, etc.). Plenty of really smart people make security mistakes on a regular basis, simply due to a lack of knowledge regarding risks, tactics, and best practices.

If you’re interested in learning more about how to keep your users (and your business) safe, I’m partnering with Frontend Masters for an 8-hour hands-on workshop on web security topics. We’ll learn how to stage attacks that exploit vulnerabilities like the one reported here, and what we can do to defend against them.

I’m also available to meet with your team for personalized training on these same topics, and more!

📅 Disclosure Timeline

The webpack team and Jason Miller have shown how much they really care about their users, in their swift and direct response to this issue. This kind of problem no doubt exists in other projects, and I’m glad to have worked with this group to get some high-impact fixes in place as an example to follow.