According to this post on the official V8 Javascript blog, the pseudo-random number generator (PRNG) that V8 Javascript uses in Math.random() is horribly flawed and getting replaced with something a lot better. V8 is Google’s fast Javascript engine that they developed for Chrome, and it’s used in Node.js and basically everywhere. The fact that nobody has noticed something like this for the last six years is a little bit worrisome, but it’s been caught and fixed and it’s all going to be better soon.

In this article, I’ll take you on a trip through the math of randomness, through to pseudo-randomness, and then loop back around and cover the history of the bad PRNG and its replacements. If you’ve been waiting for an excuse to get into PRNGs, you can use this bizarre fail and its fix as your excuse.

But first, some words of wisdom:

Any one who considers arithmetical methods of producing random digits is, of course, in a state of sin. For, as has been pointed out several times, there is no such thing as a random number — there are only methods to produce random numbers, and a strict arithmetic procedure of course is not such a method.

John von Neumann

John von Neumann was a very smart man — that goes without saying. But in two sentences, he conveys something tremendously deep and tremendously important about random variables and their mathematical definition. Indeed, when you really understand these two sentences, you’ll understand more about randomness than most everyone you’ll meet.

Random Variables

The first thing you learn in an advanced probability course is the strange, but profoundly important, way that mathematicians think about randomness. And once you’ve got the concept, you’ll cringe a little bit every time you hear someone say the phrase “random number”. It’s not pedantic either. It’s fundamental.

Numbers aren’t random. Period. We all know what numbers are. We use them to count things, and we’ve extended them to uncountability and even irrationality, but the one thing that a number isn’t, is random. Seven is seven, and it’s the same seven that it was for Aristotle and will be for the rest of time. It’s this non-randomness that makes numbers useful for counting things, after all.

To get a concept of randomness into mathematics requires a function. And functions spit out numbers, but the numbers themselves aren’t random — they’re “outcomes” of a random process. The randomness enters the function on the input side. A mathematician says that a “random variable” is a function whose value depends on some state of the world as it evolves over time. If the relevant state of the world at time t is S_t, then a random variable is x_t = f(S_t).

If the state of the world evolves unpredictably from now (time t) until tomorrow (time t+1), then the value of x tomorrow will be unpredictable. If you can say something about the probabilities of which states the world will be in, then you can also assign probabilities to future values of x.

If I’m rolling a die, for instance, I can be pretty sure about the respective probabilities of the relevant states of the world — which side is up — but I can’t say whether I’m going to roll a three or a four tomorrow until tomorrow comes. And then once it’s rolled, the outcome is a simple number, and four is four forevermore.

And this is where von Neumann comes in. “…there is no such thing as a random number — there are only methods to produce random numbers…” And if the method is known, if the means of getting from S_t to S_{t+1} is mathematical and thus perfectly predictable at time t, there’s no way to make tomorrow’s outcome unpredictable. QED, slam-dunk.

PRNGs?

If you can write down how S_t evolves over time, then your function isn’t random, thus all of the computer-implemented PRNGs aren’t random either. (That’s the “pseudo-“.) Then what are they? What should they do, if they can’t produce randomness? Here’s a list of three criteria, where each one builds on the previous ones.

The minimum thing you’d like a PRNG to do is output the whole set of possible values. If you’ve got an 8-bit PRNG, you want it to be able to take on every value from 0 to 255. Building on this, you want each possible outcome to pop out with equal frequency (for a uniform PRNG). And finally, you want the outcome of the next draw (x_{t+1}) to be unpredictable given that you’ve seen a whole bunch of previous draws.

A PRNG that covers all possible values is said to have “full period”, and one where all outcomes occur with equal frequency is “equi-distributed”. These criteria are fairly straightforward to work out in math or to test — just take a lot of values and see if there are too many of one or none of another. If a PRNG doesn’t cover all the values, it can’t really be evenly distributed. If it’s not evenly distributed, you’ll have a limited kind of predictive ability; if five comes up too often, just predict five.

Predictability can be even more subtle, though, and there are a bunch of interesting statistical tests. Of course, “predictability” is a bit of a misnomer. We know how the state updates, so the word “predictability” only really makes sense if we pretend that we don’t already know S_{t+1}, and only focus on the history of the x values. We’re in von Neumann’s “state of sin” after all.

And Javascript

Which brings us to V8 Javascript’s Math.random() . For the last six years, the algorithm that’s been used has been pretty horrible. So think of our three criteria presented above and have a look at the following plot of its output. (Or look up at the banner again.) Which desired criteria fail?

If you said “full-period” you were right. There are holes where random numbers just don’t occur, although a conclusive test requires more than a plot. And if you said “equi-distribution” you were also right. Have a look at those dark bands. Those are numbers that occur more frequently than they should.

Finally, if you said “unpredictability” you were mostly right. The “good” news is that the bands are basically horizontal stripes, which means that even though some y-axis values are over-represented, they don’t seem to depend on the x-axis value. But because the y-axis values are unconditionally poorly distributed, you can predict in the regions with the dark bands, and your prediction will be better than chance.

So of the three possible criteria for judging a PRNG, this one scores a zero, based on simply looking at a plot of some values. The code that generated the images is here. (Test your browser’s PRNG.)

To quantify the above observations, the official V8 Javascript post that I linked above notes that the coverage is only 2^32 values out of the possible 2^52 uniformly-distributed values that a 64-bit float can represent. This is a huge failure of the “full-period” criterion.

Additionally, there are short cycles that depend on the particular choice of the starting state. That is, for unlucky state choices, the period before the PRNG repeats is even shorter. Now 2^32 possible numbers seems like a lot, until you realize that you’re short of the available values by a factor of 2^20. That is, you’re missing 99.999905% of the space. And the Birthday problem makes this shortfall a big deal.

Better tests of predictability in PRNGs look at many higher dimensions, and test whether the outcomes are dependent on each other at various lags and with varying amounts of previous output used to predict the next value. TestU01 is now the state of the art in PRNG testing, and is easily downloadable so you can put your favorite PRNGs to the test if you’d like. Other perennial favorites include the original Diehard battery of tests (get it?) and the improved Dieharder battery (will the puns stop?!). But this PRNG is so bad-looking on its face that there’s no need to beat the dead horse.

What Happened?

There’s a great writeup of the whole debacle in the blog of [Mike Malone], the CTO of Betable. His company used Math.random() in Node.js on their servers to assign per-session tokens to users. They thought they were fine, because the chances of a collision were vanishingly small if the PRNG was doing its job. They estimated that they’d have a one-in-six-billion chance of a collision in the next 300 years. (They’re wrong — they can’t get more values out of the PRNG than the full state cycle, which is 2^64. But they’re basically right in that we shouldn’t see a collision in our lifetimes.)

In fact, they had a collision in March on a system they rolled out in February. Oops! This led [Mike] to do a very in-depth analysis of the flawed PRNG, which is worth a read. He also points out the prophetic comment from [Dean McNamee] on the code change that introduced the flawed PRNG:

I would have gone with Mersenne Twister since it is what everyone else uses (python, ruby, etc).

Indeed.

The story of the algorithm that got chosen, “MWC1616”, is even stranger. It seems that [George Marsaglia], the author of the original Diehard tests above and the inventor of the multiply-with-carry method behind MWC1616, posted up one version of this routine on Jan. 12, 1999 and then an improved version on Jan. 20. People who read the thread to the end got the good one, and that includes Numerical Recipes and many others. Somehow, the poor folks implementing the PRNG for V8 Javascript just got the wrong one.

What’s Next?

But the bug was found and patched. In the end, V8 Javascript is going with the XorShift128+ generator, which seems to be state-of-the-art and passes all of the statistical tests in all of the test suites we mentioned above. In addition, it’s extremely fast, requiring only a few bit-shift and XOR operations. It’s new, which is always a problem, but it tests extremely well.

XorShift128+ was also merged into Mozilla and Safari as well as Chrome. You should be getting better random numbers soon.

If you want to play around with these PRNGs, here’s the code for MWC1616 (no!):

uint32_t state0 = 1;
uint32_t state1 = 2;

uint32_t mwc1616() {
    /* two 16-bit multiply-with-carry generators; the carry is the top half */
    state0 = 18030 * (state0 & 0xffff) + (state0 >> 16);
    state1 = 30903 * (state1 & 0xffff) + (state1 >> 16);
    /* note the parentheses: "+" binds tighter than "<<" in C */
    return (state0 << 16) + (state1 & 0xffff);
}

and here’s the same for XorShift128+ (yay!):

uint64_t state0 = 1;
uint64_t state1 = 2;

uint64_t xorshift128plus() {
    uint64_t s1 = state0;
    uint64_t s0 = state1;
    state0 = s0;
    s1 ^= s1 << 23;
    s1 ^= s1 >> 17;
    s1 ^= s0;
    s1 ^= s0 >> 26;
    state1 = s1;
    return state0 + state1; /* the "+" in XorShift128+: the sum of the state words */
}

And if all this pseudo-RNG stuff has got you craving some real, honest-to-goodness, no-knowledge-of-the-state-update-function randomness, you’ve got a couple of good options: use radioactive decay or combine radio noise and quantum tunneling. Finally, if you’d like your randomness certified, check out the US National Institute of Standards and Technology’s Randomness Beacon.