Statistics Canada reported that employment rose by 51,000 in February.

These numbers seem to gyrate tremendously from month to month in a way that has little to do with economic fundamentals: jumping by 40,000 in December, falling by 22,000 in January, and now rising significantly.

How much confidence should we have in them?

Well, we should certainly trust Statistics Canada to produce unbiased statistics. No doubt about that. The methods used are impeccable, and if you don’t trust StatCan numbers then you certainly have no business trusting statistics produced by any polling firm in the country, or for that matter anywhere else.

But trusting the numbers to be unbiased is not the same thing as having complete confidence in them. Indeed, the fact that its methods are impeccable means that StatCan knows exactly how much confidence we should have in the month-to-month changes.

Pollsters trying to gauge voting intentions use samples of only one or two thousand Canadians, and their results are routinely reported to acknowledge the uncertainty in extrapolating to the entire population: “accurate to within so many percentage points 19 times out of 20” is the commonly used phrase.

Statistics Canada surveys are no different: information on about 56,000 households is used to represent 28.5 million working-age Canadians. Economic statistics carry the same inherent uncertainty associated with making this extrapolation.

The Statistics Canada jobs report tells us that between January and February employment rose by 51,000, but also—if you look hard enough for the details—that this is accurate to within plus or minus 57,400.

If you want 95% certainty of not being mistaken, you have to accept the possibility that last month’s job change ranged anywhere from -6,400 to 108,400. If you are willing to be a little less confident, entertaining a 1 in 3 chance of being wrong rather than 1 in 20, then you would accept the possibility that the change ranged from 22,300 to 79,700. And if you want to be very precise and claim that the change was exactly 51,000, then you have to accept being wrong pretty well with certainty.
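The arithmetic behind these ranges can be sketched in a few lines. This is a minimal illustration, not StatCan methodology: it assumes the estimate is approximately normally distributed and backs the standard error out of the published 95% margin of error; the narrower range in the article appears to correspond to roughly a two-thirds confidence level (about one standard error), so the figures below land close to, but not exactly on, the rounded numbers quoted. All variable names are my own.

```python
from statistics import NormalDist  # standard normal, Python 3.8+

estimate = 51_000   # reported February employment change
moe_95 = 57_400     # published margin of error, "19 times out of 20"

# Implied standard error: the 95% margin is about 1.96 standard errors.
z95 = NormalDist().inv_cdf(0.975)   # ~1.96
se = moe_95 / z95                   # ~29,300 jobs

def interval(confidence):
    """Symmetric confidence interval for the job change."""
    z = NormalDist().inv_cdf(0.5 + confidence / 2)
    return estimate - z * se, estimate + z * se

lo, hi = interval(0.95)     # about -6,400 to 108,400
lo2, hi2 = interval(2 / 3)  # a roughly 1-in-3 chance of being wrong
```

The wide 95% interval recovers the article’s -6,400 to 108,400 exactly; at two-thirds confidence the interval shrinks to roughly 22,700 to 79,300, in the neighbourhood of the 22,300 to 79,700 quoted above.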

Nate Silver, author of the best-selling book The Signal and the Noise, tells one particularly poignant story of statisticians who gave the impression that their forecasts had no margin of error.

In the spring of 1997 the Red River flooded the town of Grand Forks, North Dakota, leading to the evacuation of almost all of its inhabitants, damage to the majority of homes, and billions of dollars in cleanup costs.

Yet the US National Weather Service did a pretty good job of predicting that the river would crest at historic highs, its best guess being 49 feet. The residents of Grand Forks were aware of this. The problem was that the statisticians did not communicate that their margin of error was 9 feet.

The Red River eventually crested at 54 feet, not far off the prediction and well within the margin of error. But the town’s levees had been designed to withstand a maximum crest of 51 feet, and no extra precautions were taken to handle the coming flood because the residents and town officials did not entertain the possibility that the forecast could have been off. Mr. Silver reports that the forecasters later told investigators “that they were afraid the public might lose confidence in the forecast if they had conveyed any uncertainty in the outlook.”

While lives are not exactly at stake when it comes to the Statistics Canada numbers, one can’t help but wonder if StatCan doesn’t hold this same point of view. The very first line of its press release screams “Employment rose by 51,000”, encouraging readers to treat the numbers with complete confidence. The information on the margin of error is buried deep in the website. The more accurate, albeit less attention-grabbing, lead is that “in February employment rose by 51,000. This figure is accurate to within plus or minus 57,400, 19 times out of 20.”

The US Weather Service now communicates the uncertainty associated with its forecasts openly and honestly to the public.

Would our trust in the StatCan numbers be any less if the Agency clearly conveyed that it can never promise complete certainty? Probably not: knowing that we can’t have full confidence in the numbers would probably lead us to trust them all the more.