At school I had a physics teacher who was extremely pedantic. Nothing would upset him more than if you used the wrong term to refer to something. Even small mistakes most people wouldn’t pick up on would incense him.

I didn’t really get why, and in fact we would take great pleasure in saying the wrong thing so we could watch his face turn red as he shouted “Gravity is not a force! It’s an ACCELERATION!”

But one of the distinctions he made has stuck with me. One that has turned out to be useful in many areas of life, not just physics. And that’s the distinction between accuracy and precision. While some people use these almost interchangeably, they have very different technical definitions.

Both apply to problems of measurement. In measuring things, we rarely have perfect tools, and most measurements will have some degree of error in them. Accuracy and precision are words we use to talk about different aspects of those degrees of error.

Accuracy vs Precision

Accuracy is how close our measurement is to the real value we’re trying to measure.

Precision is about the level of detail in our measurement.

It’s not immediately clear why these are different concepts, because in many cases they have the same magnitude. For example, measuring length with a ruler will let you read off an answer to the nearest 1mm (precision), which will probably be within 1mm of the real length (accuracy).

But this isn’t always the case. A watch might show the time to the nearest second (precision), but if you haven’t set it properly it could be out by a few minutes (accuracy).
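The watch example can be sketched in code. This is a minimal illustration with invented numbers: a "watch" that reads to the nearest second but was set three minutes fast, against a "sundial" that can only be read to the nearest five minutes but has no systematic offset.

```python
true_time_s = 10_000.0  # the "real" time we are trying to measure, in seconds

# A watch that displays whole seconds (high precision) but was set
# 3 minutes fast (low accuracy): every reading is off by ~180 s.
watch_bias_s = 180.0
watch_reading = round(true_time_s + watch_bias_s)

# A sundial-style reading: no systematic offset (good accuracy) but
# only readable to the nearest 5 minutes (low precision).
sundial_reading = round(true_time_s / 300) * 300

print(watch_reading)    # precise to 1 s, yet 180 s from the truth
print(sundial_reading)  # coarse, yet closer to the true value here
```

Here the coarse instrument ends up closer to the truth than the precise one, which is exactly the point: the number of digits on the display tells you nothing about how far the reading is from reality.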

It gets worse. When we start dealing with estimates and predictions in the presence of noise, these numbers can be much further off.

Estimation

For example, this article in the Telegraph claims that socialising increases our happiness by 6.38%. Yes, that figure may be exactly what the data from the surveyed sample yields. But the decimal places hardly seem relevant, as the ‘real value’ (if there even is such a thing in this case) probably can’t be pinned down to two decimal places by a single study.

What’s happening in the Telegraph article, and in other places, is that by giving results to a high degree of precision, they create the illusion of accuracy and confidence in the results.

It’s not particularly damaging here, as there probably aren’t many people basing their actions on the results of that article. But what about more significant areas?

If inflation changes from -0.1% to +0.1%, does that really mean something has changed in the economy, or is excess precision in our measurement causing us to attribute meaning to noise?

Does it really make sense to give share price estimates to three significant figures, when the estimates vary from each other in the first figure?

There’s nothing wrong with predictions or estimates, but giving precise estimates and not stating the variance or the level of confidence can be misleading.
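One way to make an estimate honest is to report a rough interval alongside it. Here is a small sketch, using hypothetical numbers invented for illustration: a two-decimal-place mean looks authoritative, but the standard error of a small sample shows how wide the plausible range really is.

```python
import statistics

# Hypothetical happiness changes (%) from a small sample.
# These values are invented purely for illustration.
sample = [2.1, 9.4, -1.3, 12.0, 5.5, 8.7, 3.2, 11.6]

mean = statistics.mean(sample)
sem = statistics.stdev(sample) / len(sample) ** 0.5  # standard error of the mean

# A rough 95% interval: mean plus or minus about two standard errors.
low, high = mean - 2 * sem, mean + 2 * sem

print(f"estimate: {mean:.2f}%")                       # looks very precise...
print(f"plausible range: {low:.1f}% to {high:.1f}%")  # ...but is wide
```

The point-estimate here prints with two decimal places, but the interval spans several whole percentage points, so those decimals carry no real information. Stating the range makes that obvious in a way the bare number never does.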

Application

If you’re making a decision based on data, don’t mistake precision for accuracy. If in doubt, ask questions:

How many measurements were used to come up with the number you’ve been presented with?

What is the range in the sample of measurements? What was the smallest value, and what was the largest value?

How close does the person presenting the data think it is to the real value?

The last question will force them to be open about the accuracy of their data, and should allow you to decide how much weight to place on it. Sometimes it can even be best to just ignore the number completely and focus on more important things for your decision!