I was alarmed recently to learn that some participants on an audio forum were claiming a subwoofer I'd measured had "god-awful" distortion. It wouldn't have bothered me if the subwoofer actually had "god-awful" distortion, but this sub is one of the best I've measured in its price and size range. I can't blame the enthusiasts on that forum, though, because anyone outside the handful of people who do subwoofer output measurements would have thought the same thing.

The problem arose when I shared my CEA-2010 subwoofer output measurements with the manufacturer, and the manufacturer, with my permission, shared them with another website. (I happily share my CEA-2010 measurements because I want to promote the use of output measurements in subwoofer reviews.) The confusion arose because the other website published my distortion numbers, rather than just the maximum output in decibels. And if you look at subwoofer distortion numbers, they can indeed look "god-awful."

Just some quick background: CEA-2010 measures the output of a subwoofer at a series of test frequencies. The volume is raised until the distortion exceeds a certain threshold; then the volume is backed down slightly, and the level is recorded in decibels. Here's an example of CEA-2010 measurements in a review, and here's an in-depth explanation of the process.
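For readers who like to see the logic spelled out, here's a toy sketch of that level-stepping procedure in Python. To be clear, this is my own illustration, not the actual test software: measure_distortion is a made-up stand-in for a real acoustic measurement, and the 30 percent threshold is just the rough average figure discussed below (the real standard applies separate limits per harmonic).

```python
# Toy sketch of the CEA-2010 level-stepping idea (not the real test software).
# measure_distortion() is a made-up stand-in for an actual acoustic measurement.

def measure_distortion(level_db):
    # Simulated sub: low distortion until about 100 dB, then it rises steeply.
    return 2.0 + max(0.0, level_db - 100.0) * 8.0  # percent THD

def cea2010_max_output(threshold_pct=30.0, start_db=90.0, step_db=1.0):
    """Raise the test-tone level until the next step would exceed the
    distortion threshold, then report the last passing level in dB."""
    level = start_db
    while measure_distortion(level + step_db) <= threshold_pct:
        level += step_db
    return level

print(cea2010_max_output())  # the simulated sub passes at 103.0 dB
```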

That dB number is usually all you need to know, in my opinion. But the software program that most CEA-2010 practitioners use also reports the total harmonic distortion percentage (including the second through fifth harmonics), so I jot that down for my notes. I occasionally cite certain distortion numbers in my reviews when I feel they illustrate something important, but I generally leave them out because I worry that a casual observer might glance at them and assume the sub is "god-awful." That's exactly what happened in this case.
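For the curious, that THD percentage is conceptually simple: it's the combined amplitude of the second through fifth harmonics, expressed relative to the fundamental. Here's a minimal numpy sketch of my own (not the actual measurement software), assuming the capture contains a whole number of cycles so each harmonic lands exactly on an FFT bin:

```python
import numpy as np

def thd_percent(signal, fs, f0, harmonics=range(2, 6)):
    """THD from the 2nd through 5th harmonics, as a percentage of the
    fundamental. Assumes a whole number of cycles in the capture."""
    spectrum = np.abs(np.fft.rfft(signal))
    amp = lambda f: spectrum[int(round(f * len(signal) / fs))]
    return 100.0 * np.sqrt(sum(amp(n * f0) ** 2 for n in harmonics)) / amp(f0)

# A 50 Hz tone with a 2nd harmonic at 10% of the fundamental's amplitude:
fs = 48000
t = np.arange(fs) / fs  # exactly one second -> 50 whole cycles of 50 Hz
sig = np.sin(2 * np.pi * 50 * t) + 0.1 * np.sin(2 * np.pi * 100 * t)
print(round(thd_percent(sig, fs, 50), 1))  # -> 10.0
```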

The percentage of THD allowed by CEA-2010 isn't a single fixed figure, because the method sets different thresholds for different harmonics (the higher and more audible the harmonic, the lower the threshold). But on average, the THD allowed by CEA-2010 works out to around 30 percent.

"WHAT???" I can imagine some enthusiasts thinking right now. (Sorry if I just made you spit a mouthful of coffee onto your computer.) That's because the distortion numbers we're used to seeing are for electronics, such as amplifiers and preamps, that operate across the entire audio band. For an amplifier, one percent THD is a lot; a generally accepted threshold of audibility for amp and preamp distortion is 0.5 percent. But for subwoofers, it's 10 percent. I've confirmed this a hundred times over when doing CEA-2010 tests. Usually I can start to hear the distortion when the volume is a couple of clicks down from what's required to break one of the CEA-2010 thresholds, which typically means about 10 percent THD.

Why is distortion so much harder to hear in subwoofers? The answer is actually quite simple. Harmonic distortion is the creation of false harmonics--due to, for example, an amplifier being driven beyond the voltage capability of its power supply, or a speaker cone being pushed to the limits of what its suspension will allow. This converts what should be a nice, clean waveform into a "clipped" waveform, with a flat peak where there should be a rounded peak. The more square the waveform becomes, the more spurious high-frequency energy is produced. You can hear an example of this in the YouTube video below:

If you're testing an amplifier with a one-kilohertz tone, the second harmonic is at two kHz, the third is at three kHz, and so on. This is right in the area where the human ear is most sensitive. But if you're testing a subwoofer with a 50-Hz tone, the loudest distortion harmonics are at 100, 150, and 200 Hz, a region where the human ear isn't very sensitive. Thus, what would be extremely audible distortion in an amplifier may be completely inaudible in a subwoofer.
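You can see both points at once (the clipping and where its byproducts land) in a quick numpy experiment of my own devising. Flattening just one side of a 50-Hz tone, roughly what happens when a driver runs out of travel in one direction, dumps energy at 100, 150, and 200 Hz:

```python
import numpy as np

# One second of a 50 Hz tone driven 50% past the "rails" on the positive
# side only. (My own illustration, not part of the CEA-2010 test.)
fs = 48000
t = np.arange(fs) / fs                  # exactly 50 whole cycles of 50 Hz
clipped = np.minimum(1.5 * np.sin(2 * np.pi * 50 * t), 1.0)

spectrum = np.abs(np.fft.rfft(clipped))
bin_of = lambda f: int(round(f * len(clipped) / fs))
fund = spectrum[bin_of(50)]
for freq in (100, 150, 200):            # 2nd, 3rd, and 4th harmonics
    pct = 100.0 * spectrum[bin_of(freq)] / fund
    print(f"{freq} Hz: {pct:.1f}% of the fundamental")
```

Even with the tone driven well past clipping, all of the loudest byproducts land in a region where our hearing is comparatively forgiving.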

Some subwoofer manufacturers might seek to limit their products' distortion to, say, five percent. On the surface, this might seem like a good idea. However, the way they're achieving this low number is by setting the limiter in the subwoofer's amplifier conservatively. This can create its own set of problems because the other speakers in the system aren't governed by a limiter unless they're active (i.e., internally amplified) designs. On a loud action-movie passage, the rest of the speakers might be blasting away, while the subwoofer reins itself in. So, everything above 80 Hz or so is playing at a high level, but everything below 80 Hz is being clamped by the subwoofer's limiter. You'll get thin sound, which is far more sonically objectionable than a little bit of subwoofer distortion.

Of course, if we're talking about a big subwoofer that delivers 118 dB at 50 Hz while staying under five percent distortion, you probably wouldn't hear the problem I describe above. However, unless you're a total bass maniac, you're probably not going to play the sub much louder than that anyway, so it seems to me there's little point in setting the limiter so conservatively--except for the purpose of publishing really low distortion numbers.

Bottom line: When it comes to subwoofer output, the dBs are mostly what matters.

Additional Resources

• The Pros and Cons of Multiple Subwoofers at HomeTheaterReview.com.

• What's the Ideal Speaker Driver Configuration? at HomeTheaterReview.com.

� Check out our Subwoofers category page to read our latest reviews.