The “bits is bits” argument is centered on a digital data stream, which is different from a digital music stream due to one fundamental component: time. We can’t emphasize this one enough—data and a timed music stream are not the same.
Isaac Markowitz of AudioQuest likes to put it slightly differently. The ones and zeroes, he says, are only half the information. The other half of the information is how far apart each of those ones and zeroes needs to be.
One second of stereo CD-quality music contains roughly 1.4 million bits (44,100 samples × 2 channels × 16 bits per sample = 1,411,200). These bits can be copied again and again with no errors whatsoever: copying one second of music data can take 0.001 of a second, or 10 minutes, or even 0.00001 of a second for the first half and two years for the second, and it'll always end up with the same bit-perfect result. Time is irrelevant to a data stream.
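That arithmetic is easy to check for yourself; a quick sketch:

```python
# Bits in one second of stereo CD-quality audio.
SAMPLE_RATE_HZ = 44_100   # samples per second, per channel
CHANNELS = 2              # stereo
BITS_PER_SAMPLE = 16      # 16-bit PCM

bits_per_second = SAMPLE_RATE_HZ * CHANNELS * BITS_PER_SAMPLE
print(bits_per_second)        # 1411200 bits, i.e. roughly 1.4 million
print(bits_per_second / 1e6)  # 1.4112 megabits per second
```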
Everything changes when you want to play that one second of music. Now those 1.4 million bits need to be marched into a DAC at intervals of exactly 1/44,100th of a second. There are no resends and no pauses; once you hit "play," it's go-time.
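A quick back-of-the-envelope check shows just how tight that per-sample deadline is:

```python
# The playback deadline: one sample period at CD quality.
SAMPLE_RATE_HZ = 44_100

period_s = 1 / SAMPLE_RATE_HZ          # seconds between samples
print(round(period_s * 1e6, 2))        # 22.68 microseconds per sample
```

Every sample has to arrive within that roughly 23-microsecond window, a million and a half times a second, for the whole length of the song.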
If a DAC doesn't receive a piece of data in time, it simply estimates what that sample might have been, a process called interpolation. This is textbook distortion: a change in the signal from the original during processing. How much interpolation is happening in your DAC? Do you know? You won't, until you remove sources of noise and your music sounds better.
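Real DACs use their own, often proprietary, reconstruction filters, but the idea behind interpolation can be sketched with the simplest possible version: guessing a lost sample from its neighbors. The function and values below are purely illustrative, not any actual DAC's algorithm.

```python
# Minimal sketch of interpolation: if one sample never arrives,
# estimate it from the samples on either side. Real DACs use far
# more sophisticated filters; this linear average is an illustration.
def interpolate_missing(samples, missing_index):
    prev = samples[missing_index - 1]
    nxt = samples[missing_index + 1]
    return (prev + nxt) // 2

stream = [100, 200, None, 400]   # the sample at index 2 was never received
estimate = interpolate_missing(stream, 2)
print(estimate)                  # 300, close to the likely original, but still a guess
```

The estimate may be close, but it is not the recorded signal, which is why interpolation counts as distortion.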
Another form of distortion is jitter. It's hard to transmit and receive a signal that must stay accurate over millions of bits at a steady marching beat of one sample every 22.7 microseconds (and that's just CD quality; imagine 192 kHz, 24-bit music). If the clock runs too fast or too slow, or drifts, the samples reach the DAC out of time, distorting the recreated analog waveform.
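The effect of a wobbly clock can be simulated. The sketch below samples a 1 kHz sine wave with an ideal 44.1 kHz clock and with one whose edges are off by up to 1 microsecond; both the test tone and the jitter magnitude are assumptions chosen for illustration, not measurements of any real device.

```python
import math
import random

# Illustration only: compare samples of a 1 kHz sine taken with an ideal
# 44.1 kHz clock versus a clock whose edges jitter by up to 1 microsecond.
# The jittered clock captures the wrong amplitudes: timing error alone
# changes the values the DAC would reconstruct.
random.seed(0)
FS = 44_100        # sample rate, Hz
FREQ = 1_000       # test-tone frequency, Hz (an assumption)
JITTER_S = 1e-6    # assumed worst-case timing error per clock edge

def sine(t):
    return math.sin(2 * math.pi * FREQ * t)

errors = []
for n in range(FS):                # one second of samples
    ideal_t = n / FS
    jittered_t = ideal_t + random.uniform(-JITTER_S, JITTER_S)
    errors.append(abs(sine(jittered_t) - sine(ideal_t)))

print(max(errors))   # worst-case amplitude error caused purely by timing
```

Even a microsecond of jitter produces a measurable amplitude error on every sample, and the error grows with the frequency of the signal, which is why high-resolution formats are even less forgiving of sloppy clocks.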