OK, everyone sit up straight and pay attention. Class is in session.

The video above is a technical primer on digital audio signal behavior. It’s over 20 minutes long, and it gets quite nerdy, but it covers all the basics of waveform knowledge. So hang in there and watch the whole thing — especially if you have any interest in media production, high-definition audio, file formats for digital music, or even if you’re just curious about how cold digital bits on a CD can arrive in your ears as sweet, savory Jimi Hendrix jams.

Audiophiles in particular should pay attention, because the demonstration goes a long way toward debunking a key myth about digital audio signals — that a digital waveform cannot adequately represent an analog signal because crucial information is lost in the sampling process. In fact, this widespread misconception is what spurred the video tutorial in the first place.
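If you want to poke at the sampling claim yourself before watching, here's a minimal sketch in Python. It demonstrates the textbook Whittaker-Shannon (sinc) interpolation that underlies the video's argument: sample a tone below half the sample rate, then rebuild the waveform *between* the samples. The frequencies and sample counts are illustrative choices of mine, not values from the video.

```python
import numpy as np

# Sample a 1 kHz tone at 8 kHz -- safely below the Nyquist limit (fs / 2 = 4 kHz).
fs = 8000.0           # sample rate, Hz (illustrative)
f = 1000.0            # test tone, Hz (illustrative)
n = np.arange(64)     # sample indices
samples = np.sin(2 * np.pi * f * n / fs)

def reconstruct(t, samples, fs):
    """Whittaker-Shannon (sinc) interpolation at continuous times t (seconds)."""
    n = np.arange(len(samples))
    # Each sample contributes a shifted sinc kernel; their sum is the
    # band-limited signal that passes through every sample point.
    return np.sum(samples * np.sinc(fs * t[:, None] - n), axis=1)

# Evaluate on a 16x finer time grid, restricted to the interior of the
# window where the finite number of samples doesn't truncate the sinc sum.
t = np.arange(16 * 16, 16 * 48) / (16 * fs)
rebuilt = reconstruct(t, samples, fs)
ideal = np.sin(2 * np.pi * f * t)
max_err = np.max(np.abs(rebuilt - ideal))
print(f"worst-case reconstruction error between samples: {max_err:.4f}")
```

The error between sample instants is small and shrinks as the sample window grows — the in-between information was never "lost," which is exactly the point Monty demonstrates on real hardware.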

The demo is hosted by Chris “Monty” Montgomery, creator of the Ogg format and the Vorbis codec, and founder of Xiph.org, a non-profit community dedicated to open-source audio. Last year, Monty made some waves in the digital audio community with a lengthy post shooting down the belief (perpetuated by none other than Neil Young and Steve Jobs) that 24-bit/192kHz files can produce a sound on playback that’s “more pure” than 16-bit files sampled at 48kHz, or even 44.1kHz, which is CD-quality.
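The arithmetic behind those format numbers is worth seeing once. The sketch below uses the standard quantization rule of thumb — each bit of sample depth adds about 6.02 dB (20·log₁₀(2)) of dynamic range — and the Nyquist limit of half the sample rate. These are textbook figures, not numbers taken from Monty's post, and they say nothing about audibility; that's the argument his article makes in full.

```python
import math

def dynamic_range_db(bits):
    # Standard quantization figure: 20 * log10(2^bits), about 6.02 dB per bit.
    return 20 * math.log10(2 ** bits)

print(f"16-bit dynamic range: {dynamic_range_db(16):.1f} dB")
print(f"24-bit dynamic range: {dynamic_range_db(24):.1f} dB")

# Nyquist limit: a sample rate captures frequencies up to half its value.
print(f"44.1 kHz captures up to {44100 / 2:.0f} Hz")  # above the ~20 kHz edge of human hearing
```

So 16-bit already spans roughly 96 dB — more range than any living room playback chain delivers — and 44.1kHz already covers the audible band, which is why the extra bits and samples buy nothing on playback.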

Say what!? Exactly. Which is why you should not only read Monty’s article and watch the video, but also join the discussion on Xiph.org’s wiki. For extra credit, watch the first episode in this series, “A Digital Media Primer for Geeks,” which covers video as well as audio. For extra-extra credit, perform your own experiments. The wiki page contains links to Xiph’s downloadable demos and a list of the equipment needed to reproduce these results at home.