Heebie Jeebies

A.k.a. jitter, that sneaky little digital nasty. It's caused by timing errors in the bitstream. Many people have tried to explain how it sounds using exquisitely detailed descriptors, but they end up saying little more than, "it don't sound no good." Problem is, no one actually knows what jitter sounds like.

So, what is jitter anyway? A digital-to-analog converter (DAC) works by reconstructing the waveform from a stream of bits. These bits are a digital representation, or numeric code, of the sound waves. The playback timing from bit to bit matters for an accurate reconstruction of the electronic signal. Some DACs take the incoming stream and lock to its timing, so timing errors in the feed are passed on in the conversion process. Other DACs re-clock the bitstream, correcting any timing errors present in the incoming stream.

In either case, the timing would have to be way off for jitter to cause any audible problems. (Jitter is measured in picoseconds; a picosecond is a trillionth of a second.) Even in the common consumer DACs found in everyday portable electronics, jitter sits below the noise floor, meaning it's masked by noise in the electronics. To add to the confusion, the audible differences among digital players and DACs, even the most expensive, are mostly attributable to the analog circuits rather than to the DAC chip.
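A quick back-of-the-envelope sketch makes the "below the noise floor" point concrete. The numbers here are illustrative assumptions (a full-scale 10 kHz tone and 100 picoseconds of RMS clock jitter), not measurements of any particular DAC:

```python
import math

# Rough sketch: estimate the error a given clock jitter introduces when
# sampling a full-scale sine, and compare it to the 16-bit quantization
# noise floor. All values are illustrative assumptions.

f = 10_000           # test tone frequency, Hz (high frequency is the worst case)
jitter = 100e-12     # assumed RMS clock jitter: 100 picoseconds

# For a sine of amplitude 1, the maximum slew rate is 2*pi*f, so a timing
# error dt causes an amplitude error of roughly 2*pi*f*dt. Averaging over
# phase gives an RMS error of 2*pi*f*dt / sqrt(2).
jitter_error_rms = 2 * math.pi * f * jitter / math.sqrt(2)
jitter_noise_db = 20 * math.log10(jitter_error_rms)

# 16-bit quantization noise floor relative to full scale: 6.02*bits + 1.76 dB
quant_noise_db = -(6.02 * 16 + 1.76)

print(f"jitter-induced noise: {jitter_noise_db:.1f} dBFS")
print(f"16-bit noise floor:   {quant_noise_db:.1f} dBFS")
```

With these assumptions the jitter-induced noise comes out around 9 dB below the 16-bit quantization floor, which is why it gets masked.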

Oh, but the opinions about digital players, disc readers, and DACs are all over the place. Not everyone could be imagining these dramatic disparities. Could they? Over-sampling, up-sampling, non-over-sampling, synchronous or asynchronous, reclocking, non-reclocking... It's enough to give you a headache. And anyway, bits is bits, no?

Well, all bits are created equal, and logically every DAC should convert the code exactly the same way. In principle, they do. But that's not the whole story of digital playback.

I had rarely given disc readers, DACs, or high-end digital players a second thought until the day I played a new CD on a Sony DVD player. It didn't sound quite the same as it had the previous day, when I played the same CD on my Denon CDP. The CD sounded brighter on the Sony! Were my ears playing tricks on me? I immediately did some A/B comparisons, and to my great surprise, the Denon sounded warmer, smoother, less bright. Hearing it took focused attention, but the difference was large enough to be heard consistently. Mind you, it's not a dramatic, night-and-day contrast, but it is noticeable. All the other variables were the same, so it can only be attributed to the players: both were connected to the same preamp via RCA cables, running to the same amps and the same speakers. On its own, the Sony sounds excellent; in comparison I prefer the Denon, yet I can't say with certainty that it is better.

In the past, I hadn't tried to listen for anything like this, partly because variations from one recording to the next obscure any small discrepancies, making them barely significant, and partly because I'd never had two units in the same system at the same time to make a fair, honest comparison, with one exception. A few years ago I did a lengthy comparison between a Sanyo and a modified Oppo. The conclusion from that test: no audible difference, even though at the time I was expecting to hear something distinct. This time, had I not heard the same CD only the day before, the difference might never have been noticed.

So you say, "Duh! 'toldja so."

This surprise had to be verified. I needed to double-check my ears with some measurements, so I took carefully matched wide-band pink noise readings of the Denon and the Sony. Here are the overlaid graphs.

The bright green and bright red areas show the difference in output of the Sony—green higher, red lower—contrasted with the Denon. The raggedness is due to this being an in-room measurement. Nevertheless, it shows the deviation between the units.

As you can see, from 1.25 kHz through 16 kHz, almost the entire upper midrange and treble, the Sony is across the board 1-2 dB higher. Below 1 kHz it's fairly close, with no broadband deviations standing out. This confirms what I heard: the Sony is in fact brighter. But this kind of change cannot be caused by the DAC. No DAC can reinterpret levels; they are fixed in the code. Of course, this single measurement doesn't account for other differences that may exist in the DACs' behavior, and I'm not going to claim that all DACs convert the code pristinely, precisely, perfectly, in all respects exactly the same way. What this example does show is a linear deviation, very likely an effect of the analog reconstruction filter, and that is what makes the audible difference. This also gels with what Jim Taylor asserts in his book DVD Demystified: that the analog circuits are the primary contributor to the audible differences among DACs. Wouldn't it be informative to see someone take a few DACs, run them through the same analog output stage, then listen and measure the changes? From that experiment we could determine what differences can be heard among the various conversion technologies and conclusively settle whether it's the DAC or the analog stage that matters most.
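The kind of comparison described above can be sketched in code. This is a hypothetical example, not my actual measurement: the "Sony" signal is simulated as the "Denon" pink noise with an assumed +1.5 dB shelf above 1.25 kHz, just to show how per-band level differences between two captures would be computed:

```python
import numpy as np

fs = 48_000
rng = np.random.default_rng(0)

def pink_noise(n):
    # Crude pink noise: shape white noise with a 1/sqrt(f) spectrum.
    white = rng.standard_normal(n)
    spec = np.fft.rfft(white)
    f = np.fft.rfftfreq(n, 1 / fs)
    f[0] = 1.0  # avoid divide-by-zero at DC
    return np.fft.irfft(spec / np.sqrt(f), n)

def shelf(x, f0, gain_db):
    # Frequency-domain high shelf: +gain_db above f0 (illustrative only).
    spec = np.fft.rfft(x)
    f = np.fft.rfftfreq(len(x), 1 / fs)
    g = np.where(f >= f0, 10 ** (gain_db / 20), 1.0)
    return np.fft.irfft(spec * g, len(x))

def band_db(x, lo, hi):
    # Average power in the band [lo, hi), in dB (arbitrary reference).
    power = np.abs(np.fft.rfft(x)) ** 2
    f = np.fft.rfftfreq(len(x), 1 / fs)
    mask = (f >= lo) & (f < hi)
    return 10 * np.log10(power[mask].mean())

n = fs * 4
denon = pink_noise(n)
sony = shelf(denon, 1250, 1.5)  # assumed +1.5 dB treble lift

for lo, hi in [(100, 1000), (1250, 16000)]:
    diff = band_db(sony, lo, hi) - band_db(denon, lo, hi)
    print(f"{lo}-{hi} Hz: Sony minus Denon = {diff:+.2f} dB")
```

A real test would replace the simulated signals with the two recorded captures from the same mic position; the band math stays the same.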

Resources:

EarLevel features no-nonsense, no-rambling articles. Direct link to the Jitters.

Site with a plethora of articles on digital: Digital Domain.

DVD Demystified, Jim H. Taylor, McGraw-Hill, 2006.

© 2008-2021 Parallel Audio all rights reserved