Wednesday, June 17, 2015


I believe that jitter is not as big a problem as most subjectivist audiophiles think.  For one thing, the best science, published in the JAES, suggests it takes jitter in the tens of nanoseconds to be audible--and then only barely.  Meanwhile, typical equipment such as my Sonos Zoneplayers (not normally considered "high end") has about 220 ps of jitter, roughly 45 times below even that barely-audible threshold.  A few months before reviewing and measuring the Sonos Zoneplayer, John Atkinson tested a 2004-vintage dCS stack (about $80,000, or roughly 160 times the cost) and found it had 230 ps of jitter.
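The margin here is easy to check with back-of-the-envelope arithmetic.  A minimal sketch, assuming the audibility threshold is taken at 10 ns (the low end of "tens of nanoseconds" from the JAES work):

```python
# Sanity check on the jitter margin: how far below the audibility
# threshold is the measured Sonos jitter?
threshold_s = 10e-9    # 10 ns, assumed low end of the audibility threshold
jitter_s = 220e-12     # 220 ps, measured Sonos Zoneplayer jitter

margin = threshold_s / jitter_s
print(f"Jitter is about {margin:.0f}x below the audibility threshold")
# prints "Jitter is about 45x below the audibility threshold"
```

If the true threshold is higher than 10 ns for real music (as some of the JAES test signals suggest), the margin only grows from there.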

I found a related and interesting thread at DIYAudio about jitter.  The OP (now apparently banned) argued that S/PDIF itself is not the problem (well, actually it is part of it); the problem is poor receivers.  He showed that several different receivers had very different levels of recovered jitter on spectral graphs (he did not reduce it to a single number).  Later posters claimed his worst case (a NOS DAC with 8 parallel TDA1543s, which the OP said was an audiophile favorite) looked bad mainly because of the lack of digital filtering, and that jitter had little to do with it.

Anyway, one poster in this thread rightly points out how simple it is to check the termination of an S/PDIF line.  Measure the voltage unterminated with a scope, then connect a 75 ohm load; the voltage should drop by 50%.  A receiver can be checked the same way, by confirming that connecting it causes the voltage to drop 50%, and its actual input impedance can be estimated from how much the voltage does drop.  This poster is somewhat concerned about jitter, pointing out that when an analog signal is turned into digital, timing is everything.  That's a somewhat weak argument IMO, but his better argument is that impedance mismatches and the like degrade the slope of the digital signal's edges, and therefore clock recovery.  It is not unimportant, even if downstream clocks and servos can mostly repair the damage.
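The impedance estimate falls straight out of the voltage-divider relation.  A minimal sketch, assuming the source behaves as an ideal driver with a 75 ohm series output impedance (the S/PDIF coax standard); the function name and the example voltages are mine, not from the thread:

```python
# The source's output impedance and the receiver's input impedance form
# a voltage divider: v_loaded = v_open * z_load / (z_source + z_load).
# Solving for z_load lets us back the receiver's impedance out of the
# two scope measurements described above.

def estimate_load_impedance(v_open, v_loaded, z_source=75.0):
    """Estimate receiver input impedance in ohms.

    v_open   -- voltage measured with the line unterminated
    v_loaded -- voltage measured with the receiver connected
    z_source -- assumed source output impedance (75 ohms for S/PDIF)
    """
    return z_source * v_loaded / (v_open - v_loaded)

# A properly matched 75 ohm receiver drops the voltage by exactly half:
print(estimate_load_impedance(2.0, 1.0))  # prints 75.0
# A smaller drop implies a higher, mismatched input impedance
# (here 75 * 1.2 / 0.8, roughly 112.5 ohms):
print(estimate_load_impedance(2.0, 1.2))
```

The same divider logic explains the pass/fail check: if the voltage drops by anything other than half, the receiver is not presenting 75 ohms.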
