Thursday, July 16, 2020

miniDSP digital I/O adds clipping

I like the fact that I was able to buy general purpose IIR and FIR DSP modules with AES digital I/O for my living room audio system.  AES is the kind of I/O I mostly use because of its flexibility, extensibility, and robustness: I can chain endless digital processors in series with essentially zero added noise and distortion.  And because AES is balanced, it is immune to ground loops, which can otherwise be problematic in large complicated systems.

These are the miniDSP OpenDRC-DI units.  ("DI" means "Digital Interface.")  My system uses 3 of them, one for the subwoofer, one for the electrostatic panels, and one for the supertweeters.  Each one is loaded with a plugin that is programmed to perform my crossover functions, currently using IIR type DSP only, but with a more advanced FIR rollout intended real soon now.  The FIR approach will permit me to have what is impossible in the analog domain: phase linear high order crossovers.

However, I have never liked the idea that these OpenDRC-DI modules use ASRC (Asynchronous Sample Rate Conversion) digital interfaces.  In my analysis, this kind of interface is only preferable for the endpoints of a digital chain--the DACs.  With intermediate digital processors, like the OpenDRC-DI, the jitter at their inputs is converted into changes in the digital sample values (THD and IM) which cannot be removed from the signal later.  Meanwhile, putting digital modules with synchronous interfaces in series has essentially no downside.  Any number of modules in sequence is more or less equivalent to a single module, whose performance is largely determined by the clock of the source device and the transmission protocol itself, and which can be de-jittered at the ultimate endpoint (the DAC).  All the ultimate DAC has to do is approximate the clock of the source device; everything else in between can be washed out at that point (using buffering or ASRC).

I would have VASTLY preferred that the OpenDRC-DI use synchronous interfaces and adapt to the sampling rate of the input signal.  This is what my Tact RCS 2.0 preamplifier does, and also what the Behringer DEQ 2496 does (I still use the latter for general purpose equalization, limiting, and level/RTA displays).  I use both of these with full digital I/O so analog conversions are nowhere in sight (until the final DACs, which connect straight to the amplifiers).  The sequence of digital processors is now

Source Digital -> Tact -> miniDSP -> Behringer -> DAC

If I had my druthers, the Behringer DEQs would have programmable FIR capability, and then I wouldn't need the miniDSPs.  But I have to live with the devices I can buy and afford, because scratch building the thing I want would take years.  Or maybe I could just have one big box that performs all the functions of all of these devices, but it would need 3 separate DSP paths and 3 digital outputs for the 3 ways of my system.  I use the Tact only to select input devices, adjust system level and balance, and test polarity for the whole signal; then the signal gets divided 3 ways by the 3 OpenDRC-DI units.

I believed the THD/IM caused by intermediate ASRC in the miniDSP would be nearly unmeasurably small, so it wasn't going to be a problem, I just would have preferred something different for intellectual/aesthetic reasons (I like things to be as "good" as possible, in objective terms like THD and S/N, whether I can provably hear a difference or not).

But now I have discovered a problem that is less than trivial.  Clipping !!!

[Update July 17.  Apparently it was wrong to blame the ASRC for the clipping.  It appears, in fact, that the miniDSP crossovers introduce a higher peak level.  It is therefore necessary to compensate the level for that, which is weird, but not the kind of nonlinear evil I was thinking of.  The original experiments were wrong because I was testing the "no crossover" possibility by selecting setting #4, and in fact setting #4 was not set to "no crossover."  Today I repeated the experiments with a laptop connected to the miniDSP so I could change the crossover settings onscreen.  I may post this next.  My earlier conclusions, unedited below, are therefore wrong.  And the crossover doesn't even have to be LR8 to have this problem; LR4 does it just as well.  I still don't like ASRC, but I was wrong to blame the clipping on ASRC.]

I discovered this problem because the new steeper LR8 crossovers I am using allow me to play much louder than ever before without any strain from my speakers.  So I was playing one day close to 0dB (which means about 95 dB peak SPL, which is loud but not harmful), and then I discovered the clipping problem, which appears to be entirely caused by the ASRC.

I was playing Crime of the Century by Supertramp at -0.6dB (on the Tact volume control) and noticed that the red lights on the Behringer DEQ for the midrange way were flickering.  Taking a closer look at the peak-holding level display on the Behringer for its Source Input (which comes from the OpenDRC-DI crossover for the middle way), I saw it was showing CLIP as the peak level.

I tried setting the Tact level to -2dB and still got clipping.  Then I tried -4dB and still got clipping.  Finally I concluded that as much as 6dB of ASRC headroom might be needed, and I have never seen the clipping happen with the level set to -6dB or lower.  I believe that is a safe value, and I wouldn't want to use less attenuation because some other album might push it even farther than Crime of the Century.
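The arithmetic behind that 6dB figure can be sketched in a few lines of Python (the exact linear factors are my own computation, not anything from the miniDSP documentation):

```python
import math

def db_to_linear(db):
    """Convert a level change in dB to a linear amplitude ratio."""
    return 10 ** (db / 20)

def linear_to_db(ratio):
    """Convert a linear amplitude ratio to a level change in dB."""
    return 20 * math.log10(ratio)

# Attenuating the volume control by 6dB scales every sample by about 0.5,
# so an interpolated peak of up to twice full scale no longer clips.
atten = db_to_linear(-6)        # ~0.501
headroom_peak = 1.0 / atten     # ~2.0: the largest overshoot now tolerated

print(round(atten, 3), round(linear_to_db(headroom_peak), 2))
```

In other words, -6dB on the volume control buys roughly a factor-of-two safety margin above full scale for whatever peaks the processing creates.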

It may be hard for digital novices to understand how this is even happening.  If the digital audio data were transmitted without change through the system, this COULD not be happening (I think, though I am not entirely sure of the consequences of using an LR8 crossover--that might be partly involved, although my experiments have already demonstrated it cannot be the entire story, as I will describe below).

What's happening is that the ASRC is "interpolating" new digital data points from the existing signal.  The interpolated points are in reference to the internal clock of the miniDSP, whereas the input signal data points are created by the clock of the source device.  The two clocks not only have small amounts of random jitter between them, they may also be at altogether different sampling rates.  This is how ASRC works.  The interpolation is not linear, but uses higher order (curved) functions.  So if you are interpolating between two points that are nearly at the maximum level, the interpolated point on the curve in between may actually be above maximum level.  Hence, clipping.
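Real sample rate converters interpolate with long polyphase filters, not the simple cubic below, but a toy Catmull-Rom interpolator is enough to show how a point computed between two near-full-scale samples can land above full scale (the sample values here are made up for illustration):

```python
def catmull_rom(p0, p1, p2, p3, t):
    """Evaluate a Catmull-Rom cubic between p1 and p2 at fraction t in [0, 1]."""
    return 0.5 * (
        2 * p1
        + (p2 - p0) * t
        + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t ** 2
        + (3 * p1 - p0 - 3 * p2 + p3) * t ** 3
    )

# Four consecutive samples near the top of a wave, all within full scale (1.0):
samples = (0.90, 0.99, 0.99, 0.90)

# Interpolate a new point halfway between the two 0.99 samples.
mid = catmull_rom(*samples, 0.5)
print(mid)  # ~1.001 -- above full scale, so a fixed-point output must clip
```

The curve bulges above its two neighboring samples because it is trying to follow the crest of the waveform, and in a fixed-point digital link there is simply no code available above full scale.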

This problem of going above maximum level is well known (or should be) to the designers of DACs.  Albums are often compressed as far as they can be compressed.  During playback, most digital to analog converters use oversampling and digital filters for reconstruction, and these reconstruction filters also interpolate between the input data points.  And because the reconstruction filter passes a smooth curve through the sample points rather than connecting them with straight lines, you may once again have interpolated values that go above the maximum level.  These are called intersample overs, and are common in highly compressed digital music.  Well designed DACs already handle this.
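The textbook worst case is the repeating sample pattern +1, +1, -1, -1: a full-scale sine at one quarter of the sampling rate whose sample points all sit exactly at full scale, 45 degrees away from the true peaks.  A truncated ideal (sinc) reconstruction, sketched below, recovers a true peak of sqrt(2), about 3dB over full scale:

```python
import math

def sinc(x):
    """Normalized sinc, the ideal bandlimited interpolation kernel."""
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def sample(n):
    # Repeating full-scale pattern +1, +1, -1, -1 (sine at fs/4, offset 45 deg).
    return (1.0, 1.0, -1.0, -1.0)[n % 4]

def reconstruct(t, span=2000):
    """Truncated ideal sinc reconstruction of the sampled signal at time t."""
    return sum(sample(n) * sinc(t - n) for n in range(-span, span + 1))

# The true waveform midway between two +1 samples:
peak = reconstruct(0.5)
print(peak)  # ~1.414 (sqrt(2)), about +3dB over full scale
```

No sample in the stream exceeds full scale, yet the waveform those samples describe does, which is exactly why an oversampling reconstruction filter without extra headroom will clip on such material.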

But the digital transmission system is not designed to handle this.  There is a maximum peak output level which cannot be exceeded.  And that is equivalent to the peak level at "0dB."
