Saturday, November 2, 2024

"Scoping" cables?

 A friend wants to know if it would be useful to examine high end cables on one of my oscilloscopes.  I tell him my equipment is inadequate for such purposes; it would be much better to have a 500 MHz or 1 GHz scope.  He incredulously asks me why that is needed for audio frequencies.  It's hard to answer that question in a phone text, so I'm doing it here.

There are a number of different aspects to this, so I may seem to go all over the place.

1.  Macro vs Micro parameters

Oscilloscopes and voltmeters measure what we could call the macro parameters of audio transmission.  Such parameters could include DC voltage and risetime: basically, things having to do with the large outline of the signal transmission, such as bandwidth and phase response.

Micro parameters like noise and distortion are not so easily measured with oscilloscopes and voltmeters.  The best thing to do is to store the signal and apply an algorithm to compute them, as the program RMAA does (but you'd probably need something much better for audio cables...*)
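To make that concrete, here is a minimal sketch of the kind of computation such a program automates: capture a test tone, take an FFT, and compare the power at the fundamental against everything else.  This is not RMAA's actual algorithm, and the tone, distortion, and noise figures are all invented for illustration:

```python
# Minimal sketch of estimating THD+N from a stored test tone: the kind
# of "micro parameter" a scope won't show directly.  The tone, the
# distortion level, and the noise level are all invented for illustration.
import numpy as np

fs = 48000                       # sample rate, Hz
f0 = 1000                        # test tone fundamental, Hz
t = np.arange(fs) / fs           # one second of samples

# Pretend this came off a sound card: tone + a bit of 3rd harmonic + noise
x = (np.sin(2 * np.pi * f0 * t)
     + 1e-4 * np.sin(2 * np.pi * 3 * f0 * t)   # -80 dB 3rd harmonic
     + 1e-5 * np.random.randn(fs))             # low-level noise floor

# Window the capture and look at its power spectrum
w = np.hanning(len(x))
spec = np.abs(np.fft.rfft(x * w)) ** 2
freqs = np.fft.rfftfreq(len(x), 1 / fs)

# Fundamental power = bins near f0; everything else (minus DC) is THD+N
near_f0 = np.abs(freqs - f0) < 10
p_fund = spec[near_f0].sum()
p_rest = spec[~near_f0].sum() - spec[0]

print(f"THD+N: {10 * np.log10(p_rest / p_fund):.1f} dB")
```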

Noise can come from many sources, both internally generated and externally picked up.  How well audio cables accept or reject noise from EMI and RFI is in fact very important, but often ignored.

2.  How do audio cables differ?

Decent audio cables don't generally differ that much in their macro parameters.  The bandwidth of a typical RCA-terminated coax cable is around 10 MHz or greater.  Generally you can use such cables to transmit composite video, which has sidebands up to 6 MHz (though good video cables will often have 10-100 times that bandwidth for the best performance).  I have often used regular audio cables for both video and digital audio, and it has nearly always worked (though I generally like to get the proper cables).

It's not surprising, therefore, that many engineers and audio scientists believe "audiophile" cables are not necessary.  There's no way, they might say, that you need more than the 10 MHz bandwidth of typical interconnects.

But they aren't generally considering micro parameters.

3.  But I've seen scope comparisons of cables and they showed differences.  How is that?

I've seen such photographs too, though I can't remember the exact details.  I do remember that when I really checked one out, they were using a very high bandwidth scope to resolve differences at very small timescales.  Small timescales mean high bandwidths, and they were using 500 MHz or 1 GHz scopes.

Now it should be remembered that you need a scope with about 10 times the bandwidth of the signal you are trying to view.  If you view a 10 MHz square wave on a 10 MHz scope, you may see a sine wave attenuated by about 3 dB, because that's how bandwidth is defined.  You see none of the harmonics that make the wave square.

The square sides of a square wave are created by a series of harmonics which in principle go up to infinity.  That's why they needed very high bandwidth scopes to see details of very small duration (representing high frequency effects) in "audio frequency" square waves.
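A little arithmetic makes this vivid.  A square wave is the sum of its odd harmonics at 1/n amplitudes, so a sketch like the following (using an idealized brick-wall cutoff rather than a real scope's gradual rolloff) shows why a 10 MHz scope reduces a 10 MHz square wave to a bare sine:

```python
# A square wave is the sum of odd harmonics with 1/n amplitudes.
# Band-limit a 10 MHz square wave to 10 MHz (idealized brick wall)
# and only the fundamental sine survives -- which is roughly all
# a 10 MHz scope can show you.
import numpy as np

f0 = 10e6                            # fundamental: 10 MHz

def square_wave(t, f0, f_max):
    """Fourier-series square wave keeping odd harmonics up to f_max."""
    x = np.zeros_like(t)
    n = 1
    while n * f0 <= f_max:
        x += (4 / (np.pi * n)) * np.sin(2 * np.pi * n * f0 * t)
        n += 2
    return x

t = np.linspace(0, 2 / f0, 2000)     # two periods
wide = square_wave(t, f0, 1e9)       # ~1 GHz bandwidth: corners visible
narrow = square_wave(t, f0, 10e6)    # 10 MHz bandwidth: a bare sine

for f_max, label in [(1e9, "1 GHz "), (10e6, "10 MHz")]:
    kept = (int(f_max // f0) + 1) // 2
    print(f"odd harmonics within {label} bandwidth: {kept}")
```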

4.  But I'm sure I don't always get 10 MHz bandwidth.  I can hear the rolloff.

Yes, but such issues are likely related to the input and output impedances involved, in combination with either the capacitance (for an interconnect) or the inductance (for a speaker cable).  These macro parameters can form a low pass filter.

That isn't really a fault of the cable; it's a mismatched system.  Audio cables are too short to show what are called Transmission Line Effects at anything like audio frequencies.  Such effects can be shown for audio cables running many miles, but not for a few feet.

So audio cables operate as a lumped element system.  You can easily calculate what the response will be if you accurately know the output impedance, the input impedance, and the cable capacitance and/or inductance.
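For an interconnect, that calculation is just a one-pole RC low-pass formed by the source's output impedance and the cable's capacitance.  Here is a back-of-envelope sketch; the impedance and cable figures are illustrative examples, not measurements of any particular gear:

```python
# Back-of-envelope: an interconnect driven from a source impedance
# forms a one-pole RC low-pass with the cable's capacitance.
# The impedances and cable figures below are examples, not measurements.
import math

def corner_hz(r_out_ohms, pf_per_foot, feet):
    """-3 dB corner of the output-impedance / cable-capacitance filter."""
    c_farads = pf_per_foot * feet * 1e-12
    return 1 / (2 * math.pi * r_out_ohms * c_farads)

# Typical solid-state source (~100 ohms out), 6 ft of 30 pF/ft cable
print(f"solid state: {corner_hz(100, 30, 6) / 1e6:.1f} MHz")

# Hypothetical high-impedance tube source (~10k ohms out), same cable
print(f"tube source: {corner_hz(10_000, 30, 6) / 1e3:.0f} kHz")
```

With an ordinary solid-state output impedance the corner lands around 9 MHz, consistent with the bandwidth figures above; with a hypothetical 10k output impedance it falls to roughly 88 kHz, still above the audio band but close enough that cable capacitance starts to matter.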

So to "match" cables, for example, you would select two having identical capacitance or inductance.

Just by doing that, you've matched the macro parameters.  No need to use a scope.

You can do that on a scope, too, but what you are showing, strictly speaking, is not the performance of the cable itself but of the system including the input and output impedances.  It would have to be measured connected to the intended input and output equipment of interest.

I've measured many 'plain vanilla' interconnect cables and they tend to have between 11 and 60 pF of capacitance per foot, with 30 pF being typical.  That is not going to cause any issues for most audio equipment, but sometimes audiophiles have equipment with unusually high output impedances or unusually high input impedances (in some tube equipment, for example) where there might be issues.

With speaker cables, the impedances are all more critical, and I don't have any inductance meters.  A scope would be the way to see the macro parameters.  But with reasonably well matched impedances, you aren't likely to see any differences among cables at audio frequencies.  The way that speaker cables interact with amplifier and speaker loads is through very tiny effects.  Say, the cable impedance may be 0.05 ohms at one frequency and 0.04 ohms at another.  Against an 8 ohm load, that would cause a very subtle change in timbre, but not something you are going to easily see on a scope.  It's a minuscule, but potentially audible, effect.
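Using those figures, the arithmetic is plain voltage-divider stuff, and it shows just how small the effect is:

```python
# Using the figures above: how much level change does 0.05 ohm vs
# 0.04 ohm of series cable impedance cause against an 8 ohm load?
import math

def divider_loss_db(r_cable_ohms, z_load_ohms=8.0):
    """Voltage-divider loss of a series cable impedance into the load."""
    return 20 * math.log10(z_load_ohms / (z_load_ohms + r_cable_ohms))

loss_a = divider_loss_db(0.05)   # loss at one frequency
loss_b = divider_loss_db(0.04)   # loss at another frequency
print(f"0.05 ohm: {loss_a:+.4f} dB")
print(f"0.04 ohm: {loss_b:+.4f} dB")
print(f"frequency response ripple: {abs(loss_a - loss_b):.4f} dB")
```

That comes out to about a hundredth of a dB of frequency-dependent level change: real, but far below what a scope trace will reveal.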

*5.  You're not trying to tell me you don't believe in cable differences?

In writing this post, I'm trying to remain open minded and non-judgmental as to what you or other people can hear.

Frankly, I don't believe there are significant audible differences among decent audio cables.  You could probably get by with plain vanilla Radio Shack cables in most cases (and I still have boxes of them**).

If there are significant differences, they are almost certain to be found among the micro parameters, not the macro parameters.  At audio frequencies, macro parameter differences just aren't there; you can't easily measure any, which is why many engineers and audio scientists believe that all decent cables are the same.

**Nowadays I try to go with what is objectively best anyway, just in case it might be important.  So I have PTFE coated speaker wires, because PTFE has the lowest Dielectric Absorption of all solid insulating materials, among other considerations.  I have also deliberately chosen my speaker cable gauge (according to some principles that are too complicated to explain here, but which end up making 12 gauge optimal for a twisted pair).  But are such things necessary?  I doubt it.  For most interconnects, I use BJC-1 from Blue Jeans because it has the lowest capacitance and best shielding.  It also uses a polyethylene dielectric, which is nearly as good as PTFE, though that might not even be important.

Dielectric absorption, in which an insulator operates as a battery, is largely a low frequency and low level phenomenon, something you are generally not going to see on a scope unless you test with DC levels and very high resolution measurements.  There is very little difference among the three currently popular materials (PTFE, FEP, and polyethylene) because they are all so good; they are largely unaffected by electromagnetic fields at the molecular level.  Vinyl is a crappy dielectric, but probably adequate for audio purposes; I used vinyl speaker cables until around 15 years ago.
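If you want a feel for what "operating as a battery" means, the standard way to model dielectric absorption is as a slow series-RC branch hidden inside the capacitance.  Here is a toy simulation with invented component values (the 2% branch is roughly vinyl-grade; PTFE's would be far smaller), showing the voltage creep back after the capacitance has been shorted to zero:

```python
# Toy model of dielectric absorption: a slow series-RC "memory" branch
# hidden inside the capacitance.  Soak it at 1 V, short the terminals,
# then watch the voltage creep back.  All values are invented, and the
# 2% branch is roughly vinyl-grade -- PTFE would be orders better.
C0 = 1.0e-6      # main capacitance, 1 uF
C1 = 0.02e-6     # absorption branch capacitance (2% of C0)
R1 = 10e6        # absorption branch resistance (slow time constant)
dt = 1e-3        # simulation time step, 1 ms

v0, v1 = 1.0, 1.0            # fully soaked: both capacitances at 1 V

# Short the terminals for 10 ms: v0 forced to 0, the branch barely sags
for _ in range(10):
    v0 = 0.0
    v1 -= dt * v1 / (R1 * C1)

# Open circuit: the memory branch slowly recharges C0 through R1
for step in range(1, 1001):
    i = (v1 - v0) / R1           # current out of the memory branch
    v0 += dt * i / C0
    v1 -= dt * i / C1
    if step % 200 == 0:
        print(f"t = {step * dt:.1f} s: terminal voltage = {v0 * 1e3:.1f} mV")
```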

But one of the worst things about old speaker cables is copper wire corrosion, which most frequently occurs with vinyl cables.  Sufficient corrosion can cause distortion, potentially audible, but again not measurable on a scope.  (And maybe you'd even like the effect, if you prefer a "fuller" sound.)

6.  Haven't you heard cable differences?  Here I can show you right now...

Yes, I have sometimes heard 'differences.'  However, not under circumstances that I would consider a valid experiment.

A valid audio experiment includes these details:

1) Level matching of all sources (etc.) to 0.1 dB.  You must have perfectly repeatable audio levels, such as with digital controls or stepped attenuators, rather than continuously variable controls.  Tiny level differences (as small as 0.25 dB in my experience) can change the quality of a sound without changing its apparent loudness.

2) Listening position and environment must not change.

3) Double blind testing should be done, where neither the experimenter nor the subject knows which is which, along with a suitable protocol such as ABX.  This is especially true if any amount of time passes between one audition and the next, because then the comparison relies on a fading memory.  Repeat trials are necessary until guessing could explain the results less than one time in twenty (p < 0.05); a quick way to compute that is sketched below.
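For the p < 0.05 criterion, the computation is an exact binomial test against guessing.  The trial counts here are just examples:

```python
# Exact binomial test for an ABX run: under the null hypothesis the
# listener is guessing, so correct answers are Binomial(n, 0.5).
from math import comb

def abx_p_value(correct, trials):
    """One-sided P(scoring >= `correct` out of `trials` by pure guessing)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# Illustrative scores: 11/16 could easily be luck, 12/16 passes p < 0.05
for correct, trials in [(11, 16), (12, 16), (14, 16)]:
    p = abx_p_value(correct, trials)
    verdict = "significant" if p < 0.05 else "not significant"
    print(f"{correct}/{trials} correct: p = {p:.3f} ({verdict})")
```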

Without these controls, I don't consider any listening test to be definitive, just a guess.  Peter Montcrief coined the term Golden Eared Subjective Reviewer (GESR) to describe many well known audiophile reviewers (not that he didn't have his own faults).

Listening tests done without such controls can still be fun, so long as people don't take them very seriously, and they should not.

