Wednesday, May 4, 2016

Speaker Cable Resistance vs Inductance

Update:  I've found the Audioholics page where they actually measure (not just simulate or calculate) the high frequency response of several different cables.  As a result of this, I'm no longer going to argue that thinner zip cord has lower inductance or better high frequency response than thicker zip cord (as I did on the previous version of this page, based on an incorrect calculation).

This information is far better than what I previously had at hand, and I'm revising my conclusions.  (When you get new data which leads to different conclusions, what do you do?)

Their results indicate that:

The inductance of standard zip cords does cause measurable high frequency loss; all zip-type cables show significant loss from inductance at 20kHz, so I was right about that much. However, the loss is sufficiently small, smaller than the simple-minded calculations I was using previously suggested, that I wouldn't much worry about it in 10 foot lengths and for gauges as thick as 10 gauge. At 50 foot lengths it does become more important, and there I'd suggest 4-cross cables (as I used to always suggest for thicker gauges). As standard 16 gauge wire was not tested, I can't directly compare, but it looks like the high frequency loss in 16 gauge zip cord is no better than in 10 gauge, and I'm getting a sense now that inductance actually decreases slightly as you go to thicker zip gauges, though I haven't yet found a useful table comparing multiple gauges of identical zip design. (Previously I had the incorrect idea that inductance doubled as you went to thicker gauges, and that was clearly wrong.)
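
To put rough numbers on this, here's a little Python sketch of the effect. The per-foot figures are my own assumptions (about 0.2uH per foot of loop inductance for ordinary zip cord, and about 2 milliohms per foot of loop resistance for 10 gauge copper), not Audioholics' measured values:

```python
import math

# Assumed per-foot figures (my guesses, not Audioholics' measurements):
# ~0.2 uH/ft loop inductance for ordinary zip cord, and ~2 mohm/ft loop
# resistance (both conductors) for 10 AWG copper.
L_PER_FT = 0.2e-6   # henries per foot
R_PER_FT = 0.002    # ohms per foot

def loss_db(length_ft, freq_hz, z_load=4.0):
    """Insertion loss of the cable into a resistive load, in dB."""
    r = R_PER_FT * length_ft
    x = 2 * math.pi * freq_hz * L_PER_FT * length_ft
    return 20 * math.log10(abs(complex(z_load + r, x)) / z_load)

for ft in (10, 50):
    dc, hf = loss_db(ft, 0), loss_db(ft, 20_000)
    print(f"{ft} ft: DC {dc:.3f} dB, 20 kHz {hf:.3f} dB, "
          f"inductive penalty {hf - dc:.3f} dB")
```

With those assumptions, the inductive penalty beyond the DC loss comes out around 0.02dB at 10 feet into 4 ohms, consistent with the Monster figures discussed below, but grows to nearly 0.4dB at 50 feet.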

What's going on?  The low frequency inductance doesn't increase as much as I had previously believed for the larger gauges, and, peculiarly, skin effect works to reduce, rather than increase, inductance per se; this may be a more powerful effect for larger gauge cables.
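
The internal inductance part of this is textbook material: at DC a round wire contributes mu0/8pi, about 50nH per meter per conductor regardless of gauge, and skin effect squeezes that contribution out once the wire radius exceeds the skin depth, which happens at lower frequencies for thicker wire. A quick sketch (radii are standard solid-wire values):

```python
import math

RHO_CU = 1.68e-8          # copper resistivity, ohm*m
MU0 = 4 * math.pi * 1e-7  # permeability of free space, H/m

def skin_depth_m(freq_hz):
    """Depth at which current density falls to 1/e of its surface value."""
    return math.sqrt(RHO_CU / (math.pi * freq_hz * MU0))

delta = skin_depth_m(20_000)
print(f"skin depth in copper at 20 kHz: {delta * 1e3:.2f} mm")

# Solid-wire radii in meters; a ratio well above 1 means the current
# hugs the surface and the internal inductance is largely squeezed out.
for name, radius in (("10 AWG", 1.294e-3), ("24 AWG", 0.2555e-3)):
    print(f"{name}: radius / skin depth = {radius / delta:.1f}")
```

So at 20kHz a 10 gauge conductor is already well into the skin effect regime and has shed much of its internal inductance, while 24 gauge has barely started.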

At ten foot lengths, the actual measured loss with old generation zip-type Monster cable in 10 gauge is less than 0.1dB at 20kHz, and, perhaps more pertinent, the increase in 20kHz loss beyond the insertion (DC) loss is a tiny 0.02dB.  Even doubling these losses to account for my 2 ohm load at 20kHz (the Audioholics tests used a 4 ohm load) doesn't make them very important.  I believe standard 10 gauge zip cord would measure similarly to the old generation Monster.
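
Here's the same kind of rough sketch for the load question, again using my assumed 10 foot figures (0.02 ohms of loop resistance, 2uH of loop inductance) rather than the actual Monster measurements:

```python
import math

def loss_db(r, x, z_load):
    """dB loss of a series impedance r + jx into a resistive load."""
    return 20 * math.log10(abs(complex(z_load + r, x)) / z_load)

r = 0.020                           # assumed 10 ft loop resistance, ohms
x20k = 2 * math.pi * 20_000 * 2e-6  # reactance of an assumed 2 uH at 20 kHz
for z in (4.0, 2.0):
    print(f"{z:.0f} ohm load: DC {loss_db(r, 0, z):.3f} dB, "
          f"20 kHz {loss_db(r, x20k, z):.3f} dB")
```

Halving the load roughly doubles the total loss. The reactive portion actually grows a bit faster than the resistive portion, since for small losses the reactance enters as X^2/2Z^2 while the resistance enters as R/Z; either way the totals stay small at 10 feet.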

What does look very bad, as I had believed before, is the cultist idea of physically separating the conductors.  Any significant separation causes large high frequency losses from the increased inductance.

Here's another interesting set of measurements; see especially Figure 4, Series Impedance, which shows the impedance curves of many different cables.  The zip cords follow a familiar pattern: the heaviest zip cords have by far the lowest impedance at DC and low frequencies, but start curving upward in impedance earlier.  The "Fulton" cable tested is, as I recall, a very heavy zip cord, something like 1 gauge (I can't remember exactly), and it shows these tendencies clearly compared with the 18 gauge and 24 gauge zip.  Of all the zip cord types, the Fulton has the lowest impedance at every frequency despite starting to curve up earlier, mainly because it's curving up from a much lower starting point.

Thinking about this does make me wonder again.  In a certain sense absolute impedance matters, but change of impedance matters too, perhaps more in some cases.  It does seem wrong when, as with the Fulton cable, the cable reaches many times its DC resistance at a fairly low frequency and keeps rising from there, so that at 100kHz and above it's just barely better than 24 gauge zip.  But is that actually wrong?  I'm having trouble thinking about it...a huge change in cable impedance can have little effect--precisely because the impedance is so small in the first place, and essentially unimportant.  Do I care if the output of my speaker is 80dB or 80.1dB?  Not much; I can usually turn up the volume more if I want to.  But if the highs roll off by more than 0.1dB, I'd be concerned.  I'd like to see less than 0.3dB of rolloff in any case.

With even 12 gauge zip, it clearly doesn't matter: the hinge point is much higher, and there's not much change until 10kHz.
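
The hinge point is easy to estimate: the impedance curve starts bending where inductive reactance catches up with DC resistance, that is, at f = R/(2*pi*L).  A sketch using standard copper loop resistances and, as a simplification, the same assumed 0.2uH/ft loop inductance for every gauge:

```python
import math

L_PER_FT = 0.2e-6   # assumed loop inductance, H/ft, taken as gauge-independent
R_LOOP_PER_FT = {   # copper loop resistance (both conductors), ohms/ft
    "10 AWG": 0.0020,
    "12 AWG": 0.0032,
    "18 AWG": 0.0128,
    "24 AWG": 0.0513,
}

# Hinge frequency: where 2*pi*f*L equals R, so f = R / (2*pi*L).
# The per-foot values cancel, so this is independent of cable length.
for gauge, r in R_LOOP_PER_FT.items():
    f_hinge = r / (2 * math.pi * L_PER_FT)
    print(f"{gauge}: reactance overtakes resistance near {f_hinge / 1000:.1f} kHz")
```

The ordering matches the measured curves: the thicker the zip, the earlier it hinges, but always from a lower starting point.  The absolute frequencies depend entirely on my assumed inductance, so treat them as illustrative only.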

This may be the best way to think about it:  The cable impedance variation is important only as it relates to the speaker load impedance, not with regard to the cable impedance itself.   In the limiting case, if the cable had 0 ohms DC resistance, even the tiniest rise in impedance due to inductance would be an infinitely large change relative to 0.  But that doesn't matter; what matters is change in cable impedance that approaches a significant fraction of the load impedance.

Given a particular load impedance, say 4 ohms, what matters is the cable impedance that would cause a 0.1dB change.  A 0.1dB change is about a 1% change (1.158% to be more precise).  That means the cable impedance has to approach 1% of the load, or about 0.04 ohms, to be significant.  Changes in the cable impedance below 0.04 ohms are unimportant: even going from 1e-10 to 1e-3 ohms, a ten-millionfold increase, doesn't matter, because the result is still well below 1% of the load.
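
The arithmetic here is just the series voltage-divider dB formula inverted; a tiny helper (done without rounding, the 0.1dB threshold for a 4 ohm load comes out nearer 0.046 ohms than 0.04):

```python
def series_r_for_loss(loss_db, z_load=4.0):
    """Series resistance producing a given dB loss into a resistive load."""
    return z_load * (10 ** (loss_db / 20) - 1)

print(f"{series_r_for_loss(0.1):.4f} ohms")       # ~0.0463 for a 4 ohm load
print(f"{series_r_for_loss(0.1, 2.0):.4f} ohms")  # ~0.0232 for my 2 ohm load
```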

Now a thinner cable has more DC resistance, so it shifts the baseline for assessing this change, but that turns out to have little if any effect.  Suppose we had a cable whose DC resistance caused a 0.1dB loss at DC.  It still takes the same 0.1dB of additional loss at high frequencies to be important.  So the threshold in dB hasn't changed; but how much additional impedance does that require?

In our 4 ohm example, a DC loss of 0.1dB requires 0.04 ohms of resistance.  How much additional impedance is required to get us to 0.2dB of total loss?  It's going to be slightly more than 0.04 ohms but not much.  About 1% more, or something like 0.0404 ohms.
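
The same helper settles the baseline question numerically:

```python
def series_r_for_loss(loss_db, z_load=4.0):
    return z_load * (10 ** (loss_db / 20) - 1)

# Treat the added high-frequency impedance as if it simply adds to the
# DC resistance, as in the approximation above.
r_first = series_r_for_loss(0.1)           # budget for the first 0.1 dB
r_next = series_r_for_loss(0.2) - r_first  # extra impedance for the next 0.1 dB
print(f"first: {r_first:.4f} ohms, next: {r_next:.4f} ohms, "
      f"ratio: {r_next / r_first:.4f}")
```

So the second 0.1dB step costs only about 1.2% more impedance than the first; the baseline shift barely moves the goalposts.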

Now looking at the impedance curves of different gauges of zip cord, they seem to approach one another at frequencies above 20kHz, with the larger gauges always having the lower impedance.  My sense, then, is that the larger gauges are always going to be better, having more extended highs, even though they have less "baseline" margin as described in the previous paragraph.  The baseline shift doesn't buy as much as the added inductance losses of the thinner cables.  (I intuit this but am unable to prove it yet, and it might be close, or slightly reversed, at intermediate frequency ranges.)
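
Under the assumption that loop inductance really is about the same across gauges, the absolute claim is easy to check: give a thick and a thin 10 foot cable the same assumed 2uH and sweep frequency (the resistances are standard copper values):

```python
import math

def loss_db(r_loop, x, z_load=4.0):
    """dB loss of a series impedance r_loop + jx into a resistive load."""
    return 20 * math.log10(abs(complex(z_load + r_loop, x)) / z_load)

L_LOOP = 2e-6  # assumed loop inductance of a 10 ft run, same for both cables
for name, r in (("10 AWG", 0.020), ("18 AWG", 0.128)):
    losses = []
    for f in (0, 1_000, 10_000, 20_000, 100_000):
        x = 2 * math.pi * f * L_LOOP
        losses.append(f"{f // 1000:>3} kHz {loss_db(r, x):.3f} dB")
    print(f"{name}: " + ", ".join(losses))
```

With equal inductance the thicker cable has lower absolute loss at every frequency.  In relative terms (loss at 100kHz minus loss at DC) the two come out nearly identical, so whether the thick cable also wins on rolloff hinges on the thin zip really having equal or higher inductance, which is exactly what needs measuring.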
