Tuesday, May 23, 2017

Master 7 Triumphant!

The Audio GD Master 7 Singularity arrived on Monday morning, 5 days after being shipped by the factory in China.  It immediately vanquished all my fears and doubts.  Packed perfectly, handled perfectly, and smell free at every level of wrapping.  I don't recall selecting Air Freight, but if that's what this was, I'd do it again.

I let it rest for a few hours in the living room, then unpacked and set it up, then let it warm up with the power amp unpowered.  Upon unwrapping I immediately had to place it in its designated location atop the marble slab (where the Audio GD Dac 19 Anniversary Edition had been until minutes before) because there's no other free space large enough in my home to rest it on except the dirty carpet.  When you get the Grand Piano, you have to live around it.  This impressive looking unit is as large as I imagined, but not quite as heavy as I imagined, though certainly no lightweight.  It's large because doing things the right way from end to end takes a lot of well engineered circuits and parts.  To get tiny, you need to make compromises.

Also immediately, it was clear that this was not the kind of problem unit I had feared.  The hoped-for combination of greater sweetness with greater transparency was obvious, though to appreciate it fully took a few hours, listening to many things I hadn't been able to listen to for years, now revealed with such clarity and sweetness that listening to them became possible again.  There was never a sense of needing to "turn it off" even playing the unplayable, at very loud levels, and at the often more challenging very soft levels.

My audio perfectionist friend agreed that having the voltage set to exactly what's required to drive the power amp to peak power is best, rather than wasting resolution on useless voltages.  And it seems that Audio GD did exactly as I had ordered: to restore the optimal spectral balance I needed to raise the subwoofers and super tweeters by 1.5dB, matching the change from the 2.5V max of the old unit to the 3.0V of the new unit.
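
The arithmetic behind that 1.5dB, as a quick Python check (the 2.5V and 3.0V figures are the ones above):

    import math
    print(20 * math.log10(3.0 / 2.5))   # ~1.58 dB: the trim needed on the subs and super tweeters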

The slightly astringent quality of the old system with the cheaper DAC is now gone, replaced by endless depth and richness.  What I wasn't expecting was how all the imaging became MUCH more solid and correctly located between and behind the speakers.  That's the virtue of dual mono construction and fully balanced operation and connection!

I haven't done any technical tests, not even measuring the voltage, but I don't think I have to.  It couldn't be THIS good without having nearly unmeasurable distortion, state of the art resolution, and so on.  And it wouldn't have this level of transparency if they had slipped me the NOS version by accident.

Truth be told I don't know how much of the improvement comes simply from using balanced inputs on the Krell, in my system especially.  The Krell isn't powered through the same conditioner as everything else, and though I've never noticed a ground loop, at some level there must be current flow through shield grounds, and the effect of that is virtually eliminated by balanced connections, among their many other advantages.  Mind you, with some equipment in some configurations it is unnecessary and perhaps even suboptimal, but in my system, balanced drive of the power amplifier was a too long overlooked requirement.  (Actually, I tried balanced operation many years ago when the amp was being driven by the Behringer DCX crossover, but I feared, probably erroneously, that it was leading to the excess heating in one channel--the problem that was ultimately fixed by getting the full Capacitor Service from Krell.)

But I also know now that 1704's aren't being used to their full potential except in differential mode, and better yet parallel differential, as in all the big name DAC's from Levinson and others back in the day when they were still being built with the best PCM chip ever.  The potential of this chip is lost without differential operation.

So now, everything is being done right, and it sounds that way.

Sadly for the rest of you, this unit is part of a very limited production using the very last unused Burr Brown 1704's, which Audio GD scoured the world for (becoming, ultimately, the biggest name in audio to rely on them, after Lite Audio bailed a few years earlier on making 1704 balanced units).  In the future, and for replacements, it looks like 1704 lovers will be scavenging old salvage units for the super special unobtanium parts.  Eventually, possibly, even better hybrid R2R technology (such as used by MSB) will trickle down to non-stratospheric prices.  Audio GD is trying their hand in that game, but it might be a while before they can reach the same levels as they had with 1704's, let alone MSB.  Then again, I don't really know if even an MSB Platinum or whatever would sound better than the dual differential 1704's in my Master 7.

Anyway, it's hard to imagine something being even better than this, but that's what I thought about the Audio GD Dac 19.  But I've got what I need for now, I think.  (This was a kind of record setting audiophile purchase for me: I believe it's the most expensive audio component I've ever purchased brand new, and among things not so qualified only the Krell amplifier cost more.  So I hope it's enough for a while, the foreseeable future.)

I'm connecting the DAC output to the Krell amp with Nordost Baldur audio interconnects I bought awhile back at deep discount, actually for making the Oppo BDP-95 to Lavry AD10 connection, but I decided I preferred antique Denon players with unbalanced outputs so I haven't needed them since.  The Baldur cables are thin and extremely "fast", which generally means little stored energy.  I don't really care how technically fast the cable is--delaying my experience by a matter of picoseconds is of no consequence--but speed is a good indicator for stored energy, because stored energy, as such, is not so easily quantified.  Anyway, these are clearly well made high performance cables that are very transparent sounding, which is what I want.  I'm in no rush to find something better, and they turned out to be exactly the 18 inches required.

Compared to many audiophile cables, they are just good cables, nothing to equalize this way or the other, though you could argue the capacitance is a tad higher than necessary at 20pF/ft.  Belden 1800F does better, at 13pF/ft.  This is of little consequence for me...you can calculate the low pass for a total of 30pF for my cable vs 20pF for the Belden, into a 50k ohm load, with a 10 ohm source impedance.  The corner will be so high up in the MHz that you might think the higher capacitance not such a bad thing at all.

But there's another issue here, and that is oversimplification.  When they say about interconnects that capacitance is the only thing, they're wrong; it's the only thing that goes seriously wrong in ordinary cables.  In audiophile cables, many other things can go wrong.  But possibly, if nothing is done seriously wrong, additional things can be done more right, the things of the highest consequence of all, and that is smearing or not smearing information.  At that level, dielectric absorption is occurring through complex geometries at every point along the cable, interacting with the complex geometries of the magnetic and electric fields.  This may or may not cause excess stored energy at various frequencies.  You could argue these stored energy effects are at such high frequencies I shouldn't care; I should care about capacitance that rolls off at 10kHz.  But when the capacitance isn't going to cause any rolloff until well up in the MHz anyway, why not arrange the electric and magnetic fields so as not to store energy, and therefore not smear transients in any way?  Since the smearing can be a nonlinear thing, it can ultimately induce the possibility of something being heard differently, as when some tiny threshold is met either synchronously or not with some other.
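
If you want to check the low pass claim, here's the first-order RC arithmetic as a Python sketch (the 10 ohm source, 50k load, 18 inch length, and pF/ft figures are the ones quoted above; the 50k load barely matters since the tiny source impedance sets the corner):

    import math

    def rc_corner_hz(r_source_ohms, c_total_farads):
        # First-order low-pass corner: f = 1 / (2 * pi * R * C)
        return 1.0 / (2.0 * math.pi * r_source_ohms * c_total_farads)

    feet = 1.5  # the 18 inch run
    for name, pf_per_ft in [("Nordost Baldur", 20.0), ("Belden 1800F", 13.0)]:
        c_farads = pf_per_ft * feet * 1e-12
        print(f"{name}: corner ~ {rc_corner_hz(10.0, c_farads) / 1e6:.0f} MHz")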

The input connection is currently being made through the SPDIF input using two cables and a converter box, the latter being a HOSA which converts AES balanced to SPDIF, and the former being standard Canare and Belden cables.  This is exactly how I connected the SPDIF input of the Dac 19, which had no AES input.  I've just ordered two appropriate-looking 5 ft AES cables now that I have an AES-input DAC: an Audioquest Cinnamon, and a Geistnote Canare.  The Cinnamon appeals to me because of its solid core conductors, and everything is silver plated, even the braided shielding.  Silver plating is absolutely what you need for best ultrasonic signal transmission, and AES is critically reliant on that.  Professionals may be more interested in the ability of stranded wires to resist stress fatigue.  The Geistnote is a souped up Canare (I got the extra special connectors too) and I like the extra covering, which makes the wire run straighter, which I think is preferable.  I think the Geistnote will be my backup cable, but we'll see.

I looked at all the pro cables and none used silver plating or tinning to protect the copper surfaces.  So, just at that level, I figured the universally available Canare is basically as good as Gotham, Mogami, or Belden.  As it turns out, I've most often used Canare AES, though my most recent special purchases were Mogami Gold and Geistnote, and I liked the latter the best.  Among the high end cables, Cinnamon is the least expensive with the key features, including silver plated solid conductors and shielding.  It's far less expensive than anything else with the key features, and I have seen no other features worth buying.  I trust Cardas construction most of all, but Cardas used to make a 'plain' AES and now only seems to sell the high priced Clear.

AES has gotten a bad rap because of jitter.  Nowadays many will argue even coax SPDIF is better.  Coax has a peerless ability to transmit high frequencies...hence coax has always been used in radio.  The 110 ohm AES cable is not quite as good as coax.

But, and it's a big but, in a complex system there may be more significant concerns.  Even the tiniest of ground current flows can upset detecting the moment of signal transitions in an unbalanced low voltage signal.  So AES is probably better if you are doing more than just one digital connection.  That is what it was engineered for, and, as it turns out, kinda what I am doing.  AES works well for me with my pro-audio DSP's, sampler, and DACs.  And done rightly, the differences in jitter are not going to be a big deal compared to "the ultimate" I2S, which I can't imagine when, if ever, I'll be using.









Saturday, May 20, 2017

10 analog inputs!

My living room equipment rack now has 10 analog inputs, after my having piggybacked the 5-input dB 5-way selector (the upgraded version, with Teflon jacks) on top of the main selector I have been using, the Aragon 24k, recently modified for zero load and zero pass resistance through the tape outputs which I use.  The Aragon itself has 6 inputs, but I must use one to connect the second switch.

Ahh, preamps have never had enough inputs...and for some time now the "preamp" itself has been obsolete or something, though the impressive preamps nevertheless still being built reach levels unimaginable in the previous, less-high high end audio.

Having lots of analog inputs is freedom, just like having analog inputs at all.  It is freedom I cherish.  Digital connections come with policies, DRM, and hidden stuff like jitter and garbage.  While somewhat jitter prone, SPDIF and AES are at least free and open and I like them, but Analog is even better, Analog is free and pure.  Even when generated by digital devices like DVD-Audio players.

Anyway this finally enabled me to make a "permanent" connection of my two tuners to the input hub where the selected analog input gets converted to 24/96 digital by an amazing Lavry AD10.  Everything needs to either be digital or converted to digital to be used in my system, which is based on mind boggling DSP.

Somehow, I don't know why, but it seems like liberating the analog signal from various disc players and reconverting that to 24/96 digital sounds better than using the raw digital, in every test I've ever run, and in complete contradiction of reason(?) and the established practice of virtually everyone else.

Anyway giving the tuners this super high quality AD conversion, rather than the pedestrian 16/44 conversion through Sonos, clearly wakes them up; they are sounding fantastically better.  And now I notice that the Jazz station I've been supporting for years has finally cleaned up the analog distortion they were adding to their signal in large amounts.  I'm loving FM now, more than ever, through the 24/96 conversion.  I made numerous previous attempts to do this, btw, which ended in failure, back in earlier times, before my rack of last year, when it was hard to get behind stuff to make even the most basic connections.  (And, btw, a preamp or selector switch amalgamation NEEDS a shelf all its own, so the "permanent" connections can be constantly revised, as it seems they often are.)  Back then, earlier attempts resulted in motorboating sound from the tuner tuned to the classical station, and just plain distorted (even more) sound from the equally superior tuner tuned to the Jazz station (each connected to the better antenna for it).  But now, apparently, cleaner wiring and the ability to reach behind the tuners and properly adjust their variable outputs are making this work, finally.

I need to have whole house audio based on 24/96, with unlimited analog inputs, and access to libraries and streams.  (Always, all commercial products put digital libraries and streams ahead of analog; at least Sonos grudgingly provides analog inputs, which is why I chose them and have stuck with them so far.  But it's been clear for a decade that I need to make my own no-compromises system to do this, because nobody else does.)

And yes, I think a decent selector or preamp should have 10 unbalanced analog inputs, and at least 5 balanced inputs, balanced output, and simple switched polarity inversion taking advantage of the balanced output (and assuming the destination device has real balanced input).

There's never enough.  It's always the unconnected things you need to get connected, and the "reference" connected devices you've grown tired of.




Thursday, May 18, 2017

Measurements of a Singularity 19 DAC

These measurements do not look good at all.

When I get my Audio GD Master 7 Singularity, I will be measuring it!

(Previously, I didn't think I needed to, after all, my DAC 19 is perfect, or so it has seemed...)

I'm glad I discovered Super Best Audio Friends also...they're doing a lot of what I hoped to be doing: measuring and publishing.  (It seems more often I do one or the other but not both for some reason.)




Linear vs Minimum Phase

Here's a discussion about linear and minimum phase filters for digital to analog reconstruction.

The admin Ultrabike argues for linear phase...it produces the least phase distortion in the passband.  He also argues away the obsession about pre-ringing as nonsense.  I think he makes a pretty good case.  The steepest and most linear phase filter is the most accurate one and the most transparent sounding (described by some as sounding hard, cold, "digital" or whatever).

Of course merely specifying linear phase says nothing about the particular rate or frequency involved.  Linear phase filters can be made to sound relatively warm just like minimum phase or NOS by making them slow.

This is strange, but apparently the more ultrasonics there are, the warmer things sound.  This "warmth" is mostly pseudorandom phase distortion, and that would be no surprise to electronic musicians.

My forthcoming Audio GD Master 7 Singularity Dac will have several Oversampling (OS) options, 8x, 4x, 2x, and (non-OS) NOS, combined with the NOS features of the NOS series of DAC's, which seem to include at least two different NOS flavors.  Previously these options were not available in the same DAC unit, but the latest firmware combined with recent hardware upgrades allows them to be.

With all this attention to NOS, I've been a little afraid the designer wasn't focusing enough on the OS designs, but at least he says the OS hasn't changed from the previous OS version(s) of the Master 7, which have been highly praised, so it won't be a step backwards.

From what I've seen above, I'm now not so worried that he doesn't seem to include minimum phase or "apodizing" options in the OS.  Plain old linear phase at 8x OS looks to be as good as anything.

I still wonder if Denon wasn't doing something special also with their AL24, which would appear to be some kind of upsampling, and how that relates to the above.  I'd always thought CD's sounded nice through Denon AL24 machines, even the ones using sigma delta conversion like the DVD-5900, and that was the first player in which I experienced superior sound from resampling the analog outputs rather than passing the digital directly to my DSPs.



Sunday, May 14, 2017

HPM 100

My second unit, not plagued by bad drivers, sounds ok.  My first had two bad drivers: the NOS super tweeter I found was perfect, but my replacement ebay midrange was worse than nothing, with a scratchy sound (which you can feel, moving the driver by hand).  The availability of even decent looking HPM 100 midranges being something like nil, I decided to try an HPM 700 midrange, said to be physically compatible with the hole anyway, and possibly improved or better.

Many tricks here to get a loud playing speaker.  The first, as always, is not to actually play loud, such as in the bass below 40Hz, where by 33Hz I'm not even sure there is "useable" response--it's there but rolled off that far down.  Well, this bass rolloff means that you don't have to worry about the mega excursions that would otherwise be required to reproduce 30 Hz at 110dB or whatever, let alone 20 Hz.  Actually, even by the tuning point around 40 Hz there's already quite a bit of rolloff, and around there it isn't the speaker cone moving anymore, it's the port.
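
To put rough numbers on those mega excursions, here's a sketch of the standard half-space piston math in Python (the ~12 inch cone area and the 110dB-at-1m target are my assumed illustration values, not measured HPM 100 specs):

    import math

    RHO = 1.18          # air density, kg/m^3
    P_REF = 20e-6       # 0 dB SPL reference, Pa

    def peak_excursion_m(spl_db, f_hz, sd_m2, r_m=1.0):
        # Half-space monopole: p_rms = RHO * (2*pi*f)^2 * Sd * x_peak / (2*pi*r*sqrt(2))
        p_rms = P_REF * 10 ** (spl_db / 20)
        return p_rms * math.sqrt(2) * 2 * math.pi * r_m / (RHO * (2 * math.pi * f_hz) ** 2 * sd_m2)

    sd = 0.05  # roughly a 12 inch woofer's cone area, m^2
    for f_hz in (20, 30, 40):
        x_mm = peak_excursion_m(110, f_hz, sd) * 1000
        print(f"{f_hz} Hz at 110 dB/1m: ~{x_mm:.0f} mm peak excursion")

Roughly 60mm at 20Hz, 27mm at 30Hz, 15mm at 40Hz: exactly why the rolloff (and the port taking over) saves the woofer.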

A modern paper purist would argue plain old paper cones, with their natural damping, would be best of all.  True, perhaps, but it wouldn't have that 'monitor' sound.

But I think maybe the midrange could be rethunk.  Certainly on the first HPM 100 it attempted to cover too wide a range, leading to lots of burnout.  The A version limited the range to a narrow band in the midband; that's actually not such a bad solution, since the woofer can support higher frequencies, just not quite high enough to cross over to the tweeter.  A modern high efficiency driver might do the improved narrow band even better.

I'll leave my good unit as reference and the other as testbed for new drivers and crossover ideas.  I like the narrow midrange crossover idea actually (contrary to many).




Too Loud

Even the tiniest step above "real" is "too loud."  On first brush, it might seem that as things get louder they get more and more immersive, you hear more and more.  But the opposite is true: beyond a point, a point even exceeded periodically in live classical music, the auditory system begins to close down, like an armadillo, and one can actually discern less and less, only the stress of loudness.  Which can feel bracing, I suppose, to some.  But it's a low information thing, one big bit.

I first sensed this phenomenon hearing a demo of the Linkwitz Orion system on some fairly pedestrian electronics (22 series Marantz) playing MP3.  It was about 10 dB elevated compared with my loudest listening, 90-100dBC instead of 80-90dBC.  I sensed a loss of information specifically from the high loudness.  Of course the source material probably didn't help.  I wasn't made to feel my going electrostatic was a waste either, though the Linkwitz were clearly top shelf for dynamic speakers.  (Let's say the HPM-100 I bought very cheaply are something altogether different--a chorus of wheezes, rattles, squeaks, and buzzes which somehow combine to form something vaguely carrying the tune of the original.  Update: my second unit has good drivers and sounds ok, and it plays loud, but I have mixed feelings except for my intended purpose as garage speaker.  Anyway, there's a strange magic in how these can keep sounding better when played louder, beyond the point where that stops happening in other speakers: as they play louder, the stiff drivers become more dynamically responsive to all the frequencies impressed on them magnetically, a kind of "opening up" that invites cranking up.)

But I'm getting the same sense (and more so) from the HPM 100, in their present evolving condition (quite a bit more wheezing and buzzing than normal, I think, due to somewhat shot drivers if nothing else).  Even ignoring any of the design or maintenance problems of this speaker, which are multifold, I can tell now that there is a too loud, and actually it's not that loud, but more like where I have generally been gravitating to, even with my unwarranted fear of digital gain.

Symphonic music occasionally throws in a sustained section of too loud, perhaps topped by an even louder boom from the drums.  That can be in the 90-100dB range with the final boom hitting 106dB or more.

But it's not always like that, not even mostly.  The median level is more like 82dB.  That's where we get the depth, sweetness, and lyricality.  The loudness elevated sections are there to give the stress and passion, or Sturm und Drang as they say.

It's true, some kinds of music--rock n roll and derivative and similar popular music--are all about the stress and passion.  There is no sweetness.

But since we're making it up anyway: though you could argue a rock concert is "real," it isn't actually any realer than a home or studio performance.  A rock concert is a temporary drunk; listening at home is life.  So it seems to me even the all-passion-all-the-time music should remain mostly below 100dB.  And I think I'll be happy in the higher information retrieval range of 75-90dB even for Rock.

That's my 5 min. assessment.  And I was going to spend years investigating this.




How I learned to stop worrying and love digital gain

Sometimes, digital sources need to have their volume raised.  This may be less true if you simply have a transport sending digital to a DAC.  But once you have fancy multi-way and multiroom systems combining equipment from many rooms, and you are sampling the analog outputs of locked (and now unlocked) formats from classic players (I don't know why, but resampling the analog outputs to 24/96 always sounds better than taking the digital stream off the device, even though it goes into a digital connection either way), you often need to do a little boosting, sometimes just to restore original recorded levels, or, especially in the case of home made recordings from analog sources, to exceed them a little bit.

But once I begin cranking the digital volume control above unity gain, 0dB, which on the Tact is 93.9, I get a little worried that digital clipping might occur.  This would occur if the digital signal sent from the Tact onward to the DSP processors reaches the clipping point, which is 0dB.  Beyond that point, I didn't know what was going to happen.  Perhaps dropouts from a validity bit being set.  These dropouts could be far more annoying than typical analog clipping, even on a solid state amplifier (and the "sound" of solid state clipping has been rarely heard or correctly described...in a well designed Class AB amplifier it will not be spitting or overly harsh...just a growing soft harshness).  Anyway, not knowing the effect of digital clipping, I feared it might even be some terrible out of band spiking, for example, of the kind that nearly fried my ribbon tweeters when they were connected to a 250W amp (that was actually using the 10V RMS analog outputs of a DCX crossover, but the reason why it was outputting so much in the first place I don't remember, and I feared it was in the digital domain).

But anyway, I didn't really know, because I had never tested it well.  And I had another related issue.  Some months ago I discovered Sonos was boosting volume levels so that "max" now had +8dB digital gain, causing premature clipping.  Perhaps this would not affect people using the analog outputs or something, I don't know, but it seemed a misfeature to me.  I took the precaution thenceforth of keeping all Sonos levels at -8dB, which seemed to be the new unity gain.  Even the fixed output setting seemed to have boost.  Well, some time ago I began to doubt it was still like this...volume levels were seeming softer than before.  I needed to retest.

I'm happy to say that as of today, my Sonos system passes closest to unity gain turned all the way up, and on the fixed setting.  It doesn't add extra boost, just the real deal...well, almost.  It seems, according to my test, that a 0dBFS signal is reduced to -1.3dB.  I'm not sure why that is, but 1.3dB of lost dynamic range is not a huge concern, just a minor gripe really; if I could have my pony I'd have full bit transparency, which I thought Sonos used to have back in the day.  I could be wrong, but multiple tests showed the -1.3dB loss with the Sonos "fixed" setting (I've set several back to fixed now, no need to keep fiddling to find the most transparent point) on my Behringer DEQ meter, and I needed to crank the Tact up to 1.3dB to reach the highest unclipped point (clipping starting at +1.4dB, but barely visible) at the output of my DAC, which I believe to be faithfully reproducing what it receives from the DEQ, comparing that with the Triumph analysis of the WAV file.

Anyway, boosted back to normal and fixed, Sonos is now sounding good again.  That's at least two major advances in the past few days since my return from vacation, the first being un-attenuating the Sony 9000ES.

Cranking up the Tact beyond the transposed clipping point at +1.3dB, it simply clipped the voltage rise at that point, flat, pretty much like a transistor amp, with no artifacts at all.  And that continued as I drove it into "hard" clipping, it just made the wave slopes steeper, meeting a flat top at the same point, as I had speculated earlier, boosting the RMS voltage but not actually raising the peak voltage.
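
That behavior is easy to reproduce offline with a quick numpy sketch (a simulation, not a measurement of the Tact itself): hard clip a sine at full scale and watch the RMS rise while the peak stays pinned.

    import numpy as np

    t = np.linspace(0, 1, 48000, endpoint=False)
    sine = np.sin(2 * np.pi * 100 * t)   # 100 Hz test tone, 1 second at 48k

    for gain_db in (0, 1.4, 6, 12):
        x = np.clip(sine * 10 ** (gain_db / 20), -1.0, 1.0)  # hard clip at 0dBFS
        rms_db = 20 * np.log10(np.sqrt(np.mean(x ** 2)))
        print(f"+{gain_db:4.1f}dB in: peak {x.max():.3f}, RMS {rms_db:+.2f} dBFS")

The peak column never exceeds 1.000 while the RMS climbs from -3dB toward 0dB as the wave squares off.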

Through any of my DEQ units, the clipping would show the effect of the EQ on the waveform as it clipped, looking a bit more curvy for example, but still not too bad at all.

Now, sending an unclipped signal at 0dB into a DEQ unit and then attempting to clip it within the DSP program had a different effect.  Clipping through the DEQ resulted in a compressed output that showed no sign of clipping, just no increase in size.  I'm a bit worried about this; perhaps on real music it would cause peculiar effects more annoying than the slight harshness of peak clipping.  Or perhaps not.  Either way it didn't look any more scary, and probably less scary, than analog clipping.

So you can argue that cranking up the digital volume should not be much necessary; better to have the source material reaching peak 0dB or close to it to preserve resolution.  But when you need to crank up the volume, doing so with digital gain isn't any more dangerous than doing it in the analog domain, and is probably safer, with regard to level changes that may later cause clipping.

This adds to the case for mapping the output of a DAC driving the amplifier to exactly the output required to drive that amplifier to clipping, or slightly more, rather than 6dB more or so.  The added effects of digital and analog clipping around the same point should not be bad, if very rarely reached.  As I have previously argued, this preserves the dynamic range and resolution better at every level.

Similarly, I've now concluded that adding boost in the DEQ's, all of the DEQ's, not just to compensate but to create more digital gain, is not a good idea, as it shortens the actual digital mapping space just like having a DAC with too much output.  The best gain setting on any digital device is 0dB or as close to it as possible, to preserve dynamic range and resolution.

Pictures later.



Saturday, May 13, 2017

Playing Loud Enough: The DAC story

Immersed in an overwhelming sea of auditory information, I spin tales about my "force of nature" audio system, then whip out my phone to measure it, and it's showing only 65dB. What???  "Loud" should be 100dB, or at least 90.

Firstly, I must always remember to set the meter for "C" weighting when measuring loud.  When the level is cranked up, we don't need the audibility curves of the "A" weighting.  When set to C weighting, my measurement goes up to 75dB.  OK, when the ambient noise is as low as it gets at my house at night (around 10dBA), 75dB average music power can be enveloping.

Second, I can easily crank the level up to 85dBC peaks.  Beyond that, I sometimes have to use the digital gain of my digital preamp, which increases the digital signal above what the preamp is given (just by multiplying the numbers by some factor and dithering...).
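
In sketch form, that's all a digital volume control is (my own illustration of the technique, not the Tact's actual code): multiply, add a little TPDF dither, and round back to the output word length.

    import numpy as np

    def digital_gain_24bit(samples_int24, gain_db):
        # Apply gain to 24-bit integer samples with TPDF dither, clipping at full scale.
        full_scale = 2 ** 23 - 1
        scaled = samples_int24.astype(np.float64) * 10 ** (gain_db / 20)
        tpdf = (np.random.uniform(-0.5, 0.5, scaled.shape)
                + np.random.uniform(-0.5, 0.5, scaled.shape))  # triangular PDF, +/- 1 LSB
        return np.clip(np.round(scaled + tpdf), -full_scale - 1, full_scale).astype(np.int32)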

Because I fear digital clipping worse than analog clipping, which I probably shouldn't, I have been scared to use digital gain (which might cause clipping later when the music gets louder) more than excess analog gain which many systems naturally have.

The way I stream (through Sonos, which has digital volume control, which has sometimes been turned down slightly) and digitally resample all other sources means that peak levels in the digital signals reaching my preamp are usually well below 0dB even at peak levels, so they can harmlessly be boosted back up, as long as it isn't done too far.

But also, hidden digital volume controls should generally be set to unity gain (which is often, but not always, maximum) and maybe I have to adjust my resampling.  I've been studying my resampling for the past few weeks.  It seems I can set the Lavry AD10 to maximum gain on all normal CD sources, played on the Denon DVD-9000, the Onkyo RDV-1, or the Sony 9000ES, without clipping the Lavry.  In fact it seems to leave about 4dB or so of additional headroom above the nominal 2V outputs (I haven't actually measured them, but digital sources described as "2V" typically put out 2.15 or even 2.2V).  When I'm playing HDCD's on the Denon, I need to make the Lavry less sensitive to allow for a 6dB higher level.  I've taken to the idea of adjusting the Lavry to max gain when that is permissible, both to get the highest level and the best sampling (some earlier ad hoc findings suggested it was better set to somewhat lower gain for greater headroom, but I now know all my players already allow about 4dB headroom at max gain when only playing CD's).  It was while measuring CD output that I discovered the 9000ES was 10dB down, which couldn't be right, and then I discovered the Audio Menu in which the Audio Att had been turned on.  I wasted no time in turning it off (which is the recommended and supposedly default position).  This has radically changed my ideas about the 9000ES as a CD player: it's a wimp no more.

I also got to thinking about how playing through a DAC reduces my possible level.  My Audio GD Dac 19 puts out 2.5V, which is more than the usual 2V for a single ended DAC, but that still isn't high enough to drive the Krell FPB 300 to maximum power, even though the rated sensitivity of the Krell is 2.35V (for 300W output).

This week I finally sat down to figure out how much DAC output would be sufficient.  Basically you want to reach the "peak continuous" power of the amplifier, or in other words the peak power specified as if it were continuous (RMS).  Handily Krell specifies this for the FPB amplifiers as "RMS Output Voltage," which is 60V for the FPB 300.  That corresponds to an RMS power of 450W (60*60/8.0) into 8 ohms.  This has come up in reviews as the clipping power of the Krell, Martin Colloms measured 470W into 7.5 ohms (slightly higher because it's 7.5 ohms and not 8).
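
Checking that arithmetic in Python (the 60V spec and the Colloms 470W into 7.5 ohms figures are the ones above):

    v_rms = 60.0
    print(v_rms ** 2 / 8.0)      # 450.0 W into 8 ohms
    print((470 * 7.5) ** 0.5)    # Colloms' 470 W into 7.5 ohms implies ~59.4 V, right on spec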

There are several reasons why the clipping power is greater than the rated power.  For one, the Krell specification is conservative, all new and older units should be able to reach the rated power with some margin.  For another, the rated power is at a specified low value of distortion, "clipping" is sometimes (e.g. Stereophile) defined as the onset of 1% THD.  And for another, the unit might not be able to sustain the clipping power for long because of thermal or power supply considerations.

In fact it is only because the Krell has a regulated power supply that the Peak Voltage and Clipping Power line up so nicely.  With unregulated power, the peak voltage might be considerably higher than the continuous voltage for a cycle or two.  For unregulated designs you might not be able to measure the peak continuous power at all; you would measure the actual peak and divide by 1.41, or you might take the rail voltage and subtract minimum junction voltage drops.

Anyway, when a DAC or preamplifier is driving a power amplifier, if reaching maximum loudness is desired, you must provide the input necessary to reach the "RMS Output Voltage" and not merely the rated power.  This specification, which might be called maximum effective input voltage, can be computed in various ways, such as by the ratio of "rated power" and "peak continuous power" and the rated input sensitivity:

Sr(V) * sqrt( Pcp(W) / Pr(W) )

2.35 * sqrt ( 450 / 300 )

2.87 V

So *that* is the voltage the preamp must produce to drive the power amplifier to its actual maximum.  Anything less is robbing me of potential peak power, the ultimate peaks which are the stuff of music.
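
The same formula as a reusable sketch, with the FPB 300 numbers plugged in:

    import math

    def required_drive_v(sensitivity_v, rated_w, peak_continuous_w):
        # Scale the rated input sensitivity by the voltage ratio sqrt(Pcp/Pr)
        # to find the input that reaches the amp's actual clipping point.
        return sensitivity_v * math.sqrt(peak_continuous_w / rated_w)

    print(required_drive_v(2.35, 300, 450))   # Krell FPB 300: ~2.87 V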

The 2.5 Volt output of my Audio GD Dac 19 comes close, but not quite.

Meanwhile, there is also too much of this driving voltage thing.  Especially with a DAC: if the DAC voltage ranges from 0-5V and the amplifier clips by 3V, the last 2V, which represents 40% of the coding or mapping space, is lost to technically useless headroom.  So the resolution in the lower 60% is now diluted by exactly that much.  Another way of looking at this is in the dynamic range.  If the top 4dB or so is wasted, that means the relative ultimate noise level is now increased by that much.  And this doesn't only matter when you're playing at a low level; the resolution is lost at every volume and appears as added noise (so low it's never noticed directly).
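
The cost of the wasted span, in numbers (a sketch using the 5V-DAC-into-3V-amp example above):

    import math
    dac_max, amp_max = 5.0, 3.0
    wasted_fraction = 1 - amp_max / dac_max            # 40% of the codes never used
    wasted_db = 20 * math.log10(dac_max / amp_max)     # ~4.4 dB of dynamic range lost
    print(f"{wasted_fraction:.0%} of coding space unused, {wasted_db:.1f} dB given up")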

You could argue this dilution always occurs, even when you are simply digitally attenuating.  As it turns out, both analog and digital attenuators lose resolution, and about by the same degree.

Anyway, when the noise level is -120dB, and you're raising it to -116, well, you'd hardly notice.  Believe me, I used resolution reducing (and noise increasing) digital attenuation destructively in my early years of digital EQ, causing far more loss than this, and didn't notice until I stopped to think about it.

I think most people with hybrid analog/digital systems probably don't think about it enough.  Surely I haven't at times.  Now you might say I'm over obsessed about it.

Anyway, I thought very much about it this week.  Thinking the desired voltage to be about 3V, my mere 2.5V was missing the mark by a crucial 1.5dB, the difference between, say, 340W and 450W, the ultimate peak power of the amplifier.  I pay big bucks for an amplifier with all that peak power, the stuff of music, and then just throw a bunch of it away.
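
Where those 340W and 450W figures come from (a sketch, assuming output power scales as the square of input voltage up to clipping):

    rated_w, sensitivity_v = 300.0, 2.35   # Krell FPB 300 ratings
    for v in (2.5, 2.87):
        print(f"{v}V drive -> ~{rated_w * (v / sensitivity_v) ** 2:.0f} W")   # ~340 W, ~450 W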

Got to fix this problem, I said.  I could get a quality preamp to follow the output of the Audio GD DAC 19 and boost it to 3V RMS.  I could get a used KRC, for example, and have it drive the Krell balanced from the single ended output of the DAC.  That's about the minimum quality of device I would consider.

Now this might be worth considering in a 1-way system.  Then I could use analog volume instead of digital volume.  Arguably analog volume is better, though digital volume has lower noise, typically -144dB, since digital volume is done on 24 bit numbers (using even higher bit math with dithering).

But in a multiway system like mine, the remote volume would be useless.  I would merely set the preamp to the required gain of 1.5dB and leave it there.

It took barely a moment's thought to feel how monstrous it would be to add all this complexity just to get 1.5dB more output.  I'm not that big on simplicity, but this seemed monstrous.  Alternatively, I could build a canonical OPA 211 based buffer, but the work is still unimaginable to me, making boards and doing the fine soldering, and it would all have to be to the highest standards to be good enough, including the separate power supply.

This crystallized with something else, and the next thing I knew I'd done something barely thinkable: I'd ordered a multi kilobuck DAC.

The other thing was that I was testing the output levels of all my disc players through the Lavry.  The best way to measure the output is by reading the "Peak" level meters of the downstream midrange DEQ, showing the input digital level.  I found one of the highest peak levels I'd ever seen playing the Reference Recordings disc RR-82, with Mephisto and others, on the Denon DVD-9000 with HDCD.  Track 9 has the highest level seen.  I had to set the level on the Lavry to -9dB (the level for CD's is the max gain level, -13dB, which is 4dB more sensitive).  This ultimately left me with 0.3dB headroom, as determined by subtracting the -2.0dB digital gain on the Tact (Tact level 91.8) from the -2.3dB peak reading in the Right channel (the other channel read -2.7dB).

Playing the disc on the Onkyo, which has no HDCD decoding, yielded -7.1dB and -6.9dB peaks, back to standard CD level, and even then the Onkyo plays about a dB lower than the Denon.

In further testing, the un-attenuated Sony 9000ES appears to have the identical output levels, within 0.1dB, on regular CD's as the Denon.  The Onkyo has about 1dB less output.

In playing several CD tracks on all players in a row for the peak level tests, I quickly had a favorite: the Denon DVD-9000.  In its new un-attenuated form, the Sony 9000ES is #2, and the Onkyo--my heretofore standard on all the things it plays--ranked 3rd.  Of course I made no attempt to equalize the 1dB lower volume from the Onkyo, so this "finding" is as suspect as anything, let alone the fact that most listening was done from the floor as I was reading the peak levels.

Anyway, I concluded, combined with previous knowledge, that differential 1704's are far better than single ended 1704's for digital conversion.  The Denon has differential 1704's, and that may help make for the smoother sound, silky smooth, compared to the fine stainless steel granularity of the Onkyo.  I know that the Burr Brown PCM 1704's are not perfect, though they are the best ever implementation of 24 bits of R2R non-feedback PCM conversion.  They are not the most glitch free; the never ending popularity of the original digital chips, the Philips TDA1541A, stems in large part from their relative absence of glitches.  Anyway, the 1704's shine in differential and dual differential implementations, where the glitching mostly cancels out.  That is seen in all the big league units, not just my rarity DVD-9000 but in the best Levinsons like the 360S, the 30.6, and so on.  I looked at those briefly; the more common 30.5 uses the earlier 1702 chips, which some preferred but are only 20 bit chips.  I'd consider those.

Well this brought me back to how much I would prefer to have the top Audio GD DAC, the Master 7.  It has the ultimately perfected dual differential 1704 implementation.

Designer Kingwa of Audio GD is one of the great audio designers, up there with Curl and Pass.  Sure, he's a self promoter, a cult of personality, a huckster, but they all are.  Like the good ones, though, he seems to do the hard work.  He uses good stuff, and puts it together to achieve the very best results, with maniacal attention to details and the perfection of each part of the circuitry as well as how they all work together.

Because of how he attends to the details of how well his circuits actually work, rather than just slopping something together that barely works (such products used to be legion), he gets to the maximum linearity achievable, so his latest Master 7 DAC is rated at 0.00005% distortion, and he shows the spectrum.  At first I was put off by the obvious -110dB peaks, but -110dB works out to about 0.0003%, still vanishingly small in absolute terms.  And he does this with no feedback, no capacitors, direct coupled discrete FETs, and possibly more regulated power supplies than Levinson, at a somewhat attainable (gasp) cost.
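
(For the record, the dB-to-percent conversion is just the amplitude ratio times 100:)

    for db in (-110, -126):
        print(f"{db} dB = {10 ** (db / 20) * 100:.6f}%")
    # -110dB ~ 0.000316%; the 0.00005% spec corresponds to about -126dB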

By comparison, the PS Audio DirectStream DAC (which might have some merit...I currently hold the idea that true one bit conversion, highly oversampled, has better sonics than delta sigma) is only specified at 0.03% distortion.  Sad to say, that looks like lazy engineering (though it could also be very conservative specsmanship, in a day and age when subjectophiles believe the higher the distortion the better).  And we're up to the big bucks, paupers need not apply, even if not close to the Swiss or German prices.

The only other possible choice for me was the Schiit Yggdrasil, which uses modern PCM DAC's in a proprietary "closed loop" solution--ultimate performance through advance calculation rather than sigma delta feedback.  That has always appealed to me, and it may be actually better or worse, I don't know, but Stereophile measured something that looked like digital-related distortion and was unwilling to specify the "effective bits," and the AD chips *are* only 21 bits, not the 24 bits of the legendary 1704.  There's a long argument here, but I believe if the "bit loss" is merely thermal, or sufficiently random, or whatever, it doesn't count as badly as not trying to approximate the bits at all.  This is related to my view of noise and resolution as being somewhat different things, ultimately.  So it counts to go after the 24 bit resolution even if you don't achieve it.  So I've had a strong leaning toward the actual 1704 products like the historic Levinson DACs.

But putting all these considerations together, the solution was clear for once.  I should finally order the Master 7, which I'd long desired, and have it modified for a 0-3V range instead of the 0-5V of normal production.  If I could even still get a Master 7.  It has almost seemed every year Kingwa says there will be no more, and I believed him years ago.

This had become a much more important track to future progress than buying a mega expensive preamp, which had been one of my hopes for this year: something like a Levinson 326S, or a 32, or, just for starters, a Krell KRC 3, to boost the DAC output, or boost the CD player inputs (this seemed more crucial before I turned off the attenuator on the 9000ES).  I'd be getting the much improved dual differential 1704 configuration, and true balanced outputs, which work best with my Krell FPB: it has true balanced circuitry throughout and benefits circuit-wise from balanced connection, above and beyond the benefits of balance in lower random noise induction and AC chassis ground differences--a not inconsiderable consideration since the Krell is plugged straight into the wall whereas the DAC is plugged into a special strip plugged into the power conditioner.  Balanced connection means the ground differences virtually don't matter.  It's a wonder things have been working so well unbalanced!  I shoulda swung for balanced in the first place.  (Though, for what it matters, when I got the Audio GD DAC 19 I was using the Aragon amp, which doesn't have a true balanced input, only a virtual balanced adapter to a single ended input.  That doesn't really gain an advantage from balanced input in any way.)

I checked the website, and saw it's now the new Master 7 Singularity, which Kingwa mightily emphasizes is the ultimate end of the 1704's: the last 100 units can be made and that is it.  It's supposed to be even better than all the previous Master 7's, especially in the new digital input section, but I'm not sure when or if I'm ever going to be able to use I2S, and likewise even USB.  An AES connection, my first choice requirement (I was using an AES to coax converter with my DAC 19), is a $15 extra.

I sent off the email, requesting my desired output voltage.  I doubted this was possible; thinking the Master 7's were all already made by now if not spoken for, there might not be any such choice, and the options page didn't suggest any such thing.  I was thinking to myself, I've really set myself up: if the reply is that they will do this, I will go ahead, this being most likely my last chance to get a DAC with the correct custom output voltage range, and dual differential 1704's as well, and top performing in every way.

It didn't take long, and it doesn't even seem like I'm paying anything for the voltage change.

Well, this was a lot of money to swing, for me, the guy who not just a few years ago was saying DAC's don't matter much.  So I've had a lot to chew on; the day before making the order I was in a near panic trying to ensure my custom voltage range idea was actually a good idea.  And I was worried sick for several days after that it wasn't, or that perhaps I should have specified 3.3V or 4V and so on.  (4V was the balanced output of the historic Levinson DAC's I've lusted for; PS Audio offers 3.15V and 5.3V.  The loudest DACs will almost certainly in many cases be judged the best, so everyone makes theirs a tad louder than the previous crop, not for the best reasons perhaps.)

The "wasted" extra headroom of a DAC isn't entirely wasted if it allows the user to get additional useful gain they otherwise wouldn't get.  But I am not such a person; I have at least 6.1dB of available digital gain through my Tact digital preamp.  I have also been ignoring the up to 15dB of digital gain available through my Behringer DEQ units used as crossovers and shapers.  Though I've actually already been using some of that, +7dB in the supertweeter unit, so there is actually "only" 8dB of additional digital gain available there.

I've only been fearful of using too much digital gain.  When that hits its limit, I worry, digital harmonics might be generated, huge horrible clipping noises and so on; the native clipping might be tiddlywinks in comparison.  I don't know...and that's still exactly the problem!  I should find out exactly how bad digital clipping is, anyway, since it's so easily attainable (while attenuating the levels so as not to produce amplifier clipping...which is more frightening to run tests on, particularly when you have an amp rated for 1600W of power delivery).

My first brush suggests it may not be bad at all.  It was, after all, way-over voltages (the 10V output of the DCX driving a 250W amplifier driving the Elacs) that nearly fried my ribbons.  It was not, I don't think, digital clipping as such.

Anyway, I also needed to be sure, right away, that the 2.5V was a true RMS voltage.  It was; in fact my measurement was that it was putting out 2.67V RMS for a 0dB signal.  Ah yes, the slight boost over the 2.5V "nominal"--it always seems to go that way, and to be fair, when they say 2.5V that's a value they intend to reach even under cold temperature, etc.

But then that raises multiple questions about a simple "3V output" specification...and the very first being: why buy a new DAC at all!!!  With 2.67V I was already getting very close to the 2.87V I calculated I might actually need.  When I was putting the issue as "2.5 vs 3.0" it was a 1.58dB difference, not inconsiderable.  But the difference from 2.67 to 2.87 is a mere 0.62dB.  Is that worth paying thousands of dollars for?

But: the balanced outputs I needed from the beginning, the dual differential 1704's are needed, and the balanced AES input is nice too.  This is, as I said, getting late in the game to get a machine like this new from the manufacturer to custom spec.

OK, then, if one really needed a boost so much, it must have been about more than the last 0.62dB.  It must have been the lazy analog gain of driving the amp with higher output, as I was doing with the old 10V RMS output DCX units before I started using 1704 based DAC's.  That makes it easy to get high output, but it requires lots and lots of attenuation, like running the Tact in the "70s" range, to get normal levels.  I used to do that, throwing resolution to the wind.

But 5V is way beyond the 4V, or 3.5V, or whatever I figured would be optimum.  I never before did the actual calculation.  And honestly the calculation might be wrong...but it could be wrong either way.  But say I could use more output to raise the voltage somewhat, with, say, less than 10% distortion?  Should I do that?

It's a tough call.  I do wonder if I might do better with 3.3V or 3.2V than the 3V I asked for.  Though it now looks like I'll probably get around 3.2V measured due to the nominal thing I discussed before. It's almost certainly not going to be less than 3V under any circumstances.  Distortion is at least audibly going to be kicking in about then.

I was actually going to ask for 3.2V...though figuring I'd get about that anyway.  But then, at the last minute, seeing the 3V nicely printed up on the invoice, I decided to just go with that.  I think I'm right up there with the peak output of the amp; there might be some benefit from being able to go a little further, to 10% distortion perhaps, and there's not much wasted resolution to worry about.  It's often said that it's the onset of analog distortion that makes people think things are too loud, or maybe even just "loud enough."  That's what I strongly believed when a friend of mine kept wanting me to crank up "White Wedding" louder and louder.

I've given up the lazy extra gain of 5V that I think many audiophiles just use, straight into their power amps, without thinking about what it does that way.  It's a convenient way to get more volume without fancy DSP devices.  Anyway, I've given up that easy road for an idea: that using digital gain to push up low level inputs is better than wasted headroom.  One argument for the lazy gain is that with it you end up hitting analog clipping, which might be sort-of OK, happens all the time, while the digital clipping that results from using lots of digital gain when the peaks are higher than anticipated might be destructive.  I suspect now that it's the other way around.

A funny thing happened when I started cranking up the digital sine wave above 0dB.  I could only tell that I was at 0dB by looking at the reading on the peak level indicator for the digital input to the DEQ.  I got to a -0.1 peak reading, then advanced another 0.1dB, and it stayed at -0.1.  I kept advancing and it continued to just stay at -0.1.  So obviously -0.1 was the highest level it can read, or virtually 0dB.  Perhaps readings after the first arrival at -0.1dB would include distortion, so I was treating that first arrival as the real thing, so to speak.

Well, that showed 2.65V (with lots more digits) on my Keithley meter.  But as I kept increasing the input (which continued to show -0.1dB), something very curious was happening.  The RMS reading kept getting higher and higher.

This is something I don't understand, but it suggests that the digital is not as peak limited as I had thought, which once again brings up the question, did I need to buy anything at all?

I'm thinking there might be some overflow region just above 0dB--as if you can have slightly higher values than 0dB.  Or, it might be that the higher amplitude getting digitally clipped just produces a higher RMS value as the clipped portion gets wider.  In the latter, now somewhat more plausible case, it would show that digital clipping is at least no worse than analog clipping.  Whenever you turn up the level, digitally or in the analog domain, you risk clipping later when the music gets louder.  If digital clipping is no worse than analog clipping, digital gain is no more risky than analog gain, including the lazy kind that comes from a DAC having a maximum voltage higher than necessary to clip the amplifier.

Update: I've discovered that there's no mystery as to why the level kept increasing.  The output is clipped exactly at 0dB, but as the input level increases, the sides of the wave get steeper and the clipped peak portion of the output gets wider, hence a larger RMS value even though the actual peak value at clipping is unchanged.  I've written about this in a later post.

My idea of having everything "clip" at the same level is nothing new; it's what Home Theater people call "gain structure," and it is The Correct way to do things.  If you haven't tuned the gain structure properly, you are losing resolution or dynamic range.  For my FPB 300 amplifier, the driver should have 3V RMS output, as the DAC I special ordered has.  Here's a very good discussion of Gain Structure.








Wednesday, May 10, 2017

High Output Speakers

A friend recently offered to sell me a somewhat broken (but highly repairable using the endlessly available preowned and not yet worn out parts on eBay) pair of some famous and popular rock n roll speakers, Pioneer HPM 100, for a very low price.  Though this speaker wasn't on the top of my list for future high output experiments, it's a contender, and for the price I'm paying it can also work as a garage speaker, which I need for my keyboard setup.  Even needing repair, you can tell these speakers have a punchy sound.

The HPM 100 will flap out some pretty high output levels, though I haven't been able to find any exacting specifications like I can find now for the likes of the JBL 4425, which is specifically rated to 114dB SPL (we could assume that's a "flat" rating since it's for program material and not noise--for which dBA is intended) with a graph showing the specific MOL for each frequency (114dB is only for a narrow region in the bass; the MOL at higher frequencies is somewhat lower).

Here's a list of vintage JBL speakers from which I identified the 4425 as being of interest.  Also worth looking at are the 4411, 4412, 4430, and 4435.  You can tell from their parts that they are designed for high output: big woofer(s), big midrange or horns.  The maximum output of the 4411 isn't described in one number (it looks to be above 105dB from the graphs shown, so I'd guesstimate around 110dB) but the 4430 and 4435 are rated at 119dB and 122dB respectively, and for continuous sound output (!!!).  That's what horns and big woofers can do!

Paul Klipsch (and many others) basically speak in code when they describe the need for "efficiency." In his famous speech, Klipsch says you don't need high efficiency anymore to get high output, you can simply use a high power amplifier.  But then he says with high efficiency your amplifier will distort less and therefore sound better.

It seems to me that if you have a suitably high quality amplifier, one that maintains low distortion levels even at high power output, and there are many such amplifiers today, Klipsch's second argument (that your amplifier will distort less) is as useless as the first.  Neither argument actually justifies high efficiency speakers.  In fact, quality high power amplifiers will generally have a lower proportion of noise and distortion at high output than below 1 watt.  It was only 20 years ago that Nelson Pass praised the First Watt as being the most important, as for the previous decades it was seeming like the first watt was virtually ignored except by those demanding Class A or Class AB+.

But Klipsch (and the current legions of flea amp cultists) are missing one key point.  It is much easier to make a high efficiency speaker play louder than a low efficiency speaker.  Not only does a low efficiency speaker require more power input, it must also dissipate a large fraction of that power as heat, usually in very limited areas such as voice coils.  And there are serious limits in most cases as to what can be dissipated there, except in hugely costly designs.
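
A sketch of the arithmetic behind that point (the 86dB and 100dB sensitivities are my illustrative values for a typical low efficiency box and a horn, not specific models):

    def watts_for_spl(target_spl_db, sensitivity_db_1w_1m):
        # Power needed at 1 m, ignoring thermal compression and room gain
        return 10 ** ((target_spl_db - sensitivity_db_1w_1m) / 10)

    for sens in (86, 100):
        w = watts_for_spl(110, sens)
        print(f"{sens} dB/W/m speaker needs ~{w:.0f} W for 110 dB, mostly dissipated as heat")

That's roughly 250W into the low efficiency box versus 10W into the horn, for the same output.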

And it's well known by fair minded people that Klipschorns, and the many Altec and JBL high efficiency speakers, will play much more loudly than less-than-megabuck audiophile low efficiency sealed boxes.

So when Paul Klipsch brags about efficiency, what he's really doing is giving you an excuse to tote those high _output_ speakers into your living room.  Because output levels up to at least 100 dB are essential to completely accurate reproduction.





Monday, May 8, 2017

The HDCD Boost

I now have clear test evidence that my Denon DVD-9000 (which is, btw, an incredible sounding machine, and might even overtake my current king Onkyo RDV-1) implements the (in)famous HDCD +6dB boost.  I haven't checked the menu; perhaps there's a way to disable this (not that I'd necessarily want to), but I don't think many other later machines even offer such a choice.

When the Denon plays back an HDCD with the 6dB boost, it will output up to 6dB higher voltage on the output, which I believe means about 4V RMS.  I didn't measure the voltage as such; I'm reading dB on digital meters downstream, with the analog to digital sampling done by my Lavry AD10 set to reference level -9dB (which is 4dB less sensitive than the maximum gain, -13dB, which is suited for 2V; I sorta knew, and sometimes remembered, that max gain was insufficient for HDCD's and perhaps DVD-Audio's, and now I know for sure that -9dB is the right level setting on the Lavry when playing HDCD's on the Denon).
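
A quick check on that 4V guess (assuming a nominal 2V CD-level output):

    print(2.0 * 10 ** (6 / 20))   # ~3.99 V RMS with the full 6dB HDCD boost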

I recall a friend of mine thought that if a 6dB higher level was such a good idea, it should be incorporated into normal CD playback.  It's certainly a potentially divisive competitive issue.  I could imagine the HDCD standard recommending a 6dB boost over conventional CD playback on the grounds that it's really only "uncompressing" peaks that the producer might otherwise choose to compress, but on the other hand competing standards groups insisting this boost is unfair if they don't have it also.

In my own mind I'm still undecided on this...now that it's a pretty moot point anyway, at least for some people interested only in selling new things and services.  I have lots of HDCD's and cherish them and cherish the HDCD playback on my Denon 9000.

One of the very valid criticisms of HDCD is that it is not truly "compatible" if the boost feature is used.  Plain CD playback will not have those boosts, and will therefore have inferior dynamic range to what it could have if truly compatible.  HDCD got a scathing review in Audio Magazine, my cherished resource of the time, and when I read that, it was enough for me.  No HDCD, I decided (not that I cared much then anyway...I was in poverty then, cherishing my aged and irreplaceable Sony 507ES player).  Only a year or so later a non-audiophile friend of mine was raving about how wonderful HDCD was, and I believed him, and endless HDCD listening later, I still think HDCD is great, probably better overall than SACD and close to MQA.

Anyway, in a system like mine I must know the exact level settings required for every source, for which the numbers below are a first pass:

HDCD, Denon DVD-9000: -9
CD, all players: -13 (maximum gain, I'd go to -15 perhaps if that were available)

The Onkyo RDV-1 doesn't do HDCD.  Notably, Denon uses an Apogee clock; Apogee was a sort of competitor of Pacific Microsonics.

I measured the two players using an HDCD disk and a non-HDCD disk, each seeming to have maximum loudness (digital peak levels, measured on the Behringer DEQ):

Reference Recordings RR-82CD, Track 9
Denon DVD-9000
Lavry set to -9dB
Tact set to -2.0dB
L -2.7
R -2.3

Onkyo RDV-1 (everything else the same)
L -9.1
R -8.9

The Denon is putting out 6.4dB higher level in the Left channel, 6.6dB higher level in the Right.  Most of this seems to be HDCD boost.  To be sure, I tested a non-HDCD disk, William Orbit Strange Cargo III, track 2:

DVD-9000
L -10.5
R -9.1

RDV-1
L -11
R -10.7

Now the difference is a mere 0.5dB in the left channel, but a surprising (though not impossible, given analog sampling) 1.6dB difference in the right channel.

Do I need a preamp to boost levels to the Lavry?  Not really, I can just toggle the Lavry switches (though that does, quickly, become a pain, I may be getting better at it).

At the -9dB setting HDCD's have very little wasted headroom, though at -13 CD's have about 2dB wasted headroom, that's not the biggest deal.

I need to test the 9000ES though.

A word about the sound.  After returning home from vacation, where I had a chance to listen to another system...there is no comparison in the lifelike scale and information density.  My system playing RR-82 at the levels above comes very close to the awesomeness of being front row center in symphony hall (though I suspect my reproduction levels are somewhat lower, the noise level is lower too, and perhaps there's still some residual modal response boosting the bass; somehow it works to sound real).  Most other audiophile systems I have heard sound like toy victrolas by comparison.  There are few systems I have ever heard that are about as good or better.  One is the super MBL system I heard in 2014 playing low generation tapes on a UHA machine.  It had the biggest omni midrange/tweeter system MBL makes, stacked on top of itself in mirror image, with equally outrageous electronics.  Like mine, that system is a force of nature.

UPDATE:  BIG NEWS ABOUT THE SONY 9000ES

I measured the output level, and it was about 10dB too low from the output of the Sony 9000ES playing Strange Cargo III, a non-HDCD CD.

This couldn't be right, I realized, not right at all.  So I hooked up the video monitor.  The remote control seems to have no "Setup" control.  But my programmed Harmony remote does, and from that I entered the Audio Menu.

Right at the top, there it was, AUDIO ATT: and it was set to ON.  I changed that, and immediately the level was restored to more or less the "correct" level (I am now using max gain for CD's so the numbers above aren't applicable).

And of course WOW DOES THIS MAKE A DIFFERENCE!!!  The formerly wimpy Sony CD sound is wimpy no more.  I was all wrong; this is a ballsy sounding player, not unlike the Denon DVD-9000.

I checked the manual: the default and recommended setting is for AUDIO ATT to be OFF.  This is particularly strange in that the previous owner seemed to know a lot about this machine.

Now the CD levels of all my players are about the same, within 1-2dB or so anyway.  Previously the Sony was way off, and I had never realized how much.  I thought I needed some nice preamp to boost the level.