Immersed in an overwhelming sea of auditory information, I spin tales about my "force of nature" audio system, then whip out my phone to measure it, and it's showing only 65dB. What??? "Loud" should be 100dB, or at least 90.
Firstly, I must always remember to set the meter to "C" weighting when measuring loud music. When the level is cranked up, we don't need the audibility curves of the "A" weighting. Set to C weighting, my measurement goes up to 75dB. OK, when the ambient noise is as low as it gets at my house at night (around 10dBA), 75dB average music power can be enveloping.
Second, I can easily crank the level up to 85dBC peaks. Beyond that, I sometimes have to use the digital gain of my digital preamp, which boosts the digital signal above what the preamp is given (just by multiplying the samples by some factor and dithering...).
Because I fear digital clipping more than analog clipping (which I probably shouldn't), I have been more scared to use digital gain, which might cause clipping later when the music gets louder, than to use the excess analog gain many systems naturally have.
The way I stream (through Sonos, whose digital volume control has sometimes been turned down slightly) and digitally resample all other sources means that the digital signals reaching my preamp are usually well below 0dB even at their peaks, so they can harmlessly be boosted back up, as long as it isn't done too far.
But also, hidden digital volume controls should generally be set to unity gain (which is often, but not always, maximum), and maybe I have to adjust my resampling. I've been studying my resampling for the past few weeks. It seems I can set the Lavry AD10 to maximum gain on all normal CD sources, played on the Denon DVD-9000, the Onkyo RDV-1, or the Sony 9000ES, without clipping the Lavry. In fact that seems to leave about 4dB of additional headroom above the nominal 2V outputs (I haven't actually measured them, but digital sources described as "2V" typically put out 2.15 or even 2.2V). When I'm playing HDCDs on the Denon, I need to make the Lavry less sensitive to allow for a 6dB higher level. I've taken to adjusting the Lavry to max gain whenever that is permissible, both to get the highest level and the best sampling (some earlier ad hoc findings suggested it was better set somewhat lower than needed, for greater headroom, but I now know all my players already allow about 4dB of headroom at max gain when playing plain CDs).

It was while measuring CD output that I discovered the 9000ES was 10dB down, which couldn't be right, and then I discovered the Audio Menu in which the Audio Att setting had been turned on. I wasted no time in turning it off (which is the recommended and supposedly default position). This has radically changed my ideas about the 9000ES as a CD player: it's a wimp no more.
I also got to thinking about how playing through a DAC limits my possible level. My Audio GD DAC 19 puts out 2.5V, which is more than the usual 2V for a single-ended DAC, but that still isn't enough to drive the Krell FPB 300 to maximum power, even though the rated sensitivity of the Krell is 2.35V (for 300W output).
This week I finally sat down to figure out how much DAC output would be sufficient. Basically you want to reach the "peak continuous" power of the amplifier, in other words the peak power specified as if it were continuous (RMS). Handily, Krell specifies this for the FPB amplifiers as "RMS Output Voltage," which is 60V for the FPB 300. That corresponds to an RMS power of 450W (60*60/8.0) into 8 ohms. This has come up in reviews as the clipping power of the Krell; Martin Colloms measured 470W into 7.5 ohms (slightly higher because it's 7.5 ohms and not 8).
There are several reasons why the clipping power is greater than the rated power. For one, the Krell specification is conservative: all units, new and old, should be able to reach the rated power with some margin. For another, the rated power is specified at a low value of distortion, while "clipping" is sometimes (e.g. by Stereophile) defined as the onset of 1% THD. And finally, the unit might not be able to sustain the clipping power for long because of thermal or power supply limitations.
In fact it is only because the Krell has a regulated power supply that the peak voltage and clipping power line up so nicely. With an unregulated supply, the peak voltage might be considerably higher than the sustainable continuous level, though only for a cycle or two. For unregulated designs you might not be able to measure the peak continuous power at all; you would measure the actual peak and divide by 1.41, or you might take the rail voltage and subtract the minimum junction voltage drops.
Anyway, when a DAC or preamplifier is driving a power amplifier and reaching maximum loudness is desired, you must provide the input necessary to reach the "RMS Output Voltage," not merely the rated power. This figure, which might be called the maximum effective input voltage, can be computed in various ways, for instance from the rated input sensitivity and the ratio of peak continuous power to rated power:
Vmax = Sr(V) * sqrt( Pcp(W) / Pr(W) )
     = 2.35 * sqrt( 450 / 300 )
     = 2.87 V
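Just to pin the arithmetic down, here's the same calculation as a tiny Python sketch. Nothing goes into it beyond the Krell specs quoted above:

    import math

    v_rms_max = 60.0    # Krell "RMS Output Voltage" spec for the FPB 300
    load = 8.0          # ohms
    p_peak_cont = v_rms_max ** 2 / load   # 60*60/8 = 450 W
    p_rated = 300.0     # rated power into 8 ohms
    sens = 2.35         # rated input sensitivity (V) for rated power

    v_required = sens * math.sqrt(p_peak_cont / p_rated)
    print(f"peak continuous power: {p_peak_cont:.0f} W")  # 450 W
    print(f"required drive: {v_required:.2f} V RMS")      # 2.88 V (2.87 above is the same figure rounded down)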
So *that* is the voltage the preamp must produce to drive the power amplifier to its actual maximum. Anything less is robbing me of potential peak power, the ultimate peaks which are the stuff of music.
The 2.5V output of my Audio GD DAC 19 comes close, but not quite.
Meanwhile, there is also such a thing as too much driving voltage. Especially with a DAC: if the DAC output ranges from 0-5V and the amplifier clips at 3V, the last 2V, which represents 40% of the coding or mapping space, is lost to technically useless headroom. So the resolution in the lower 60% is diluted by exactly that much. Another way of looking at this is in terms of dynamic range. If the top 4.4dB is wasted, the relative ultimate noise level is raised by 4.4dB. And this doesn't only matter when you're playing at low levels; the resolution is lost at every volume and appears as added noise (so low it's never noticed directly).
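The same dilution, as a sketch (the 0-5V DAC and 3V amp ceiling are just the example figures above):

    import math

    v_dac_full_scale = 5.0   # DAC output at digital full scale
    v_amp_clip = 3.0         # voltage at which the amp clips
    wasted_db = 20 * math.log10(v_dac_full_scale / v_amp_clip)
    print(f"wasted headroom: {wasted_db:.1f} dB")  # ~4.4 dB
    # That 4.4 dB comes straight off the usable dynamic range:
    # a -120 dB noise floor effectively becomes about -115.6 dB.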
You could argue this dilution always occurs, even when you are simply attenuating digitally. As it turns out, both analog and digital attenuators lose resolution, and by about the same degree.
Anyway, when the noise level is -120dB and you're raising it to -116dB, well, you'd hardly notice. Believe me, I used resolution-reducing (and noise-increasing) digital attenuation destructively in my early years of digital EQ, causing far more loss than this, and didn't notice until I stopped to think about it.
I think most people with hybrid analog/digital systems probably don't think about this enough. Surely I haven't at times. Now, you might say, I'm over-obsessed with it.
Anyway, I thought about it very much this week. Taking the desired voltage to be about 3V, my mere 2.5V was missing the mark by a crucial 1.5dB, the difference between, say, 340W and 450W, the ultimate peak power of the amplifier. I pay big bucks for an amplifier with all that peak power, the stuff of music, and then just throw a bunch of it away.
Got to fix this problem, I said. I could get a quality preamp to follow the output of the Audio GD DAC 19 and boost it to 3V RMS. I could get a used Krell KRC, for example, and have it drive the Krell balanced from the single-ended output of the DAC. That's about the minimum quality of device I would consider.
Now this might be worth considering in a 1-way system. Then I could use analog volume instead of digital volume. Arguably analog volume is better, though digital volume has lower noise, typically -144dB, since digital volume is computed on 24-bit numbers (using even higher-bit math with dithering).
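That -144dB figure is just the 24-bit quantization floor, easy to sanity-check:

    import math

    bits = 24
    # 20*log10(2^-24): about 6.02 dB per bit, the oft-quoted -144 dB
    floor_db = 20 * math.log10(2.0 ** -bits)
    print(f"{floor_db:.1f} dB")  # -144.5 dB (the exact figure depends on dither conventions)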
But in a multiway system like mine, the remote volume would be useless. I would merely set the preamp to the required gain of 1.5dB and leave it there.
It took barely a moment's thought to feel how monstrous it would be to add all this complexity just to get 1.5dB more output. I'm not that big on simplicity, but this seemed monstrous. Alternatively, I could build a canonical OPA211-based buffer, but the work is still unimaginable to me, making boards and doing the fine soldering, and it would all have to be done to the highest standards to be good enough, including the separate power supply.
This crystallized with something else, and the next thing I knew I'd done something barely thinkable: I'd ordered a multi-kilobuck DAC.
The other thing was that I was testing the output levels of all my disc players through the Lavry. The best way to measure output is by reading the "Peak" level meters of the downstream midrange DEQ, which show the input digital level. I found one of the highest peak levels I'd ever seen playing the Reference Recordings disc RR-82 (Mephisto and others) on the Denon DVD-9000 with HDCD. Track 9 has the highest level seen. I had to set the level on the Lavry to -9dB (the level for CDs is the max gain setting, -13dB, which is 4dB more sensitive). This ultimately left me with 0.3dB headroom, as determined by subtracting the -2.0dB digital gain on the Tact (Tact level 91.8) from the -2.3dB peak reading in the right channel (the other channel read -2.7dB).
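Spelling out that headroom bookkeeping, with the readings just reported:

    tact_gain_db = -2.0     # digital gain on the Tact (level 91.8)
    peak_reading_db = -2.3  # peak meter reading, right channel

    # What the peak would read with the Tact at unity gain:
    peak_at_unity_db = peak_reading_db - tact_gain_db   # -0.3 dBFS
    headroom_db = -peak_at_unity_db
    print(f"headroom: {headroom_db:.1f} dB")  # 0.3 dB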
Playing the disc on the Onkyo, which has no HDCD decoding, yielded -7.1dB and -6.9dB peaks, back to standard CD level, and even then the Onkyo plays about a dB lower than the Denon.
In further testing, the un-attenuated Sony 9000ES appears to have output levels identical to the Denon's, within 0.1dB, on regular CDs. The Onkyo has about 1dB less output.
Playing several CD tracks on all the players in a row for the peak level tests, I quickly had a favorite: the Denon DVD-9000. In its new un-attenuated form, the Sony 9000ES is #2, and the Onkyo--my heretofore standard on all the things it plays--ranked 3rd. Of course I made no attempt to equalize the Onkyo's 1dB lack of volume, so this "finding" is as suspect as anything, especially since most of the listening was done from the floor while I was reading the peak levels.
Anyway, combining this with previous knowledge, I concluded that differential 1704's are far better than single-ended 1704's for digital conversion. The Denon has differential 1704's, and that may help make for the smoother sound, silky smooth, compared to the fine stainless-steel granularity of the Onkyo. I know the Burr-Brown PCM 1704's are not perfect, though they are the best-ever implementation of 24 bits of R2R non-feedback PCM conversion. They are not the most glitch-free; the never-ending popularity of the original digital chips, the Philips TDA1541A, stems in large part from their relative absence of glitches. Anyway, the 1704's shine in differential and dual differential implementations, where the glitching mostly cancels out. That is seen in all the big-league units, not just my rarity DVD-9000 but in the best Levinsons like the 360S, the 30.6, and so on. I looked at those briefly; the more common 30.5 uses the earlier 1702 chips, which some preferred but which are only 20-bit chips. I'd consider those.
Well this brought me back to how much I would prefer to have the top Audio GD DAC, the Master 7. It has the ultimately perfected dual differential 1704 implementation.
Designer Kingwa of Audio GD is one of the great audio designers, up there with Curl and Pass. Sure, he's a self promoter, a cult of personality, a huckster, but they all are. Like the good ones, though, he seems to do the hard work. He uses good stuff, and puts it together to achieve the very best results, with maniacal attention to details and the perfection of each part of the circuitry as well as how they all work together.
Because he attends to the details of how well his circuits actually work, rather than just slopping something together that barely works (such products used to be legion), he gets to the maximum linearity achievable; his latest Master 7 DAC is rated at 0.00005% distortion, and he shows the spectrum. At first I was put off by the visible -110dB peaks, but a -110dB spur works out to only about 0.0003%, vanishingly small in absolute terms. And he does this with no feedback, no capacitors, direct-coupled discrete FETs, and possibly more regulated power supplies than Levinson, at a somewhat attainable (gasp) cost.
By comparison, the PS Audio DirectStream DAC (which might have some merit... I currently hold the idea that true one-bit conversion, highly oversampled, has better sonics than delta sigma) is only specified at 0.03% distortion. Sad to say, that looks like lazy engineering (though it could also be very conservative specsmanship, in a day and age when subjectophiles believe the higher the distortion the better). And we're up to the big bucks, paupers need not apply, even if not close to the Swiss or German prices.
The only other possible choice for me was the Schiit Yggdrasil, which uses modern PCM DACs in a proprietary "closed loop" solution--ultimate performance through advance calculation rather than sigma delta feedback. That has always appealed to me, and it may actually be better or worse, I don't know, but Stereophile's measurements showed something that looked like digital-related distortion, Schiit was unwilling to specify the "effective bits", and the AD chips *are* only 21 bits, not the 24 bits of the legendary 1704. There's a long argument here, but I believe if the "bit loss" is merely thermal, or sufficiently random, or whatever, it doesn't count as badly as not trying to approximate the bits at all. This is related to my view of noise and resolution as ultimately being somewhat different things. So it counts to go after 24-bit resolution even if you don't achieve it. So I've had a strong leaning toward the actual 1704 products, like the historic Levinson DACs.
But putting all these considerations together, the solution was clear for once. I should finally order the Master 7, which I'd long desired, and have it modified for a 0-3V range instead of the normal production 0-5V. If I can even still get a Master 7, that is. Almost every year Kingwa has said there will be no more, and I believed him years ago.
This had become a much more important track to future progress than buying a mega-expensive preamp, which had been one of my hopes for this year, something like a Levinson 326S, or a 32, or, just for starters, a Krell KRC 3, to boost the DAC output or boost the CD player inputs (this seemed more crucial before I turned off the attenuator on the 9000ES).

I'd be getting the much improved dual differential 1704 configuration, and true balanced outputs, which work best with my Krell FPB. The FPB has true balanced circuitry throughout and benefits circuit-wise from a balanced connection, above and beyond the usual benefits of balance: lower random noise induction and immunity to AC chassis ground differences. That last is a not inconsiderable consideration, since the Krell is plugged straight into the wall whereas the DAC is plugged into a special strip plugged into the power conditioner. A balanced connection means the ground differences virtually don't matter. It's a wonder things have been working so well unbalanced! I shoulda swung for balanced in the first place (though, for what it matters, when I got the Audio GD DAC 19 I was using the Aragon amp, which doesn't have a true balanced input, only a virtual-balanced adapter to a single-ended input; that doesn't really gain any advantage from a balanced connection).
I checked the website and saw it's now the new Master 7 Singularity, which Kingwa mightily emphasizes is the ultimate end of the 1704's: the last 100 units can be made and that is it. It's supposed to be even better than all the previous Master 7's, especially in the new digital section, but I'm not sure when, or if, I'm ever going to be able to use I2S, and likewise even USB. An AES connection, my first-choice requirement (I was using an AES-to-coax converter with my DAC 19), is $15 extra.
I sent off the email, requesting my desired output voltage. I doubted this was possible; with the Master 7's probably all made by now, if not spoken for, there might not be any such choice, and the options page didn't suggest any such thing. I was thinking to myself, I've really set myself up: if the reply is that they will do this, I will go ahead, this most likely being my last chance to get a DAC with the correct custom output voltage range, and dual differential 1704's as well, and top performance in every way.
It didn't take long, and it doesn't even seem like I'm paying anything for the voltage change.
Well, this was a lot of money to swing for me, the guy who just a few years ago was saying DACs don't matter much. So I've had a lot to chew on: the day before making the order I was in a near panic trying to ensure my custom voltage range idea was actually a good idea. And I was worried sick for several days afterward that it wasn't, or that perhaps I should have specified 3.3V or 4V and so on. (4V was the balanced output of the historic Levinson DACs I've lusted after; PS Audio offers 3.15V and 5.3V. The loudest DACs will in many cases be judged the best, so everyone makes theirs a tad louder than the previous crop, not for the best reasons perhaps.)
The "wasted" extra headroom of a DAC isn't entirely wasted if it allows the user to get additional useful gain they otherwise wouldn't get. But I am not such a person, I have at least 6.1dB of available digital gain through my Tact digital preamp. I also have also been ignoring the up to 15 dB of digital gain available through my Behringer DEQ units used as crossovers and shapers. Though I've actually already been using some of that, +7dB in the supertweeter unit, so there is actually "only" 8dB of additional digital gain available.
I've simply been fearful of using too much digital gain. When digital clipping hits, I worry about digital harmonics being generated, huge horrible clipping noises and so on, next to which native analog clipping might be tiddlywinks. I don't know... and that's still exactly the problem! I should find out exactly how bad digital clipping is, since it's so easily attainable (while attenuating the levels so as not to produce amplifier clipping, which is more frightening to run tests on, particularly when you have an amp rated for 1600W of power delivery).
My first brush with it suggests it may not be bad at all. It was, after all, gross overvoltage (the 10V output of the DCX driving a 250W amplifier driving the Elacs) that nearly fried my ribbons. It was not, I don't think, digital clipping as such.
Anyway, I also needed to be sure, right away, that the 2.5V was a true RMS voltage. It was; in fact my measurement showed it putting out 2.67V RMS for a 0dB signal (ah yes, the slight boost over the 2.5V "nominal", it always seems to go that way, and to be fair, when they say 2.5V that's a value they intend to reach even at cold temperature, etc.).
But then that raises multiple questions about a simple "3V output" specification... the very first being: why buy a new DAC at all!!! With 2.67V I was already getting very close to the 2.87V I calculated I might actually need. When I was framing the issue as "2.5 vs 3.0" it was a 1.58dB difference, not inconsiderable. But the difference between 2.67 and 2.87 is a mere 0.63dB. Is that worth paying thousands of dollars for?
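The two gaps in dB, computed the same way:

    import math

    def v_ratio_db(v1, v2):
        # voltage ratio expressed in dB
        return 20 * math.log10(v1 / v2)

    print(f"3.00V vs 2.50V: {v_ratio_db(3.00, 2.50):.2f} dB")  # ~1.58 dB
    print(f"2.87V vs 2.67V: {v_ratio_db(2.87, 2.67):.2f} dB")  # ~0.63 dB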
But: I needed the balanced outputs from the beginning, the dual differential 1704's are needed, and the balanced AES input is nice too. And, as I said, it's getting late in the game to get a machine like this new from the manufacturer, built to custom spec.
OK then: if I really felt I needed a boost so badly, it must have been about more than the last 0.63dB. It must have been the lazy analog gain of driving the amp with higher output, as I was doing with the old 10V RMS output DCX units before I started using 1704-based DACs. That makes it easy to get high output, but it requires lots and lots of attenuation, like running the Tact in the "70s" range, to get normal levels. I used to do that, throwing resolution to the wind.
But 5V is well beyond the 4V, or 3.5V, or whatever I figured would be optimum. I had never before done the actual calculation. And honestly the calculation might be wrong... but it could be wrong either way. But suppose I could use more output to raise the level somewhat, at, say, less than 10% distortion? Should I do that?
It's a tough call. I do wonder if I might do better with 3.3V or 3.2V than the 3V I asked for. Though it now looks like I'll probably get around 3.2V measured, due to the nominal-rating thing I discussed before. It's almost certainly not going to be less than 3V under any circumstances. Distortion, at least audibly, is going to be kicking in about then anyway.
I was actually going to ask for 3.2V... though figuring I'd get about that anyway. But then, at the last minute, seeing the 3V nicely printed up on the invoice, I decided to just go with that. I think I'm right up there with the peak output of the amp, there might be some benefit from being able to go a little further, to 10% distortion perhaps, and there's not much wasted resolution to worry about. It's often said that it's the onset of analog distortion that makes people think things are too loud, or maybe even just "loud enough." That's what I strongly believed when a friend of mine kept wanting me to crank up "White Wedding" louder and louder.
I've given up the lazy extra gain of 5V that I think many audiophiles just use, straight into their power amps, without thinking about what it does. It's a convenient way to get more volume without fancy DSP devices. Anyway, I've given up that easy road for an idea: that using digital gain to push up low-level inputs is better than wasted headroom. One thing about the lazy gain is that you end up hitting analog clipping, which might be sort-of OK, it happens all the time, while the digital clipping that results from using lots of digital gain, when the peaks are higher than anticipated, might be destructive. I now suspect it's the other way around.
A funny thing happened when I started cranking up the digital sine wave above 0dB. I could only tell that I was at 0dB by looking at the peak level indicator for the digital input to the DEQ. I got to a -0.1dB peak reading, then advanced the level another 0.1dB, and the reading stayed at -0.1. I kept advancing and it continued to read -0.1. So obviously -0.1 was the highest level I could read, or virtually 0dB. Perhaps the signal beyond that first -0.1dB reading would include distortion, so I treated the first -0.1dB as the real thing, so to speak.
Well, that showed 2.65V (with lots more digits) on my Keithley meter. But as I kept increasing the input (which continued to show -0.1dB), something very curious was happening: the RMS reading kept getting higher and higher.
This is something I don't understand, but it suggests that the digital side is not as peak-limited as I had thought, which once again brings up the question: did I need to buy anything at all?
I'm thinking there might be some overflow region just above 0dB, as if you could have values slightly higher than 0dB. Or, it might be that the higher amplitude getting digitally clipped simply produces a higher RMS value as the clipped portion gets wider. In the latter, now more plausible case, it would show that digital clipping is at least no worse than analog clipping. Whenever you turn up the level, digitally or in the analog domain, you risk clipping later when the music gets louder. If digital clipping is no worse than analog clipping, digital gain is no more risky than analog gain, including the lazy kind that comes from a DAC having a maximum voltage higher than necessary to clip the amplifier.
Update: I've discovered that there's no mystery as to why the level kept increasing. The output is clipped exactly at 0dB, but as the input level increases the sides get steeper, so essentially the flat-topped peak portion of the output gets larger, hence a larger RMS value even though the actual peak value at clipping is unchanged. I've written about this in a later post.
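A little simulation makes that concrete. This is a sketch, not a measurement: a pure digital sine hard-clipped at full scale, with numpy's clip standing in for the 0dBFS ceiling:

    import numpy as np

    t = np.linspace(0.0, 1.0, 48000, endpoint=False)
    sine = np.sin(2 * np.pi * 100 * t)   # 100 Hz tone, unit amplitude

    for gain_db in (0, 1, 3, 6):
        gain = 10 ** (gain_db / 20)
        clipped = np.clip(gain * sine, -1.0, 1.0)  # hard clip at "0dBFS"
        rms = np.sqrt(np.mean(clipped ** 2))
        print(f"+{gain_db} dB drive: peak {clipped.max():.2f}, RMS {rms:.3f}")

    # The peak stays pinned at 1.00 while the RMS climbs toward the
    # square-wave limit of 1.000 -- exactly the behavior on the Keithley.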
My idea of having everything "clip" at the same level is nothing new; it's what Home Theater people call "gain structure," and it is The Correct way to do things. If you haven't tuned the gain structure properly, you are losing resolution or dynamic range. For my FPB 300 amplifier, the driver should have 3V RMS output, as the DAC I special-ordered has. Here's a very good discussion of Gain Structure.