I'm very much enjoying the sound of my new DVP-9000ES playing SACDs. I'm feeling that, indeed, Sony did something special, at least with these early SACD players, that I have not heard in non-Sony-branded players.
Something different, anyway, and perhaps not all to the good, I'm wondering. Nowadays even Sony uses highly oversampled multibit DAC chips, sometimes even from the likes of Burr-Brown, to implement SACD, just like everyone else. Back when the first SACD players were introduced, Sony was using its own Pulse Converter chips, the CXA8042AS. Those pulse converter chips are used in both the SCD-1/777 and the DVP-9000ES. Outside of that, and the use of one OPA213, the output circuits are quite different between the super high end and the merely upscale, with the super high end showing far more additional circuitry, including discrete stages in the output.
A leading audio engineer, Stanley Lipshitz, and several others revealed inherent faults in 1-bit delta-sigma conversion in AES papers in 2000, just after the public release of the SCD-1. The principal fault of 1-bit delta-sigma is that the background noise isn't Gaussian; it's tonal, with idle tones.
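To make the idle-tone complaint concrete, here is a minimal sketch of my own (not anything from the AES papers, and a textbook topology rather than Sony's): a first-order 1-bit delta-sigma modulator fed a small DC offset. Its output spectrum is a picket fence of discrete lines rather than a smooth noise floor, and the lines move around with the input level.

```python
# Idle tones in a textbook first-order 1-bit delta-sigma modulator.
# Illustrative only; this is not Sony's actual converter topology.
import numpy as np

def dsm1(x):
    """First-order 1-bit loop: integrate the error, quantize to +/-1."""
    y, acc = np.empty_like(x), 0.0
    for n, xn in enumerate(x):
        acc += xn - (y[n - 1] if n else 0.0)
        y[n] = 1.0 if acc >= 0 else -1.0
    return y

fs = 2_822_400                            # DSD64 rate, 64 x 44.1kHz
bits = dsm1(np.full(fs // 100, 0.01))     # ~10 ms of a small DC input

spec = np.abs(np.fft.rfft(bits * np.hanning(len(bits))))
f = np.fft.rfftfreq(len(bits), 1 / fs)
audio = spec[(f > 1e3) & (f < 20e3)]
print("peak/median spectral ratio, 1-20kHz: %.0f"
      % (audio.max() / np.median(audio)))
# A smooth, Gaussian-like floor would give a modest ratio; discrete idle
# tones give a much larger one, at frequencies that slide with the DC level.
```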
Sony then denied, to John Atkinson, that it was using 1-bit conversion, in response to a paper by David Rich describing the concerns of Lipshitz and others, published (of all places) in Stereophile.
I'd always wondered if that denial was with regard to future SACD players, not past ones. The past machinery (all the machines I just mentioned, still for sale but designed years earlier) might have used 1-bit, but that was now water under the bridge.
I don't know enough about these CXA8042AS chips to confirm my version of this: that Sony abandoned true 1-bit when it became apparent that it didn't scale down well, or they hit a brick wall in making further improvements, or were covering up faults all along, or were embarrassed by Lipshitz, or something like that. They had been able to achieve brilliant sound in the three early players by perhaps mixing things up a bit (especially in the SCD-1/777ES) so that the fundamental 1-bit character was somewhat obscured. I think they may have been further oversampling the 1-bit stream; IIRC, 70MHz was claimed for an earlier Sony ES CD player. But in this regard the 9000ES is actually pretty straightforward in the analog section, aka simple, and in that regard very different from the SCD-1/777ES, but nevertheless (or alternately) similarly good sounding.
The alternative is that they never did anything like true 1-bit all along, but that seems to be rewriting history. 1-bit arrived on the scene, I think starting with Pioneer machines, and perhaps others, in the late 1980s with great fanfare. One trademark name was Bitstream; another was MASH. It may have been Sony, which had been plodding along with Philips chips, that was the follower, coming up with its own 1-bit chips to compete with Pioneer in pushing linearity beyond -80dB, down to -100dB and maybe even -110dB and beyond. So the 1-bit race kept on during the 1990s, with Sony ultimately achieving within 1dB of the theoretically possible CD THD+N performance with their 997ES, and possibly the 707ES, and pushing linearity out to -110dB. Pioneer missed this considerably in their PD-75, but I'm not sure about the higher-end ones.
But already, by the time of, say, the PD-S06, Pioneer seems to have been questioning 1-bit. That machine, and an increasing number of later machines, switched to full-width multibit PCM chips like the PCM56 and especially the PCM63. Or at least they were giving customers a choice, as early as the PD-93, which used the PCM63; but I think Pioneer had completely abandoned 1-bit by the time of the Lipshitz papers.
But Sony had already tied its hands to 1-bit distribution (if not implementation) with SACD, and kept promoting 1-bit through the launch of SACD, which featured 1-bit-based players, while quietly walking back the principal superiority of SACD from being 1-bit to being, well, whatever the marketing agency could think of.
But why all this fuss and bother if the first SACD players sounded just fine? Well, perhaps it's not just about making things good; it's about making them better and better, year after year. It was too hard to keep making 1-bit better (through various trickery and overengineering?).
Anyway, let me also say that DSD is perhaps not as bad as I've written heretofore. While a pure 2.8MHz 1-bit delta-sigma system would be horribly information-lossy, and correspondingly noisy, that isn't at all what DSD is. DSD achieves low audible noise, AND acceptable information efficiency, through noise shaping. Noise shaping is not just an add-on; it is fundamentally what makes SACD possible, and that may be what I was not thinking about properly. It's perhaps improper to call DSD simply a delta-sigma system; it's a delta-sigma noise-shaping system. Noise shaping takes the place of the structuring imposed in PCM by coding. That's the difference here: noise shaping vs coding.
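To make that concrete, here is a small sketch, again using a generic textbook first-order loop rather than SACD's actual (much higher-order) modulator: the same 1-bit quantizer is measured bare and inside a noise-shaping loop. Without the loop the error sits squarely in-band; with it, almost all of the error moves above 20kHz.

```python
# The same 1-bit quantizer with and without a noise-shaping loop, on a
# 1kHz tone at the DSD64 rate. Generic first-order loop, for illustration.
import numpy as np

def dsm1(x):
    """First-order 1-bit loop: integrate the error, quantize to +/-1."""
    y, acc = np.empty_like(x), 0.0
    for n, xn in enumerate(x):
        acc += xn - (y[n - 1] if n else 0.0)
        y[n] = 1.0 if acc >= 0 else -1.0
    return y

fs = 2_822_400
x = 0.5 * np.sin(2 * np.pi * 1000 * np.arange(fs // 10) / fs)   # 100 ms

def inband_error_db(y):
    """Error power in 20Hz-20kHz, dB relative to full scale."""
    p = np.abs(np.fft.rfft(y - x) / len(x)) ** 2
    f = np.fft.rfftfreq(len(x), 1 / fs)
    sel = (f >= 20) & (f <= 20e3)
    return 10 * np.log10(2 * np.sum(p[sel]))

print("bare 1-bit quantizer: %6.1f dB" % inband_error_db(np.sign(x)))
print("with noise shaping:   %6.1f dB" % inband_error_db(dsm1(x)))
# The loop doesn't reduce the total error; it moves it above the audio
# band, which is the whole trick that makes a 1-bit stream usable.
```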
Noise shaping compensates for the information loss below 20kHz in a comparable plain 1-bit delta-sigma system, and somewhat more. If it weren't for noise shaping, the noise would be the clear sign of information loss. But DSD actually achieves better noise performance than CD in the midband. Therefore, it is transmitting more information there.
And here, I'm not sure how to make the full comparison. One way would be to consider the full-band 20Hz-20kHz noise performance as showing the "information loss" of the system. The other would be to consider something like the A-weighted noise performance as showing the "effective information loss" of the system. By the latter measure for sure, and I'm uncertain about the former, DSD arguably achieves not an information loss compared to 16-bit/44.1kHz PCM but an information increase.
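Here is a sketch of the two bookkeeping choices applied side by side. I use a toy second-order 1-bit loop as the DSD stand-in; production DSD modulators are roughly fifth-order, so the absolute numbers will not match a real SACD chain, only the method of comparison.

```python
# Flat 20Hz-20kHz noise vs A-weighted noise, for a toy second-order
# 1-bit noise-shaped stream and for TPDF-dithered 16-bit/44.1k PCM.
import numpy as np

def a_weight(f):
    """IEC A-weighting gain (linear), normalized to 0 dB at 1kHz."""
    f2 = np.asarray(f, float) ** 2
    ra = (12194.0**2 * f2**2) / ((f2 + 20.6**2)
         * np.sqrt((f2 + 107.7**2) * (f2 + 737.9**2)) * (f2 + 12194.0**2))
    return ra * 10 ** (2.0 / 20)

def band_noise_db(err, fs, weighted):
    p = np.abs(np.fft.rfft(err) / len(err)) ** 2
    f = np.fft.rfftfreq(len(err), 1 / fs)
    sel = (f >= 20) & (f <= 20e3)
    w = a_weight(f[sel]) ** 2 if weighted else 1.0
    return 10 * np.log10(2 * np.sum(p[sel] * w))

def dsm2(x):
    """Second-order 1-bit loop, error-feedback form, NTF = (1 - 1/z)^2."""
    y, e1, e2 = np.empty_like(x), 0.0, 0.0
    for n, xn in enumerate(x):
        u = xn - 2 * e1 + e2
        y[n] = 1.0 if u >= 0 else -1.0
        e1, e2 = y[n] - u, e1
    return y

fs_dsd, fs_cd, dur = 2_822_400, 44_100, 0.2
x_dsd = 0.5 * np.sin(2 * np.pi * 1000 * np.arange(int(fs_dsd * dur)) / fs_dsd)
x_cd = 0.5 * np.sin(2 * np.pi * 1000 * np.arange(int(fs_cd * dur)) / fs_cd)

err_dsd = dsm2(x_dsd) - x_dsd
q = 2.0 / 2**16                                      # 16-bit step over [-1, 1)
dith = (np.random.rand(len(x_cd)) - np.random.rand(len(x_cd))) * q   # TPDF
err_cd = np.round((x_cd + dith) / q) * q - x_cd

for name, err, fs in (("toy 1-bit", err_dsd, fs_dsd), ("16/44.1", err_cd, fs_cd)):
    print("%9s  flat: %6.1f dB   A-weighted: %6.1f dB"
          % (name, band_noise_db(err, fs, False), band_noise_db(err, fs, True)))
```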
There's still no question that DSD is an inefficient system for transmitting information; PCM is far more efficient. But DSD could be called satisfactory in total information transmission, having reached CD quality and somewhat bested it, and having a different character in the details, in ways that could be audibly pleasing.
Anyway, I'm thinking that noise performance is a good indication of information performance, and one can just look at the noise curve of DSD and say it has more information than 44.1/16 in the midband. I'm only not sure to what extent the rising noise in the top octave cancels that advantage; it might not cancel all of it. But anyway, the noise-shaping component means I can't simply apply my previous simplistic calculations. I have to account for the effect of the noise shaping in shifting available information bandwidth from super-high frequencies down to usable ones (in the reverse direction to the noise). So another term might be information shifting.
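One way to see the shifting, under the same toy assumptions as the sketches above, is to total the shaped stream's error power octave by octave; nearly all of it lands in the top few octaves, far above the audio band.

```python
# Error power per octave for a first-order noise-shaped 1-bit stream:
# the shaping shifts quantization noise between bands rather than
# removing it. Toy loop again; real DSD shaping is far more aggressive.
import numpy as np

def dsm1(x):
    y, acc = np.empty_like(x), 0.0
    for n, xn in enumerate(x):
        acc += xn - (y[n - 1] if n else 0.0)
        y[n] = 1.0 if acc >= 0 else -1.0
    return y

fs = 2_822_400
n = fs // 4
x = 0.5 * np.sin(2 * np.pi * 1000 * np.arange(n) / fs)
p = np.abs(np.fft.rfft(dsm1(x) - x) / n) ** 2
f = np.fft.rfftfreq(n, 1 / fs)

lo = 20.0
while lo < fs / 2:
    hi = min(2 * lo, fs / 2)
    db = 10 * np.log10(2 * np.sum(p[(f >= lo) & (f < hi)]) + 1e-30)
    print("%8.2f - %8.2f kHz: %6.1f dB" % (lo / 1e3, hi / 1e3, db))
    lo = hi
# The octaves above ~100kHz hold almost all of the error power; the
# audio-band octaves are correspondingly quiet.
```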
And I wouldn't worry about Gaussian noise as much as tonal or correlated noise, and noise whose proportion increases, or whose character changes, at lower levels.
One way to explore "the details" in the sound would be to make a PCM or DSD recording at an artificially low level, say -60dB. Then play it back with lots of gain, and see which sounds better. (This is actually more complicated than it sounds; should the gain be applied digitally, PCM-style, or in analog, for example?)
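A digital sketch of that experiment might look like the following; I use the toy first-order loop again as the stand-in for the DSD side (a real test would use an actual SACD recorder), and I apply the gain digitally, which is itself one of the choices under question.

```python
# The -60dB experiment, sketched: a quiet 1kHz tone through 16-bit
# TPDF-dithered PCM and through a toy first-order 1-bit loop, then
# +40dB of digital gain and a look at the in-band residual.
import numpy as np

def dsm1(x):
    y, acc = np.empty_like(x), 0.0
    for n, xn in enumerate(x):
        acc += xn - (y[n - 1] if n else 0.0)
        y[n] = 1.0 if acc >= 0 else -1.0
    return y

def inband_residual_db(y, x, fs):
    """Noise+distortion power in 20Hz-20kHz after removing the ideal tone."""
    p = np.abs(np.fft.rfft(y - x) / len(x)) ** 2
    f = np.fft.rfftfreq(len(x), 1 / fs)
    sel = (f >= 20) & (f <= 20e3)
    return 10 * np.log10(2 * np.sum(p[sel]))

amp, gain = 10 ** (-60 / 20), 10 ** (40 / 20)

fs = 44_100                                         # PCM side, 1 second
x = amp * np.sin(2 * np.pi * 1000 * np.arange(fs) / fs)
q = 2.0 / 2**16
dith = (np.random.rand(fs) - np.random.rand(fs)) * q
pcm = np.round((x + dith) / q) * q
print("16/44.1 + 40dB gain: %6.1f dB"
      % inband_residual_db(gain * pcm, gain * x, fs))

fs = 2_822_400                                      # toy 1-bit side
x = amp * np.sin(2 * np.pi * 1000 * np.arange(fs // 4) / fs)
print("1-bit   + 40dB gain: %6.1f dB"
      % inband_residual_db(gain * dsm1(x), gain * x, fs))
# Whether that gain should be digital (as here) or analog after
# conversion is exactly the complication noted above.
```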
Of course, we don't really have to be cynical to see that the big win for Sony with SACD/DSD would have been IP, and the big industry selling point was DRM. Meanwhile, I don't doubt that the lack of openness was a big hindrance in the end.
I don't think it's possible for consumers to make SACDs the way they could make CD-Rs and CD-RWs, and that's part of the plan. It's interesting that the earliest SACD machines from Sony may not even have supported CD-R and/or CD-RW playback, though most CD machines of the time did.
The lack of CD-R/RW capability may have been a subtle hint to the industry: we own this, and we're not going to destroy your profit margins by letting consumers copy.
Anyway, WRT Sony, they visibly dropped any major concern for SACD and DSD sometime around 2006, when the Blu-ray vs HD DVD war was heating up. SACD had not gone the right way, they might have figured, but lessons had been learned for winning the next corporate battle, which Sony did.