I found the article I was thinking of. Decide for yourself:
http://sound.westhost.com/cd-sacd-dvda.htm
Also, a few other bits and pieces (from
this link):
SACD is indeed fundamentally flawed. Using 1 bit as a conversion method can
be a valid choice when the analog circuitry does not outperform the 1-bit
signal. To use it as a data format, thus binding everyone to its noise and
distortion limits, is quite another thing...
SACD is a typically Japanese invention in that it is a solution to a
nonexistent problem (decimation-interpolation), which in turn creates some
very real problems left for real engineers to solve. Some examples:
1. Splicing (editing) two DSD signals together creates a "click", even if
both represent silence.
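For the curious, the splice click is easy to reproduce. Here is a minimal
sketch of my own (not from the article) using a first-order delta-sigma
modulator; real DSD uses much higher-order modulators, but the mechanism is
the same: a bitstream that decodes to silence still carries modulator state,
and cutting two such streams together breaks that state.

```python
def dsd_encode(samples, acc=0.0):
    """First-order delta-sigma modulator: input in [-1, 1] -> ±1 bits."""
    bits = []
    for x in samples:
        bit = 1 if acc >= 0 else -1
        bits.append(bit)
        acc += x - bit          # integrator tracks the quantisation error
    return bits

def lowpass(bits, n=8):
    """Crude moving-average decimation filter (stand-in for the DAC filter)."""
    return [sum(bits[i:i + n]) / n for i in range(len(bits) - n)]

silence = [0.0] * 64
a = dsd_encode(silence)          # encodes as +1, -1, +1, -1, ...
b = dsd_encode(silence)          # identical pattern
spliced = a[:33] + b             # cut A after an odd number of bits

clean  = max(abs(v) for v in lowpass(a))        # 0.0: perfect silence
glitch = max(abs(v) for v in lowpass(spliced))  # nonzero residual at the cut
```

Each stream alone filters to exact silence, but the splice puts two +1 bits
back to back, and the filtered output shows a transient there: the click.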
2. Any processing (except delay) results in a longer word length. Getting
back to 1 bit requires another stage of delta-sigma modulation. Sony dreamt
of a new signal-processing paradigm operating entirely in DSD. It was not to
be - they even officially admit it now. Any quantisation mixes the signal
with quantisation noise, and the two can no longer be separated. This is not
much of a problem at 24 bits. At 1 bit, however... well...
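To see the word-length problem concretely, here is a toy illustration of
mine (the article gives none): even the simplest unity-gain mix of two 1-bit
streams no longer fits in 1 bit, so it must pass through a delta-sigma
modulator again, picking up a fresh layer of shaped noise.

```python
# two 1-bit DSD-style streams (values are ±1 by definition)
a = [1, -1, 1, 1, -1, -1, 1, -1]
b = [1, 1, -1, 1, -1, 1, -1, -1]

# a unity-gain mix: (a + b) / 2
mixed = [(x + y) / 2 for x, y in zip(a, b)]
print(sorted(set(mixed)))   # [-1.0, 0.0, 1.0] -- three levels, not two

# 0.0 is not representable in 1 bit, so the mix must be re-modulated
# (another delta-sigma stage), which adds new quantisation noise
```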
3. The accumulated noise from previous conversions reduces the delta-sigma
modulator's headroom. After five conversions (e.g. level control, EQ, mixing,
fader, etc.), the modulator overloads even on silence.
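The headroom claim can be sanity-checked with back-of-the-envelope numbers.
The figures below are purely illustrative assumptions of mine (the article
states none): each re-modulation leaves some ultrasonic noise behind,
independent noise sources add in power, and a 1-bit modulator typically goes
unstable well below full scale.

```python
import math

# assumed figures, chosen only to illustrate the mechanism:
per_stage_noise = 10 ** (-12.5 / 20)  # ~0.24 FS RMS ultrasonic residue per stage
overload_limit = 0.5                  # assumed stable-input limit of a 1-bit modulator

for stages in range(1, 6):
    total = per_stage_noise * math.sqrt(stages)  # independent noise adds in power
    status = "OVERLOAD" if total > overload_limit else "ok"
    print(f"{stages} conversions: noise ~{total:.2f} FS -> {status}")
```

With these (hypothetical) numbers the accumulated noise alone crosses the
stability limit at the fifth conversion, before any signal is even applied.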
4. DSD is not distortion-free.
5. The signal band and the noise zone overlap. In a correctly designed
converter the signal occupies the "clean zone" only, allowing the noise to
be filtered away. With DSD, the noise zone starts at 20kHz, but the signal
bandwidth extends (by Sony's definition) to 100kHz. The SNR over the full
100kHz bandwidth is only 30dB. Many amplifiers produce audible distortion
when presented with this noise (hence the switchable filter on many SACD
players).
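Point 5 can be quantified for the simplest case. The sketch below is mine
(DSD's actual modulators are fifth order or higher, so the real slope is far
steeper): it evaluates the first-order noise-shaping magnitude |1 - z^-1| at
the DSD64 rate, showing the shaped noise already much stronger at 100kHz
than at 20kHz -- exactly the band Sony claims as signal.

```python
import math

fs = 64 * 44100   # DSD64 bit rate: 2.8224 MHz

def shaping_gain(f):
    # first-order delta-sigma noise transfer function magnitude:
    # |1 - exp(-j*2*pi*f/fs)| = 2*sin(pi*f/fs)
    return 2 * math.sin(math.pi * f / fs)

for f in (20_000, 100_000):
    print(f"{f / 1000:.0f} kHz: {20 * math.log10(shaping_gain(f)):+.1f} dB")
# the shaped noise floor rises ~6 dB/octave per modulator order,
# so it climbs steadily through the claimed 20-100 kHz signal band
```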
It was "invented" when someone took a CS5390 chip, wired the 1-bit test
outputs straight to a D/A converter and liked what he heard. Thus, the
standard was fixed at 1-bit/64fs which happened to be the internal operating
parameters of this particular chip.
This chip is now long obsolete. Current ADCs operate at rates of 128fs and
over, at 4 bits or more.