This is very useful material; I recommend reading it if you're interested.
http://www.audioasylum.com/audio/tweaks ... /1582.html
*How can digital interconnect cables from a CD transport to an outboard DAC possibly affect the sound? Isn't digital audio just 1's and 0's?
A very common misconception about digital signal
transmission with respect to audio is that if the signal
does not get corrupted to the point of losing or changing
the 1's and 0's, nothing else can go wrong. If the
transmission system had been designed with cost no object,
and by engineers familiar with all the known foibles and
problems of digital transmission of audio signals, then
this might be substantially true, and no differences could
rear their ugly head.
Unfortunately, the systems we ended up with DO NOT remain
unaffected by such things as jitter, where the transition
from a 1 to a 0 is modulated with respect to time. There are
many ways that jitter can affect the final digital-to-analog
conversion at the DAC. Jitter on the transmitted signal can
bleed or feed through the input receiver and affect the DAC.
How? Through current drain on the power supplies: the
changing signal content varies the demands made on the power
supply to the logic chips and the DAC. Modulate the power
supply rails, and the DAC will convert at slightly different
times. This is because a logical one or zero is detected as
the signal swings through a region from around zero volts to
around 5 volts, and the digital logic chips detect the
change at a specific PERCENTAGE of that total voltage swing.
However the power supply gets modulated, it will
affect the DAC. One version of this has been popularly
referred to as LIM, or Logic Induced Modulation, by the
audiophile press. See:
"Time Distortions Within Digital Audio Equipment Due to Integrated Circuit Logic Induced Modulation Products"
AES Preprint Number: 3105 Convention: 91 1991-10
Authors: Edmund Meitner & Robert Gendron
Many of the logic chips in a digital audio system behave
very poorly with respect to dumping garbage onto the rails
and even worse, onto the ground reference point. Even as I
post, logic manufacturers such as TI are advertising the
benefits of their latest generation of logic chips that
reduce ground bounce. The circuitry itself generates its
own interference, and this can be modulated by almost
anything that also affects the power supply or ground.
The amount of jitter that it takes to affect the analog
output of the signal used to be thought of as fairly high,
somewhere on the order of 500 to 1,000 ps. Now, the
engineers on the cutting edge claim that for jitter to be
inaudible and not affect the sound of the signal, it may
have to be as low as 10 to 20 ps. That's for 16-bit digital
audio. That's a very tiny amount of jitter, and well below
what most current equipment actually achieves.
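For scale, here is the usual back-of-envelope calculation
(my sketch, not from the post): for an N-bit full-scale sine
at frequency f, keeping the sampling-time error below one
LSB requires jitter below roughly 1 / (2 * pi * f * 2^N).
The 1 LSB criterion and 20 kHz test frequency are
assumptions of this sketch; stricter criteria push the
numbers down toward the tens of picoseconds quoted above.

import math

def max_jitter_seconds(bits, freq_hz):
    """Jitter giving about 1 LSB of error at the sine's steepest slope."""
    return 1.0 / (2.0 * math.pi * freq_hz * 2.0**bits)

for bits in (16, 20, 24):
    tj = max_jitter_seconds(bits, 20_000)   # 20 kHz, worst case for audio
    print(f"{bits}-bit, 20 kHz full scale: ~{tj*1e12:7.2f} ps")

# 16 bits works out to roughly 120 ps for a 1 LSB criterion; fractions
# of an LSB, or noise-floor arguments, give much tighter figures.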
Computer systems never convert the 1's and 0's to
time-sensitive analog data; they only need to recover the
1's and 0's, so timing accuracy only has to preserve the
bits, not how accurately they arrive or are delivered. In
this regard, computer systems ARE completely different from
digital audio systems.
Look into digital audio more thoroughly, and realize that
the implementations are not perfect or ideal, and are
sensitive to outside influences. Just because they could
have been and should have been done better, or more nearly
perfectly, does not mean they were! People are not hearing
things; they are experiencing the result of products
designed to a cost point, products that perform the way
they do in the real world because of the design limitations
imposed by the consumer-market price consciousness all the
mid-fi companies live and die by.
With digital cables, there are three things that are paramount:
proper impedance, proper cable termination, and wide bandwidth.
It may be that a particular cable more nearly matches a system's
actual impedance. The second factor, proper termination, includes,
but is not limited to, the actual electrical termination inside
the components as well as the connector on the end of the
cable. If the connector is NOT a true 75 ohm, 110 ohm, or
whatever the system calls for, it will cause minor reflections
in the cable, which makes our old friend JITTER raise its ugly
head again.
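A small sketch (illustrative values, not from the post) of
why connector and termination impedance matter: the fraction
of an edge reflected by a mismatched termination on a
transmission line is (ZL - Z0) / (ZL + Z0). The 35 and 50
ohm figures below are assumptions standing in for typical
non-75-ohm RCA connectors and sloppy terminations.

def reflection_coefficient(z_load, z_line):
    """Voltage reflection coefficient of a load on a line of impedance z_line."""
    return (z_load - z_line) / (z_load + z_line)

z_line = 75.0   # nominal S/PDIF coax impedance in ohms
for z in (75.0, 50.0, 35.0, 110.0):
    gamma = reflection_coefficient(z, z_line)
    print(f"{z:6.1f} ohm termination on a 75 ohm line: "
          f"{abs(gamma)*100:4.1f}% of the edge reflected")

# Those reflections arrive back at the receiver delayed by the cable's
# round-trip time and shift the apparent threshold crossing -- jitter.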
The third factor, bandwidth, is only an issue because both the
AES/EBU and the S/PDIF interface formats were designed before
Sony/Philips knew all there was to know about digital problems,
and they require PERFECT, unlimited-bandwidth cables for the
transmission systems to be free of jitter. The more you limit
the bandwidth, the more jitter. This is a known engineering
fact, and an AES paper was given on this very subject not
too long ago.
"Is the AES/EBU/SPDIF Digital Audio Interface Flawed?"
AES Preprint Number:3360 Convention: 93 1992-10,
Authors: Chris Dunn & Malcolm O. J. Hawksford
The effective data rate of S/PDIF is about 3 MHz, and the
design of the transmitters and receivers is abysmal. Maybe
if everything else had been done right, then cables, etc.
wouldn't matter. But so much was done wrong, or cost-cut to
the point of compromise, that they do come into the picture.
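Here is a rough sketch (my assumptions, not from the post)
of the bandwidth-to-jitter link: a band-limited cable slows
the edges (10-90% rise time of roughly 0.35 / bandwidth for
a simple first-order rolloff), and a slower edge converts
any noise or data-dependent baseline shift at the receiver's
threshold into a larger timing error. The 0.5 V swing and
10 mV of threshold noise are assumed, illustrative values.

swing = 0.5        # assumed S/PDIF-like signal swing at the receiver, volts
noise = 0.01       # assumed 10 mV of noise / baseline shift at the threshold

for bw_mhz in (100.0, 30.0, 10.0, 6.0):
    rise = 0.35 / (bw_mhz * 1e6)     # seconds, approximate 10-90% rise time
    slew = 0.8 * swing / rise        # volts/second through the threshold region
    jitter = noise / slew            # timing error caused by the voltage error
    print(f"{bw_mhz:6.1f} MHz bandwidth: rise ~{rise*1e9:5.1f} ns, "
          f"threshold jitter ~{jitter*1e12:7.1f} ps")

# The less bandwidth the cable and interface leave, the slower the edge
# and the more timing error the same small voltage disturbance produces.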
A good web source for info on jitter is located at:
http://www.digido.com/jitteressay.html
AND:
http://www.audioprecision.com/publicati ... tm#Digital Audio Transmission
I recommend a couple of options for digital interconnect cables in my DIY Interconnect Note.