
December 2nd 03, 09:15 AM
posted to uk.rec.audio
Co-ax SPDIF digital out
In article , Ian Molton
wrote:
On Mon, 01 Dec 2003 23:20:58 GMT "Bedouin"
wrote:
I'm not sure what part of the "square wave" a DAC typically detects.
If it is a zero-crossing point, which in many ways seems the most
sensible, then loss of the high frequencies will have little effect; if
it works on a point away from the zero crossing then loss of the high
frequencies could be very significant.
If the jitter is considered noise, and thus phaseless, then it may have
no effect on the reconstruction at all, as 16 bits are needed to form a
sample and the jitter may very well totally cancel itself out in that
time (plus there's plenty of time for a PLL to smooth things over,
AIUI).
This should be so *if* the jitter is unrelated to the signal, and *if* a
PLL is used that has a long enough smoothing time to suppress any clock
problems due to medium term and short term jitter induced by shape changes.
The problem (in principle) is that detected jitter may be signal dependent
as it stems from the changed shape of the waveform and the effect this
might have upon the instants at which the receiver decides it has seen a
transition. Also, I am not sure that all DAC/receivers have effective PLLs.
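As a rough back-of-envelope sketch of that mechanism (Python, with purely
illustrative figures - the roughly 0.5 V swing assumed here is a ballpark
for consumer S/PDIF on coax, not a number from the posts above): a fixed
decision threshold on a finite-slope edge turns any voltage error into a
timing error of about dv divided by the slew rate.

def edge_timing_error_ns(noise_mv, rise_time_ns, swing_mv=500.0):
    # A fixed threshold on a finite-slope edge converts a voltage error
    # into a timing error of roughly dv / slew_rate.
    slew_mv_per_ns = swing_mv / rise_time_ns
    return noise_mv / slew_mv_per_ns

# 10 mV of threshold/rail noise on a sharp 5 ns edge vs a cable-rounded
# 25 ns edge:
print(edge_timing_error_ns(10, 5))    # 0.1 ns
print(edge_timing_error_ns(10, 25))   # 0.5 ns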
TBH though I reckon USB audio would be a decent alternative to all this
unidirectional crap ;-)
I have the feeling that we have had this discussion before, elsewhere...
:-)
Slainte,
Jim
--
Electronics http://www.st-and.ac.uk/~www_pa/Scot...o/electron.htm
Audio Misc http://www.st-and.demon.co.uk/AudioMisc/index.html
Armstrong Audio http://www.st-and.demon.co.uk/Audio/armstrong.html
Barbirolli Soc. http://www.st-and.demon.co.uk/JBSoc/JBSoc.html

December 2nd 03, 10:40 AM
posted to uk.rec.audio
Co-ax SPDIF digital out
Stewart Pinkerton wrote:
On Mon, 01 Dec 2003 23:20:58 GMT, "Bedouin"
wrote:
"Jim Lesurf" wrote in message
...
In article , "Bedouin" wrote: [...]
In principle, losing some of the 'squareness' of the edges
should not matter if the receiver can still read the data OK. Although
there may be a possible effect of 'data induced jitter' of the kind that
Julian Dunn and others have discussed in the past. In practice I have my
doubts about this being a serious problem, though, in most cases.
It was the introduction of Jitter due to the rounding of the signal that I
was thinking about.
I'm not sure what part of the "square wave" a DAC typically detects. If
it is a zero-crossing point, which in many ways seems the most sensible,
then loss of the high frequencies will have little effect; if it works on
a point away from the zero crossing then loss of the high frequencies
could be very significant.
The point is not the nominal value of the trigger voltage, but the
*slope* of the edges of the 'square' wave. Many poorer S/PDIF input
receivers have trigger points which are a simple fraction of the rail
voltage, so that noise on the rail voltage directly affects the
trigger point. It follows that jitter is inversely proportional to the
slope of the edge, since an instantaneous transition (i.e. a truly
vertical edge) would cause no jitter. How much of this jitter is
*audible*, is another question, but it's a problem which was solved
many decades ago in the comms industry. As ever, so-called 'high end'
audio is decades out of date with real engineering practice.
Interesting but completely wrong. The SPDIF first has to be deserialised.
The data has the clock embedded in it which first has to be recovered
before the data can be reclocked and fed into a shift register. A PLL is
used to recover the clock, and jitter in the SPDIF clock edges will have
little effect on this, as the PLL time constant is much longer than the jitter.
All that happens is that when the bit is sampled (in the middle not the
edge) the jitter causes the S/N ratio to decrease which means there is an
increasing probability of a wrong bit being detected. Once a word of bits
has been assembled it is then clocked into a DAC in parallel. The only
jitter therefore in the output analogue signal is that introduced by the
internal clock used to clock the word into the DAC. Jitter in the SPDIF
signal is not present in the output.
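A toy model of that mid-cell sampling (Python; a plain NRZ-style stream
rather than real S/PDIF framing, purely to illustrate the point): the
decoded bits only change if the sampling instant is dragged past a cell
boundary, i.e. by more than half a bit cell.

def sample_mid_cell(levels, cell_ns, phase_error_ns=0.0):
    # Sample the line level in the middle of each bit cell; a phase
    # error (or jitter) smaller than half a cell leaves every decision
    # untouched.
    bits = []
    t = cell_ns / 2 + phase_error_ns
    while 0 <= t < len(levels) * cell_ns:
        bits.append(levels[int(t // cell_ns)])
        t += cell_ns
    return bits

data = [1, 0, 1, 1, 0, 0, 1]
assert sample_mid_cell(data, cell_ns=40, phase_error_ns=15) == data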
Ian

December 2nd 03, 04:18 PM
posted to uk.rec.audio
Co-ax SPDIF digital out
On Tue, 02 Dec 2003 10:40:52 +0000, Ian Bell
wrote:
Stewart Pinkerton wrote:
On Mon, 01 Dec 2003 23:20:58 GMT, "Bedouin"
wrote:
"Jim Lesurf" wrote in message
...
In article , "Bedouin" wrote: [...]
In principle, losing some of the 'squareness' of the edges
should not matter if the receiver can still read the data OK. Although
there may be a possible effect of 'data induced jitter' of the kind that
Julian Dunn and others have discussed in the past. In practice I have my
doubts about this being a serious problem, though, in most cases.
It was the introduction of Jitter due to the rounding of the signal that I
was thinking about.
I'm not sure what part of the "square wave" a DAC typically detects. If
it is a zero-crossing point, which in many ways seems the most sensible,
then loss of the high frequencies will have little effect; if it works on
a point away from the zero crossing then loss of the high frequencies
could be very significant.
The point is not the nominal value of the trigger voltage, but the
*slope* of the edges of the 'square' wave. Many poorer S/PDIF input
receivers have trigger points which are a simple fraction of the rail
voltage, so that noise on the rail voltage directly affects the
trigger point. It follows that jitter is inversely proportional to the
slope of the edge, since an instantaneous transition (i.e. a truly
vertical edge) would cause no jitter. How much of this jitter is
*audible*, is another question, but it's a problem which was solved
many decades ago in the comms industry. As ever, so-called 'high end'
audio is decades out of date with real engineering practice.
Interesting but completely wrong. The SPDIF first has to be deserialised.
The data has the clock embedded in it which first has to be recovered
before the data can be reclocked and fed into a shift register. A PLL is
used to recover the clock, and jitter in the SPDIF clock edges will have
little effect on this, as the PLL time constant is much longer than the jitter.
Interesting, but completely wrong....................
Any jitter which is present in the incoming datastream will be
attenuated very roughly by the ratio of the main jitter frequency
(commonly 50/100 Hz from the power rails) to the PLL loop bandwidth
(commonly 10 Hz or so, to achieve lock quickly). As you can see, this
isn't much in the way of attenuation! A few of the better DACs
incorporate double PLLs with a secondary loop having a time constant
of a second or so, allowing much better jitter suppression, but the
basic fact remains that jitter in the incoming datastream can only be
*reduced* by the PLL, not eliminated. For that, you require to reclock
the datastream from an independent free-running low-noise clock, and
this is *very* rare.
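Putting rough numbers on that (Python sketch, treating the PLL's jitter
transfer as a single-pole low-pass; the corner frequencies are simply the
figures quoted above, with a ~1 s time constant taken as a ~0.16 Hz corner):

import math

def jitter_attenuation_db(jitter_hz, loop_corner_hz):
    # First-order jitter-transfer model: incoming jitter above the loop
    # corner is attenuated at 20 dB/decade.
    ratio = jitter_hz / loop_corner_hz
    return 20 * math.log10(math.sqrt(1 + ratio ** 2))

print(jitter_attenuation_db(100, 10))    # ~20 dB with a ~10 Hz loop
print(jitter_attenuation_db(100, 0.16))  # ~56 dB with a ~1 s secondary loop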
All that happens is that when the bit is sampled (in the middle not the
edge)
Again, completely wrong, as it's the edge transition which is
detected.
the jitter causes the S/N ratio to decrease which means there is an
increasing probability of a wrong bit being detected.
Excuse me? Noise immunity is a *completely* separate issue from jitter
effects, the only common ground being that either can cause bits to be
dropped in extreme cases. OTOH, I have *never* heard of dropped bits
occurring in a domestic transport/DAC situation.
Once a word of bits
has been assembled it is then clocked into a DAC in parallel. The only
jitter therefore in the output analogue signal is that introduced by the
internal clock used to clock the word into the DAC. Jitter in the SPDIF
signal is not present in the output.
Absolute garbage, since that 'internal clock' is a PLL which is slaved
to the incoming data stream. You do know what 'slaved' means, I trust?
As noted, there are a very few DACs to which the above does not apply,
since they genuinely reclock the data, but in the vast majority of
cases, output jitter is directly proportional to input jitter.
--
Stewart Pinkerton | Music is Art - Audio is Engineering

December 2nd 03, 04:29 PM
posted to uk.rec.audio
Co-ax SPDIF digital out
On Tue, 2 Dec 2003 16:18:34 +0000 (UTC)
(Stewart Pinkerton) wrote:
For that, you require to reclock
the datastream from an independent free-running low-noise clock, and
this is *very* rare.
the important detail being that unless you have BI-DIRECTIONAL data transfer you can never reliably re-clock the data, unless you have *gargantuan* buffers (mind you, with today's memory prices...)
the point is that in a continuous stream, sooner or later you will have a problem, as the free-running local DAC won't be running at exactly the same speed as the data stream, so it will eventually either over- or under-run its data buffer and have either some skipped or lost data, or some 'dodgy' compensation scheme.
what's needed to do it *right* is a data stream that can deliver the data *faster* than it's needed, and a source that can pause its transfer.
then the source can transfer at full speed to the DAC's buffers, and the DAC can say 'no more please, I'm full'.
then as the buffer drains the DAC can request more data.
this utterly eliminates jitter, and frankly I'm amazed the high-end HiFi audio industry hasn't cottoned on to the idea yet, if not because it's audibly better then because it means they can sell a whole load more interconnects and new DACs etc.
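A minimal sketch of that flow-controlled arrangement (Python; the class and
watermark names are invented for illustration, not any real protocol):

from collections import deque

class FlowControlledSink:
    # The DAC side owns a free-running local sample clock and a FIFO with
    # high/low watermarks; the source bursts data in faster than real time
    # and pauses when told the buffer is full.
    def __init__(self, high=4096, low=1024):
        self.fifo = deque()
        self.high = high   # tell the source to pause above this level
        self.low = low     # ask the source for more below this level

    def wants_more(self):
        return len(self.fifo) < self.low

    def push(self, samples):
        if len(self.fifo) >= self.high:
            return False   # "no more please, I'm full"
        self.fifo.extend(samples)
        return True

    def pull_one(self):
        # Called once per tick of the local low-jitter clock; output
        # timing never depends on when the link delivered the data.
        return self.fifo.popleft() if self.fifo else 0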
--
Spyros lair: http://www.mnementh.co.uk/ |||| Maintainer: arm26 linux
Do not meddle in the affairs of Dragons, for you are tasty and good with ketchup.

December 2nd 03, 07:36 PM
posted to uk.rec.audio
Co-ax SPDIF digital out
Stewart Pinkerton wrote:
On Tue, 02 Dec 2003 10:40:52 +0000, Ian Bell
wrote:
Interesting but completely wrong. The SPDIF first has to be deserialised.
The data has the clock embedded in it which first has to be recovered
before the data can be reclocked and fed into a shift register. A PLL is
used to recover the clock, and jitter in the SPDIF clock edges will have
little effect on this, as the PLL time constant is much longer than the jitter.
Interesting, but completely wrong....................
Any jitter which is present in the incoming datastream will be
attenuated very roughly by the ratio of the main jitter frequency
(commonly 50/100 Hz from the power rails) to the PLL loop bandwidth
(commonly 10 Hz or so, to achieve lock quickly). As you can see, this
isn't much in the way of attenuation! A few of the better DACs
incorporate double PLLs with a secondary loop having a time constant
of a second or so, allowing much better jitter suppression, but the
basic fact remains that jitter in the incoming datastream can only be
*reduced* by the PLL, not eliminated. For that, you require to reclock
the datastream from an independent free-running low-noise clock, and
this is *very* rare.
Rare it may be, but that was precisely what I said.
All that happens is that when the bit is sampled (in the middle not the
edge)
Again, completely wrong, as it's the edge transition which is
detected.
Not completely wrong. SPDIF is FM (biphase-mark) encoded. The clock is
encoded as a transition at the boundary of each bit cell. For data, zeros
are encoded with no transition in the centre of the bit cell and ones with
an extra transition in the centre of the bit cell.
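In Python terms, an illustrative encode/decode of that biphase-mark scheme
(two half-cells per bit; a sketch, not a full S/PDIF framer):

def bmc_encode(bits, level=0):
    # Biphase-mark: toggle at every bit-cell boundary, and toggle again
    # mid-cell for a 1.
    halves = []
    for b in bits:
        level ^= 1
        halves.append(level)
        if b:
            level ^= 1
        halves.append(level)
    return halves

def bmc_decode(halves):
    # A mid-cell transition (the two half-cells differ) is a 1; no
    # mid-cell transition is a 0.
    return [1 if halves[i] != halves[i + 1] else 0
            for i in range(0, len(halves) - 1, 2)]

assert bmc_decode(bmc_encode([1, 0, 1, 1, 0])) == [1, 0, 1, 1, 0]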
the jitter causes the S/N ratio to decrease which means there is an
increasing probability of a wrong bit being detected.
Excuse me? Noise immunity is a *completely* separate issue from jitter
effects, the only common ground being that either can cause bits to be
dropped in extreme cases.
Amplitude noise and temporal noise (jitter) both affect the probability of
correctly decoding a bit, and therefore the effective S/N ratio. No matter
what the S/N ratio, with Gaussian noise there is always a finite
probability that the bit will be wrongly decoded.
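For what it's worth, the usual Gaussian tail puts a number on how finite
(and how small) that probability is (Python; the margin-to-noise ratios are
just example values):

import math

def bit_error_probability(margin_over_sigma):
    # Probability that Gaussian noise pushes the level across the decision
    # threshold: Q(x) = 0.5 * erfc(x / sqrt(2)).
    return 0.5 * math.erfc(margin_over_sigma / math.sqrt(2))

for x in (3, 6, 10):
    print(x, bit_error_probability(x))
# roughly 1.3e-3, 1e-9 and 8e-24: always finite, but vanishingly small at
# realistic margins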
OTOH, I have *never* heard of dropped bits
occurring in a domestic transport/DAC situation.
We are not talking about a transport/DAC situation but about an SPDIF connection.
SPDIF has *no* error correction, so bad bits *will* get through.
Once a word of bits
has been assembled it is then clocked into a DAC in parallel. The only
jitter therefore in the output analogue signal is that introduced by the
internal clock used to clock the word into the DAC. Jitter in the SPDIF
signal is not present in the output.
Absolute garbage, since that 'internal clock' is a PLL which is slaved
to the incoming data stream. You do know what 'slaved' means, I trust?
I am well aware of the term slaved and if you had read my post more closely
you would have realised I was *not* referring to a slaved clock.
As noted, there are a very few DACs to which the above does not apply,
since they genuinely reclock the data, but in the vast majority of
cases, output jitter is directly proportional to input jitter.
This all assumes we are discussing an SPDIF connection to a DAC. For an
SPDIF connection to a digital recorder, for example, any jitter in the SPDIF
signal is irrelevant. As I have said many times in the past, the only
jitter that is important is that introduced by the final DAC, and more often
than not that is not connected to an SPDIF stream.
Ian

December 2nd 03, 08:00 PM
posted to uk.rec.audio
Co-ax SPDIF digital out
On Tue, 02 Dec 2003 09:15:24 +0000 (GMT)
Jim Lesurf wrote:
TBH though I reckon USB audio would be a decent alternative to all this
unidirectional crap ;-)
I have the feeling that we have had this discussion before, elsewhere...
:-)
Yeah I was thinking that ;-)
One thing we never did resolve though...
Why *is* audio digital transport done in a way that has no feedback?
--
Spyros lair: http://www.mnementh.co.uk/ |||| Maintainer: arm26 linux
Do not meddle in the affairs of Dragons, for you are tasty and good with ketchup.