Talk:Comparison of synchronous and asynchronous signalling
The description of "synchronous transmission" provided in the article is inconsistent with various physical synchronous interfaces, e.g., SMPTE-310M. While a synchronizing signal is required (and provided) to receive the transmission, it need not travel "on another wire"; the synchronizing signal can instead be embedded within the synchronous signal itself.
From SMPTE/EBU Task Force for Harmonized Standards for the Exchange of Program Material as Bitstreams[1]:
"Synchronous transmission" describes a transmission technique that requires a common clock signal (or timing reference) between two communicating devices to coordinate their transmissions. This common reference can be embedded within the signal, or can physically travel along with the signal on a similar or different medium.
"Asynchronous transmission" describes any transmission technique that does not require a common clock between the two communicating devices, but instead derives timing from special bits or characters (e.g. start/stop bits, flag characters) in the data stream itself. The essential characteristic of asynchronous time-scales or signals is that their corresponding significant instants do not necessarily occur at the same average rate.
Perhaps confusing the issue are cases of non-linear signals, e.g., when compressed video is carried, wherein the video time base has an asynchronous relationship to the interface clocks.
- I'm confused. When I read that SMPTE/EBU document, I see the same definition for "asynchronous transmission", but I don't see any definition in that document for "synchronous transmission".
- Instead I see:
- Synchronous: A term used to describe a transmission technique that requires a common clock signal (or timing reference) between two communicating devices to coordinate their transmissions.
- ...
- E.3.3.3. Synchronization: Streaming data requires timing synchronization between the transmitters and receiver(s). This timing synchronization may be achieved through either the recovery of timing references embedded within the stream, or through the distribution of a system-wide clock to all participating devices.
- ...
- That document does talk a lot about embedding timing reference in the data stream.
- But I don't see anywhere that it defines that as "synchronous".
- I'm also going to argue that the "timing synchronization" mentioned can be achieved through timing references embedded within an asynchronous transmission stream.
- I can't find even one place where that document uses the phrase "common reference can be embedded within the signal".
- Therefore, I think "single wire synchronous transmission" should really be better classified as "asynchronous".
- If you expand the definition of "synchronous" to include not only two-signal protocols but also single-signal protocols, then what is left for the word "asynchronous" to describe?
- --68.0.124.33 (talk) 03:26, 7 May 2008 (UTC)
Let me explain my view, hope it helps:
The "asynchronous" character transmission is asynchronous with respect to the characters: these may arrive at any time, and there is no need for a common clock. However, on the bit level a short-term synchronisation is required, which is thus more correctly termed "plesiochronous", meaning nearly synchronous. Better would be piecemeal synchronous. Usual implementations have an interface where just the binary values of the bits are transmitted, and the data terminal has the devices to extract the data bits from this raw demodulated data stream. The characters are received practically at the same time the sender sent them, plus some travel delay (which can be seconds for satellite connections). Note that both sides have to agree on a timing parameter, i.e. the bit time.
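To make the start/stop mechanism concrete, here is a rough sketch (in Python, with invented names; assuming the common 8N1 framing, where both ends have already agreed on the bit time and each list element is one sample per bit) of how a receiver could recover characters from such a raw demodulated bit stream:

```python
def decode_async_frames(line_levels):
    """Decode 8N1 characters from a list of 0/1 line levels, one sample per bit.

    The idle level is 1; a 0 marks a start bit, followed by 8 data bits
    (LSB first) and a stop bit at level 1.
    """
    out = []
    i = 0
    while i < len(line_levels):
        if line_levels[i] == 1:          # idle line, no character in progress
            i += 1
            continue
        # line_levels[i] == 0: start bit found
        if i + 9 >= len(line_levels) or line_levels[i + 9] != 1:
            break                         # framing error or truncated stream
        bits = line_levels[i + 1:i + 9]   # 8 data bits, LSB first
        value = 0
        for n, b in enumerate(bits):
            value |= b << n
        out.append(value)
        i += 10                           # start + 8 data + stop
    return out
```

Note how synchronisation only has to hold for the ten bits of one frame; between frames the line can idle for any length of time, which is exactly the "piecemeal synchronous" behaviour described above.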
The "synchronous" character transmission used in e.g. HDLC is more complicated, and uses bit-stuffing and byte-stuffing to ensure enough clock information so that the receiver can use a clock that is synchronous, i.e. has exactly the same frequency for more than just a single character. When this transmission was used in computers, the modem recovered the clock and used two lines, a clock line and a data line. Thus we had an explicit clock synchronised with the sender on the interface. However, this was not really necessary, and I am not really sure it was always used, because the data terminal could also recover the clock from the demodulated signal, as above. It had to do a similar operation anyhow on the character level, i.e. remove the stuffing bytes. And after removal of bit and byte stuffing, the byte stream of the HDLC synchronous transmission was as asynchronous as that of the previous case. Note that the receiver could automatically determine the bit timing, even if normally this was a parameter to be set.
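The bit-stuffing mentioned above can be sketched as follows (illustrative Python, not production code): after five consecutive 1s the sender inserts a 0, so the flag pattern 01111110 can never appear inside the payload and the line always has enough transitions for the receiver to keep its recovered clock locked.

```python
def stuff(bits):
    """Insert a 0 after every run of five 1s (HDLC-style bit stuffing)."""
    out, run = [], 0
    for b in bits:
        out.append(b)
        run = run + 1 if b == 1 else 0
        if run == 5:
            out.append(0)   # stuffing bit: guarantees a transition
            run = 0
    return out

def unstuff(bits):
    """Drop the 0 that follows every run of five 1s (receiver side)."""
    out, run = [], 0
    for b in bits:
        if run == 5:        # this must be the stuffed 0: discard it
            run = 0
            continue
        out.append(b)
        run = run + 1 if b == 1 else 0
    return out
```

(A real receiver would additionally treat a sixth consecutive 1 as the start of a flag; that case is omitted here for brevity.)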
So the EBU document is not really a definition of terms, but highlights the features of their interest, which are also commonly used: in a synchronous system you have a clock line; if not, it is asynchronous. Which is simple, but not really useful. By that criterion the raw data stream of a synchronous modem would be asynchronous, as would every optical connection and practically every serial communication that uses only a single line. But in fact Ethernet and most other newer serial communication systems use a synchronous method for a block of data, e.g. a phase-encoded signal, so they are block-synchronous systems. And as far as I know, ADSL etc. set up a synchronous clock for as long as the modem is "online".
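As a concrete example of such phase encoding, here is a minimal Manchester-coding sketch (Python, with the 802.3 convention assumed; the decoder also assumes the receiver is already aligned on bit-cell boundaries): every bit cell contains a transition, so the clock is embedded in the data itself and no separate clock line is needed for the duration of a block.

```python
def manchester_encode(bits):
    """Encode each bit as two half-cells: 1 -> low-high, 0 -> high-low."""
    out = []
    for b in bits:
        out += [0, 1] if b == 1 else [1, 0]
    return out

def manchester_decode(halves):
    """Recover bits from aligned half-cell pairs: low-high -> 1, high-low -> 0."""
    return [1 if halves[i] == 0 else 0 for i in range(0, len(halves), 2)]
```

The guaranteed mid-cell transition is what lets the receiver keep its clock locked across a whole block, which is why such links are block-synchronous even though they use a single line.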
There is also a really asynchronous system which, because of the sloppy use of the term "asynchronous", calls itself "self-timed" or similar. In this technology, the reception of each bit is signalled on an extra return line (normally), so there is no timing agreement at all.
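A toy illustration of this self-timed idea (Python, with names invented for the sketch; real self-timed circuits do this in hardware with request/acknowledge wires): each datum is explicitly acknowledged before the next one is offered, so neither side needs any notion of a shared bit time.

```python
def self_timed_transfer(data):
    """Simulate a four-phase request/acknowledge handshake per datum."""
    received = []
    for value in data:
        bus, req = value, 1        # sender: place data, raise request
        if req == 1:               # receiver: sees request, latches data
            received.append(bus)
            ack = 1                # ...and raises acknowledge
        if ack == 1:               # sender: sees acknowledge, drops request
            req = 0
            ack = 0                # receiver: drops acknowledge, cycle done
    return received
```

However fast or slow either side runs, the handshake alone paces the transfer, which is what makes this genuinely asynchronous.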
To summarize, it must be said clearly on which level the transmission is synchronous or asynchronous: bits, bytes, or blocks?