Talk:Direct-sequence spread spectrum


I am wondering why raising the bitrate "spreads the energy" over a wider frequency band. Does anyone have an idea?

Shannon's law gives us the maximum possible bit rate for a given bandwidth and signal-to-noise ratio: MaxBitRate = Bandwidth * log2(1 + SignalPower/NoisePower)
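Just to make the formula concrete, here is a small sketch (the bandwidth and SNR values are made-up examples, not from the article):

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley channel capacity in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Made-up example numbers: a 3 kHz channel versus a 1 MHz channel,
# both at 30 dB signal-to-noise ratio.
snr = 10 ** (30 / 10)              # 30 dB -> linear power ratio
for bw in (3_000, 1_000_000):
    print(f"B = {bw:>9} Hz  ->  C = {shannon_capacity(bw, snr):,.0f} bit/s")
```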

But I don't see that as an explanation, since it allows us to calculate the MAX bitrate, not the bitrate itself.

Then there's Nyquist's law: the sampling rate must be at least twice the signal's maximum frequency. I don't see this as an explanation either.

One explanation I can imagine is that abruptly switching a signal from one frequency to another generates harmonics. If you switch very often, a greater part of the energy goes into those harmonics. Does this even make sense?  ;-)
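As a rough illustration of what I mean (the frequency and sample rate are arbitrary), a square wave has the same fundamental as a sine, but its abrupt transitions put energy into harmonics:

```python
import numpy as np

# A smooth sine and a square wave share the same fundamental, but the
# square wave's abrupt transitions put energy into the odd harmonics
# (3f, 5f, 7f, ...).  Sample rate and frequency are arbitrary.
fs = 8000                                  # samples per second
t = np.arange(0, 1.0, 1 / fs)              # one second of signal
f0 = 100                                   # fundamental frequency in Hz

sine = np.sin(2 * np.pi * f0 * t)
square = np.where(sine >= 0, 1.0, -1.0)    # same period, abrupt edges

for name, x in (("sine  ", sine), ("square", square)):
    spectrum = np.abs(np.fft.rfft(x)) / len(x)
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    strong = freqs[spectrum > 0.05]        # components carrying real energy
    print(name, "significant components (Hz):", strong)
```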

Anyway, does anyone have any good explanation for this rule?


If you assume that you always use close to the maximum bitrate for a given bandwidth (which makes sense, since otherwise you'd be wasting spectrum), you see that it's only by increasing the bandwidth that you can increase the bitrate (or baud rate, really). Have a look at modem, Quadrature amplitude modulation and some of their linked pages for more info.
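As a rough sketch of what those articles describe (the baud rate and constellation sizes below are only example numbers): the bandwidth limits the baud rate, and the bitrate is the baud rate times the bits carried per symbol.

```python
import math

# Bitrate = symbol (baud) rate * bits per symbol, where an M-point
# constellation carries log2(M) bits per symbol.  Example numbers only.
def bit_rate(symbol_rate_baud, constellation_size):
    return symbol_rate_baud * math.log2(constellation_size)

# e.g. a 2400-baud channel with QPSK (4 points) up to 64-QAM
for M in (4, 16, 64):
    print(f"{M:>3}-point constellation at 2400 baud -> {bit_rate(2400, M):.0f} bit/s")
```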

europrobe 08:50, 2004 Aug 1 (UTC)


Thank you for your answer; I appreciate your help. But IMHO this does not really answer the question. True, Shannon's law tells us that with a wider bandwidth you can get a higher channel capacity. So I agree with you that in order to get a higher bitrate, you need to use a larger bandwidth (or raise the S/N ratio exponentially). But what I do not understand is why raising the bitrate spreads the signal energy over a wider frequency band (as the DSSS article says). What is the physical explanation for this? Let me summarize:

  • A wider band allows for a greater bitrate.
  • But why does a greater bitrate cause a wider band?

Thanks again for your help.

--Ageron 10:32, 1 Aug 2004 (UTC)


Seems like this is more an issue of semantics. A wider band allows for a higher bitrate, but is not (to my knowledge) actually caused by it. –radiojon 00:39, 2004 Aug 2 (UTC)


Well, this is a chicken-and-egg problem. To send at a higher bitrate you need to allow the signal to use more bandwidth; a wider bandwidth does not "create" a higher bitrate. (Digital) frequency modulation, for example, uses two or more frequencies to send information. The more frequencies, the wider the band. Also, rapidly changing between the frequencies creates harmonics, which widen the band. Amplitude modulation uses changes in amplitude (as you are no doubt aware), and rapid changes in amplitude also create harmonics. Other modulations are just variations of these, so the same limitations apply. In other words, a high bitrate actually causes the signal to spread out, due to harmonics and the multiple frequencies being used.
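As a rough numerical illustration (the sample rate and bit rates below are arbitrary), a rectangular binary signal sent at ten times the bit rate needs roughly ten times the bandwidth to hold the same fraction of its energy:

```python
import numpy as np

# Generate random rectangular-pulse binary signals at two bit rates and
# measure how wide a band is needed to hold 90 % of the signal energy.
rng = np.random.default_rng(0)
fs = 100_000                                     # samples per second

def occupied_bandwidth(bit_rate, fraction=0.90):
    samples_per_bit = fs // bit_rate
    bits = rng.choice([-1.0, 1.0], size=2000)
    signal = np.repeat(bits, samples_per_bit)    # abrupt transitions
    power = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    order = np.argsort(power)[::-1]              # strongest bins first
    cum = np.cumsum(power[order])
    keep = order[: np.searchsorted(cum, fraction * power.sum()) + 1]
    return freqs[keep].max()

for rate in (1_000, 10_000):
    print(f"{rate:>6} bit/s -> ~{occupied_bandwidth(rate):,.0f} Hz holds 90% of the energy")
```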

europrobe 19:14, 2004 Aug 2 (UTC)