Talk:Direct-sequence spread spectrum

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by 82.224.127.182 (talk) at 20:34, 31 July 2004 (Why Higher bitrate <=> Energy spread on larger frequency band ?). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

I am wondering why raising the bitrate "spreads the energy" on a wider frequency band. Does anyone have an idea?

Shannon's law gives us the maximum possible bit rate for a given bandwidth and signal-to-noise ratio: MaxBitRate = Bandwidth * log2(1 + SignalPower/NoisePower)
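For concreteness, the formula above can be evaluated directly; the helper name below is purely illustrative, not anything from this discussion:

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_linear):
    """Shannon-Hartley limit: C = B * log2(1 + S/N), with S/N linear."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# 1 MHz of bandwidth at a linear SNR of 1000 (30 dB) gives a
# capacity of roughly 9.97 Mbit/s.
capacity = shannon_capacity_bps(1e6, 1000)
```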

But I don't see that as an explanation, since it allows us to calculate the maximum bit rate, not the bit rate itself.

Then there's Nyquist's law: the sampling rate must be at least twice the signal's maximum frequency. I don't see this as an explanation either.

One explanation I can imagine is that abruptly switching a signal from one state to another generates harmonics. If you switch more often, a greater part of the energy goes into those harmonics. Does this even make sense?  ;-)
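That intuition can be checked numerically: the spectrum of a rectangular pulse lasting one bit period T is a sinc shape whose first null falls at f = 1/T, so doubling the bit rate doubles the width of the main spectral lobe. A minimal sketch, assuming NumPy and a hypothetical helper name:

```python
import numpy as np

def first_null_hz(bit_rate, fs=1_000_000, n_fft=1 << 20):
    """Return the frequency of the first spectral null of a single
    rectangular pulse of duration 1 / bit_rate, sampled at fs and
    analysed with a zero-padded FFT."""
    n_samples = int(round(fs / bit_rate))      # samples in one bit period
    pulse = np.ones(n_samples)
    spectrum = np.abs(np.fft.rfft(pulse, n_fft))
    freqs = np.fft.rfftfreq(n_fft, d=1.0 / fs)
    # The main lobe decreases monotonically from DC down to the first
    # null, so the first upward step in magnitude marks the null.
    null_index = np.argmax(np.diff(spectrum) > 0)
    return freqs[null_index]

# first_null_hz(10_000) lands near 10 kHz, first_null_hz(20_000)
# near 20 kHz: the occupied band scales with the bit rate.
```

The same scaling is why DSSS works in reverse: multiplying the data by a much faster chip sequence shortens the effective pulse duration and spreads the energy over a proportionally wider band.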

Anyway, does anyone have any good explanation for this rule?