Talk:Direct-sequence spread spectrum
I am wondering why raising the bitrate "spreads the energy" over a wider frequency band. Does anyone have an idea?
Shannon's law gives us the maximum possible bit rate for a given frequency band and signal-to-noise ratio: MaxBitRate = Bandwidth * log2(1 + SignalPower/NoisePower)
But I don't see that as an explanation, since it allows us to calculate the MAX bit rate, not the actual bit rate.
Then there's Nyquist's law: the sampling rate must be at least twice the signal's maximum frequency. I don't see this as an explanation either.
One explanation I can imagine is that abruptly switching a signal from one state to another generates harmonics. If you switch very often, a greater part of the energy goes into those harmonics. Does this even make sense? ;-)
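For what it's worth, the spreading itself is easy to check numerically: a rectangular pulse of duration T has a sinc²-shaped power spectrum whose main lobe is roughly 2/T wide, so shortening the pulses (i.e. raising the chip rate) widens the occupied spectrum. Here is a quick sketch in Python/NumPy; the 90%-energy bandwidth measure and all the parameters are just my own arbitrary choices, not anything standard:

```python
import numpy as np

rng = np.random.default_rng(0)

def bandwidth_90(chip_rate, fs=1000, duration=10.0):
    """Two-sided bandwidth (Hz) containing 90% of the energy of a random
    +/-1 rectangular-pulse sequence at the given chip rate.
    Assumes fs is an integer multiple of chip_rate."""
    samples_per_chip = int(fs / chip_rate)
    n_chips = int(duration * chip_rate)
    chips = rng.choice([-1.0, 1.0], size=n_chips)
    signal = np.repeat(chips, samples_per_chip)   # rectangular pulses
    power = np.abs(np.fft.fft(signal)) ** 2       # power spectrum
    freqs = np.fft.fftfreq(len(signal), d=1.0 / fs)
    order = np.argsort(power)[::-1]               # strongest bins first
    cum = np.cumsum(power[order])
    k = np.searchsorted(cum, 0.9 * cum[-1])       # bins holding 90% of energy
    return 2.0 * np.abs(freqs[order[:k + 1]]).max()

# Ten times the chip rate should give roughly ten times the bandwidth.
print(bandwidth_90(10), bandwidth_90(100))
```

With these numbers the faster sequence occupies a band about an order of magnitude wider, which matches the sinc² main-lobe argument.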
Anyway, does anyone have any good explanation for this rule?