RFI

 

A computer uses all kinds of clocks: the CPU, PCI bus, USB and so on are all driven by a clock.
Of course our logical 1s and 0s only exist in our minds. Inside the computer they are represented by switching a voltage on and off.
As with any switching device, this generates RFI.
In the process, harmonics are generated, a bit like an acoustic instrument: you play an A and, due to the vibration of the instrument, harmonics higher in frequency are produced.

 

I'm not an expert on the subject but here is my very simplified explanation.

Let's take a continuous square wave with a 50% duty cycle (otherwise known as a "clock"). What's on the wire (ignoring the radiated signal for now) is a series of harmonics (mostly odd). The power of each harmonic is determined by the rise time of the clock: fast rise times give more power to the higher harmonics. The distribution of power among the harmonics is not determined by the absolute value of the rise time but by the ratio of the rise time to the clock period.
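If you want to see that with actual numbers, here's a quick Python/NumPy sketch (my own toy example, nothing authoritative): it builds a 50% duty-cycle clock as a simple trapezoid, takes an FFT, and prints the harmonic levels. The 1 MHz frequency, the 10 ns rise time and the trapezoid shape are just assumptions picked for illustration.

import numpy as np

def clock_harmonics(f0, t_rise, n_harmonics=9, fs_mult=1000, n_periods=64):
    # Build a whole number of periods of a 50% duty-cycle trapezoidal clock and FFT it.
    fs = f0 * fs_mult                      # sample rate, far above the harmonics we look at
    n_samples = n_periods * fs_mult        # integer number of periods -> no spectral leakage
    phase = (np.arange(n_samples) / fs * f0) % 1.0   # position inside each period, 0..1
    tr = t_rise * f0                       # rise time as a fraction of the period
    wave = np.interp(phase, [0, tr, 0.5, 0.5 + tr, 1.0], [0, 1, 1, 0, 0])
    spectrum = np.abs(np.fft.rfft(wave)) / n_samples
    fund_bin = n_periods                   # the fundamental lands exactly on this FFT bin
    amps = spectrum[fund_bin * np.arange(1, n_harmonics + 1)]
    return [(n, 20 * np.log10(a / amps[0] + 1e-20)) for n, a in enumerate(amps, start=1)]

# 1 MHz clock with 10 ns edges: the odd harmonics dominate and the even ones are way down.
for n, db in clock_harmonics(1e6, 10e-9):
    print(f"harmonic {n}: {db:7.1f} dB relative to the fundamental")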

Thus, in a real-world situation where the rise time stays fixed, the amount of power in the harmonics actually goes down as the frequency goes up. At first this sounds like it can't be right, but it makes sense when you look at what happens to a clock with a fixed rise time as you vary the frequency.

Remember that a pure sine wave has no harmonics. Harmonics arise when a signal changes value faster than a sine wave would. The further a signal is from looking like a sine wave, the stronger the harmonics.

So let's look at the clock. Zoom in on the edges and you see that they actually look like "S" curves. In between the "S" curves are flat sections. Let's increase the frequency: the S curves stay the same, but the time spent in the flat sections decreases. As the frequency gets higher and higher, eventually you have no flat sections left and the signal goes straight from one S curve to the next. What does that look like? A sine wave. Sine waves have no harmonics. There you have it: a hopefully understandable explanation of why, as you increase the clock speed, the amount of power in the harmonics goes down.
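A quick numerical sanity check of that claim (again just a rough sketch with made-up numbers, same trapezoid model as above): keep the edges fixed at 10 ns on every clock and look at what fraction of the signal's AC power ends up in the harmonics as the frequency goes up. As the period shrinks toward the rise time, the trapezoid turns into something close to a sine wave and the harmonic share collapses.

import numpy as np

def harmonic_power_fraction(f0, t_rise, fs_mult=1000, n_periods=64):
    # Fraction of the waveform's AC power sitting above the fundamental.
    fs = f0 * fs_mult
    n_samples = n_periods * fs_mult
    phase = (np.arange(n_samples) / fs * f0) % 1.0
    tr = t_rise * f0                       # the same rise time is a bigger slice of a shorter period
    wave = np.interp(phase, [0, tr, 0.5, 0.5 + tr, 1.0], [0, 1, 1, 0, 0])
    power = (np.abs(np.fft.rfft(wave)) / n_samples) ** 2
    fund_bin = n_periods                   # the fundamental lands exactly on this FFT bin
    return power[2 * fund_bin:].sum() / power[fund_bin:].sum()

for f0 in (1e6, 5e6, 20e6, 45e6):          # same 10 ns edges on every clock
    frac = harmonic_power_fraction(f0, 10e-9)
    print(f"{f0 / 1e6:5.0f} MHz clock: {100 * frac:5.2f}% of the AC power is in the harmonics")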

But with higher frequencies the harmonics are more spread out. Say for a 1 MHz clock you have stuff at 1, 3, 5, 7, 9 MHz and so on. With a 10 MHz clock you have nothing below 10 MHz; you get 10, 30, 50, 70 MHz ... So what's the intensity of the harmonics? The lower clock has a higher density of harmonics; the higher clock has big holes where there is nothing. Even though the 5th harmonic of the 10 MHz clock is weaker than the 5th harmonic of the 1 MHz clock, at 50 MHz the 5th harmonic of the 10 MHz clock is quite a bit stronger than the 50th harmonic of the 1 MHz clock.
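Here's that comparison with rough numbers (my own sketch, using the standard trapezoidal-spectrum envelope |c_n| ~ 2*A*d*|sinc(n*d)|*|sinc(n*tr/T)|; the 2 ns rise time is an assumption). One wrinkle: with an exactly 50% duty cycle the even harmonics are nulled, so for the 1 MHz clock I use its nearest odd harmonic to 50 MHz, the 49th.

import numpy as np

def harmonic_amplitude(f0, n, t_rise, amp=1.0, duty=0.5):
    # Envelope approximation for the n-th harmonic of a trapezoidal clock:
    # |c_n| ~ 2*amp*duty*|sinc(n*duty)|*|sinc(n*t_rise/T)|, with sinc(x) = sin(pi*x)/(pi*x)
    return 2 * amp * duty * abs(np.sinc(n * duty)) * abs(np.sinc(n * t_rise * f0))

t_rise = 2e-9                                      # same 2 ns edges on both clocks
slow = harmonic_amplitude(1e6, 49, t_rise)         # 49th harmonic of the 1 MHz clock -> 49 MHz
fast = harmonic_amplitude(10e6, 5, t_rise)         # 5th harmonic of the 10 MHz clock -> 50 MHz
print(f"1 MHz clock near 50 MHz : {slow:.4f}")
print(f"10 MHz clock at 50 MHz  : {fast:.4f}")
print(f"the 10 MHz clock's harmonic is roughly {fast / slow:.0f}x stronger")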

So whether this is going to be a problem depends a lot on what you are measuring. If your system is equally sensitive to all frequencies from DC to infinity, the high-frequency clock has less effect. If you are sensitive to lower frequencies only, the low-frequency clock might have more of an effect.

The low-frequency clock has more energy at lower frequencies and its harmonic density is greater. The high-frequency clock's energy is more spread out but has big holes. If your range of sensitivity falls in one of those holes, you might get nothing.
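As a toy example of those holes (my own made-up numbers, nothing measured): suppose the circuit you care about is only sensitive between 14 and 16 MHz, and add up the harmonic power each clock drops into that band, using the same trapezoid envelope as before.

import numpy as np

def harmonic_amplitude(f0, n, t_rise, amp=1.0, duty=0.5):
    # Same trapezoidal envelope approximation as in the sketch above.
    return 2 * amp * duty * abs(np.sinc(n * duty)) * abs(np.sinc(n * t_rise * f0))

def power_in_band(f0, t_rise, lo, hi):
    # Add up the power of every harmonic of f0 that lands between lo and hi.
    first, last = int(np.ceil(lo / f0)), int(np.floor(hi / f0))
    return sum(harmonic_amplitude(f0, n, t_rise) ** 2 / 2 for n in range(first, last + 1))

for f0 in (1e6, 10e6):
    p = power_in_band(f0, 2e-9, 14e6, 16e6)        # hypothetical 14-16 MHz sensitive band
    print(f"{f0 / 1e6:4.0f} MHz clock puts {p:.6f} (relative units) into the 14-16 MHz band")

The 1 MHz clock lands its 15 MHz harmonic right in the band; the 10 MHz clock's nearest harmonics are at 10 and 30 MHz, so in this idealized picture it contributes nothing there.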

The other aspect is that the above only covers what's on the wire; in order to be RFI it has to be radiated from the wire. This gets into all the complexity of antenna theory: what goes out into space is going to be very dependent on the size and geometry of the wire, not just the "power level" of the signal.

And of course computers don't have just one clock; there are MANY signals running around at submultiples of the clock, each of which is going to have its own series of harmonics.

It gets rather complicated rather quickly. The upshot is that it's not easy to figure out what the "radiated RFI" is going to be as you change the clock frequency. And whether this RFI is going to cause any problems depends a lot on the sensitivity of the system to different frequencies.

John Swenson