Recently, I used a shielded coaxial cable to measure power supply ripple, and the results were even noisier than with an oscilloscope probe. How can that happen? Doesn’t the shielding reduce noise? I even checked my setup with a signal generator and everything looked fine. So what went wrong?
On the surface, using a shielded cable to measure noise seems like a great idea. In reality, it is: a coaxial cable is well suited to measuring ripple and noise, and you gain the benefit of the shield. Another benefit is an improved measurement SNR (signal-to-noise ratio), since the coax connection acts as a unity-gain probe. Yet another is that the resulting measurement generally supports a wider bandwidth than most probes, at much lower cost. The measurement execution, however, requires some care.
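The SNR benefit of a unity-gain connection can be illustrated with a quick calculation. The numbers below are illustrative assumptions, not values from the article: a scope front-end noise floor of 100 µV RMS and a 1 mV RMS ripple signal. A 10:1 probe attenuates the ripple before it reaches the digitizer, while the scope's own noise stays the same, so the SNR drops by the attenuation ratio.

```python
# Sketch: why a 1:1 coax connection improves SNR versus a 10:1 probe.
# All numbers below are assumed for illustration.
scope_noise_rms = 100e-6  # scope front-end noise, 100 uV RMS (assumed)
ripple_rms = 1e-3         # ripple being measured, 1 mV RMS (assumed)

results = {}
for name, attenuation in [("1:1 coax", 1), ("10:1 probe", 10)]:
    # The probe divides the signal before the scope input, so the
    # ripple seen by the digitizer shrinks while the scope's own
    # noise does not.
    signal_at_scope = ripple_rms / attenuation
    snr = signal_at_scope / scope_noise_rms
    results[name] = snr
    print(f"{name}: signal at scope = {signal_at_scope * 1e6:.0f} uV RMS, "
          f"SNR = {snr:.1f}")
```

With these assumed numbers, the 1:1 coax yields an SNR of about 10, while the 10:1 probe yields an SNR of about 1, i.e., the ripple is buried at the scope's noise floor.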
The nature of coaxial cables

A coaxial cable is a transmission line, designed to have a specific characteristic impedance, usually 50 ohms, though there are other standard impedances, such as 75 ohms. The cable should be terminated in its characteristic impedance. In most test instruments, the signal outputs present a 50-ohm source impedance, while the instrument inputs present a 50-ohm input impedance. This well-matched system results in a very wide-bandwidth, flat-response measurement. When the system isn’t well matched, things tend to go astray.
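The degree of mismatch can be quantified with the standard reflection coefficient, Γ = (ZL − Z0)/(ZL + Z0), for a line of characteristic impedance Z0 terminated in a load ZL. The sketch below (load values chosen for illustration) shows why a 50-ohm scope input behaves well while a high-impedance input reflects nearly all of the incident signal:

```python
# Sketch: reflection coefficient at the end of a transmission line.
# Gamma = (ZL - Z0) / (ZL + Z0); Gamma = 0 means a perfect match.
def reflection_coefficient(z_load, z0=50.0):
    return (z_load - z0) / (z_load + z0)

# Matched 50-ohm scope input: no reflection
print(reflection_coefficient(50.0))               # 0.0
# A scope's 1-Mohm high-impedance input: nearly total reflection
print(round(reflection_coefficient(1e6), 4))      # 0.9999
# A 75-ohm load on a 50-ohm line: partial reflection
print(round(reflection_coefficient(75.0), 2))     # 0.2
```

The reflected wave travels back down the cable, re-reflects at the source if it is also mismatched, and distorts the measured waveform; this is why the matched 50-ohm system described above stays flat over a wide bandwidth.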
I made five oscilloscope measurements with a 50-ohm signal generator (Picotest G5100A) connected to an oscilloscope input through a 1-m-long RG174 (50-ohm) coaxial cable. The signal generator produces a voltage pulse, and its amplitude is adjusted so that the signal is normalized at the oscilloscope. The measurement setup is shown in Figure 1, the measurements are shown in Figures 2-7, and Table 1 lists the values used in the test setup.