Discussions with members of the electromagnetic compatibility (EMC) community involved in MIL-STD-461 testing revealed that various methods were being used to limit the applied power in CS101 testing. The standard calls for applying a test voltage level but allows the requirement to be met if the interfering signal source is instead adjusted to dissipate the specified power-curve level in a 0.5-Ω load. This review examines the test procedures used to accomplish the testing, identifies variances encountered, and considers alternate methods that may be used to meet the goal of the test standard.
2. MIL-STD-461G Procedure
2.1 CS101 Calibration
MIL-STD-461G specifies a calibration configuration as shown in Figure 1. The calibration procedure steps are:
a. Set the signal generator to the lowest test frequency.
b. Increase the applied signal amplitude until the oscilloscope indicates the voltage level corresponding to the maximum required power level specified for the limit as shown in Figure 2. Verify that the waveform is sinusoidal.
c. Record the setting of the signal source.
d. Scan the required frequency range for testing and record the signal source setting needed to maintain the required power level.
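The oscilloscope voltage that corresponds to a required power level in the 0.5-Ω calibration load follows directly from P = V_rms²/R. The sketch below converts a power level into the RMS and peak-to-peak sine-wave voltages an operator would set in step b; the 0.09-W input is illustrative only, not a limit value from the standard.

```python
import math

def cal_voltages(power_w, load_ohms=0.5):
    """Convert a required power level in the calibration load to the
    RMS and peak-to-peak sinusoidal voltages seen on the oscilloscope."""
    v_rms = math.sqrt(power_w * load_ohms)   # from P = Vrms^2 / R
    v_pp = 2 * math.sqrt(2) * v_rms          # sinusoid: Vpp = 2*sqrt(2)*Vrms
    return v_rms, v_pp

# Illustrative power level only -- use the value from the applicable limit curve.
v_rms, v_pp = cal_voltages(0.09)
print(f"{v_rms:.3f} V rms, {v_pp:.3f} V pk-pk")  # 0.212 V rms, 0.600 V pk-pk
```

The same relation can be run in reverse during calibration: reading V_rms (or V_pp) on the oscilloscope and computing V²/R confirms the power actually being dissipated in the 0.5-Ω load.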
The open issue is what constitutes the "signal source setting" to be recorded.
One could simply use the signal generator amplitude readout as the setting, provided the amplifier gain remains stable and does not require adjustment. The range of calibration power across the frequency range, however, often calls for amplifier gain adjustment to …