Measurements at high voltage and high frequency

High voltage probes used with oscilloscopes are often labeled as having a wide bandwidth well into the MHz region. For example, a popular 1000:1 probe is the Tektronix P6015A High Voltage Probe, which is labeled as having a 75MHz bandwidth.

This rating can often trick the laboratory engineer into assuming the probe is accurate and traceable at high frequency. Unfortunately, this is far from the truth.

A close inspection of the probe's specification will usually find that the basic divider accuracy is only valid at dc, or at most 50/60Hz (i.e. mains frequency). In addition, the probe's calibration is normally done at dc or mains frequency, meaning that measurements above these frequencies are untraceable.

Even so, one might reasonably expect that equipment labeled with a bandwidth of 75MHz should be fine to use at frequencies of up to 1MHz, even if not strictly traceable. Sometimes this is true, but not always. 

Laboratory technicians working at high frequency should understand that there are two types of equipment: the first has an intrinsically flat response, while the second achieves a wide bandwidth through compensation.

For equipment with an intrinsically flat response, calibration can be considered valid for the frequency range specified by the manufacturer, even though the calibration report may only cover limited frequencies. For example, multimeters usually specify a frequency range in which the accuracy remains valid - this range can be trusted. Care should be taken if the specification is given as a "3dB bandwidth": this is a headline-grabbing specification, and rather useless since at the 3dB point the error is -30%. A laboratory should only use such equipment up to 10% of the 3dB bandwidth, where the effect is less than 0.5%. Nevertheless, within this "laboratory bandwidth", simple systems with inherent flatness can be relied on without having to worry about the effects of frequency.
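
As a rough illustration, the error of an ideal single-pole (first-order) response can be estimated at any fraction of its 3dB bandwidth. The Python sketch below uses the 75MHz headline figure from the earlier example; real instruments may roll off differently, so it is indicative only.

```python
import math

def single_pole_error(f, f_3db):
    """Relative amplitude error of an ideal single-pole (first-order) response."""
    gain = 1.0 / math.sqrt(1.0 + (f / f_3db) ** 2)
    return gain - 1.0  # negative = reads low

f_3db = 75e6  # headline "75MHz" bandwidth from the example above
for fraction in (1.0, 0.5, 0.1):
    err = single_pole_error(fraction * f_3db, f_3db)
    print(f"{fraction*100:>5.0f}% of the 3dB bandwidth: error = {err*100:+.1f}%")
# 100% -> -29.3%, 50% -> -10.6%, 10% -> -0.5%
```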

In contrast, equipment that uses compensation should only be considered calibrated if it has been specifically checked at the frequency of interest. Devices that use compensation tend to have a wobbly frequency response, with no particular rhyme or reason to where the peaks and troughs occur. The frequency response can be unstable with time and temperature and, in particular when external probes are used, sensitive to the overall system setup, including the cables used, the scope used and the physical location of the probe with respect to ground planes and the voltage source. Calibration is really only valid for one particular frequency and one particular setup.

HV probes fall into this latter category. The root cause of the problem is heat: in order to avoid the probe getting too hot, HV probes need to have a very high resistance, typically 100Mohm. With such a high resistance, stray capacitance quickly becomes a problem as the frequency increases. A typical HV probe has an intrinsic bandwidth of only around 1kHz, limiting the useful laboratory range to just 100Hz.
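
The intrinsic bandwidth follows from the RC corner frequency of the 100Mohm divider loaded by stray capacitance. A minimal sketch, assuming a stray capacitance of around 1.6pF (an illustrative value, not taken from any particular probe):

```python
import math

R_probe = 100e6    # 100 Mohm divider resistance (from the text)
C_stray = 1.6e-12  # assumed stray capacitance; a pF or two is plausible

f_3db = 1.0 / (2 * math.pi * R_probe * C_stray)
print(f"Intrinsic 3dB bandwidth ~ {f_3db/1e3:.1f} kHz")  # ~1.0 kHz
print(f"Useful laboratory range ~ {f_3db/10:.0f} Hz")    # ~100 Hz, using the 10% rule
```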

Probes overcome the frequency limitation by using compensation: a parallel capacitance is added to the resistive divider (roughly 3pF:3nF). At dc and low frequencies, the resistive divider works as expected, providing the 1000:1 ratio with the capacitance having negligible effect. At around 1kHz, both the resistive and capacitive dividers are active, and by 10kHz the probe is operating fully in the capacitive region, where the resistors have no effect. As the frequency increases further, additional compensation may be required as stray capacitance and other high frequency effects take their toll. Although compensation adjustments may be performed, these are usually approximate only and not intended to guarantee laboratory accuracy. For HV probes, the compensation may be affected by temperature and the individual setup.
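
A simple parallel-RC divider model shows both the resistive-to-capacitive transition and how sensitive the high frequency ratio is to small capacitance changes. The 100Mohm and 3pF:3nF values come from the text; the low-side resistor and the 0.3pF stray shift are assumptions chosen purely for illustration:

```python
import math

R1, C1 = 100e6, 3e-12  # high-voltage arm (values from the text)
R2, C2 = 100e3, 3e-9   # low-voltage arm; R2 assumed to give a ~1000:1 dc ratio

def ratio(f, c1=C1):
    """Magnitude of Vout/Vin for the parallel-RC divider at frequency f."""
    w = 2 * math.pi * f
    Z1 = R1 / (1 + 1j * w * R1 * c1)
    Z2 = R2 / (1 + 1j * w * R2 * C2)
    return abs(Z2 / (Z1 + Z2))

dc_ratio = R2 / (R1 + R2)  # set by the resistors alone
for f in (100, 1e3, 10e3, 100e3, 500e3):
    matched = ratio(f) / dc_ratio - 1              # perfectly compensated: flat
    shifted = ratio(f, c1=3.3e-12) / dc_ratio - 1  # +0.3pF of stray (assumed)
    print(f"{f:>8.0f} Hz   matched: {matched*100:+5.1f}%   +0.3pF stray: {shifted*100:+5.1f}%")
# The matched divider stays flat at all frequencies, while a fraction of a pF of
# extra stray capacitance pushes the ratio out by about +10% above ~10kHz, with
# the error appearing through the 1-10kHz crossover region.
```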

This means that the calibration (normally done at dc or 50/60Hz) is meaningless for measurements above around 100Hz - traceability is completely lost.

This is not just a hypothetical analysis. Actual measurements have confirmed that at high frequency HV probes typically have errors in the range of +/-10%. A useful illustration of the high errors these probes have is to randomly sample two probes (most test laboratories will have more than one HV probe around) and connect them to the same circuit with a high frequency, high voltage source (e.g. one probe to Ch1, one probe to Ch2 of the same oscilloscope). This experiment has been performed many times in many laboratories with the two probes rarely agreeing, typically registering differences in the order of 5-15%. The largest error found so far, when a single "calibrated" HV probe was checked against a reliable source, was +14%.

It is worth noting that oscilloscopes can also use compensation to achieve their full bandwidth - although the errors are usually smaller, they can still be significant, especially for lower cost devices. They can also be range dependent - the 100mV/div range might be flat, while the 10V/div range has high errors with frequency. Where accuracy matters, it is best not to assume anything when dealing with high frequency. A manufacturer that concretely specifies the accuracy limits including the frequency ranges can usually be trusted; a manufacturer that specifies accuracy at dc only and vaguely refers to a 3dB bandwidth is almost certainly hiding something.

In many applications errors of +/-20% are not so critical, and many people would consider this to be reasonable given the complications associated with high voltage at high frequency.

However, for HF insulation testing against IEC 60601-2-2 the errors can be significant. The standard only requires a test at 120% of rating, a small margin which is necessary as clinical applications often limit the ability to provide thicker insulation. If the measurement equipment has an error of 20%, this tiny safety margin would be wiped out altogether. At high frequency, insulation can heat up significantly as a function of the square of the rms voltage - an error of 10% in the applied voltage results in roughly a 20% effect on heat. Clinical literature indicates that insulation breakdown remains a problem in HF surgery, with serious outcomes including death. It is not an area to be treated lightly.
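
The square-law sensitivity is easy to check; a quick sketch of how a voltage error propagates to heating:

```python
# Dissipation in the insulation scales with Vrms squared, so a voltage error
# roughly doubles when expressed as an error in heating.
for v_err in (0.05, 0.10, 0.20):
    heat_err = (1 + v_err) ** 2 - 1
    print(f"voltage error {v_err*100:+.0f}%  ->  heating error {heat_err*100:+.0f}%")
# +5% -> +10%, +10% -> +21%, +20% -> +44%
```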

MEDTEQ has invested significant effort designing a 1000:1 divider for the HFIT (high frequency insulation tester), accurate to 2% in the 300-500kHz region, as well as a reference method to calibrate the divider which is accurate to around 0.3%. Traceability is provided through thermal methods (power and heat should be the same irrespective of frequency), with at least two reference methods used to improve confidence in this difficult area.

More work is underway on a prototype laboratory reference meter with an intrinsic accuracy of better than 0.2% in the region from 100-300Vac, 50Hz to 500kHz. Such a meter could be used to calibrate an oscilloscope/probe setup at a specific frequency (e.g. 400kHz for HF insulation tests), which would then allow that setup to be confidently used for peak and rms measurements at normal test voltages.

In the meantime, a quick fix is to use a good quality digital function generator. Although not strictly traceable, good quality generators typically have excellent flatness if used below 10% of their headline bandwidth (e.g. 1.5MHz for a 15MHz generator). Thermal-based tests have found less than 0.3% variation in the 1kHz to 500kHz region for the NF Wavetek WF1944, and similar results for Tektronix equipment. If the generator is flat, traceable measurements can be made by anchoring at a low frequency (e.g. 1kHz) and then ramping up the frequency without making any other changes, as in the following procedure.

First, set the output to a 7Vrms, 1kHz sine wave and connect it to a calibrated multimeter (preferably a high quality, 6.5 digit meter). Adjust the function generator amplitude to give exactly 7.000Vrms as the reference (this is the low frequency anchor, which is traceable). Then disconnect from the multimeter, connect the generator output to the HV probe and monitor the probe output on an oscilloscope. Use only "ac rms" measurements: this will reduce the impact of noise and dc offsets, which can be significant since the input to the scope is just 7mVrms. Check the displayed ac rms voltage at 1kHz, 10kHz, 50kHz, 100kHz, 300kHz and 500kHz, or at the frequency of interest.
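
The readings can then be reduced by normalising each ac rms value against the 1kHz anchor. A minimal sketch; the readings below are hypothetical placeholders, not measured data:

```python
# Probe output readings (ac rms, in mV) taken from the oscilloscope at each
# test frequency, with the generator anchored to 7.000Vrms at 1kHz.
readings_mV = {1e3: 7.00, 10e3: 6.97, 50e3: 6.92, 100e3: 6.84,
               300e3: 6.60, 500e3: 6.35}  # example values only

anchor = readings_mV[1e3]  # the traceable low frequency anchor
for f, v in sorted(readings_mV.items()):
    error = v / anchor - 1  # deviation of the probe/scope ratio from the anchor
    print(f"{f/1e3:>6.0f} kHz : {v:.2f} mV  ({error*100:+.1f}%)")
```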

This method works provided the scope noise is low enough, and it assumes that the oscilloscope itself has a flat response. To test for noise, set the function generator output to zero and measure the ac rms value coming from noise alone. A noise level of 1mVrms at the scope input (equivalent to 1Vrms at the probe) will contribute about 1% error to a 7Vrms measurement. To test for oscilloscope flatness, repeat the above test but connect the generator output directly to the scope input.
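
Since uncorrelated noise adds roughly in quadrature with the signal, the impact of a measured noise floor can be estimated as below (a sketch under that assumption):

```python
import math

signal_mV = 7.0  # expected probe output for a 7Vrms input through the 1000:1 divider
noise_mV = 1.0   # ac rms reading with the generator output set to zero

# rms of signal plus uncorrelated noise combines in quadrature
apparent_mV = math.sqrt(signal_mV**2 + noise_mV**2)
print(f"apparent reading {apparent_mV:.3f} mV, error {(apparent_mV/signal_mV - 1)*100:+.1f}%")
# ~7.071 mV, i.e. about +1% high
```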

Information prepared by: Peter Selvey, MEDTEQ

Updated 18 November 2014