[R-390] Ramblings on Calibration - was: HP 8640B

Roy Morgan roy.morgan at nist.gov
Thu Apr 7 10:43:42 EDT 2005


At 04:54 PM 4/6/2005, John Lawson wrote:
>On Wed, 6 Apr 2005, Cecil Acuff wrote:
>
>>choked when he told me what it (calibration) would cost.  Wanted $800.00.
>   However - it is possible that the cal lab might offer a "limited cal" 
> or a "reference only" cal -

John, Cecil, and others,

(From the great cal lab in the sky... err... Gaithersburg - NIST, though I 
don't do anything of that sort in my job here....)

Here are some ramblings about the topic of calibration:

On "Accuracy of Measurements":
   The expensive cal labs have four things we normally don't:
   1) Fancy, specialized equipment to do the calibrations with
   2) Reference standards "traceable to NIST", which cost money to keep current
   3) Well worked out, "accepted", and proven procedures
   4) Trained, experienced people to do the work

For example, in the measurement of frequency, as done in the calibration of
a signal generator, the lab might well have a satellite-connected local
frequency standard that is continuously monitored for drift and
error.  This thing is part of a system that does the monitoring, allows for
periodic higher-level calibration, reports its condition, and provides
suitable output frequencies for use in calibrating other
equipment.  Nowadays it's unlikely that such a frequency standard is
actually sent to NIST for calibration, because satellite transfer methods
exist.  This establishes the "traceable" nature of that measurement.  The
procedures used in frequency measurements are probably not too complicated,
but if you get a signal generator calibrated, you will be paying them to
have a trained, experienced technician carry out the measurements
according to the established procedures.

Measurement of frequency is easy out to a part in 10^12 or so, but that
kind of accuracy in voltage and power measurement is not feasible.  A check
of the manual specs on the HP 8640B will show that the accuracy for output
level is far lower than that for frequency.
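To put those fractional accuracies in concrete terms, here is a small
Python sketch.  All the numbers are illustrative, not from any particular
instrument:

```python
# Illustrative arithmetic only -- none of these numbers come from a
# real instrument.  What a fractional frequency accuracy means in
# absolute terms, and how to express a counter error as a fraction.

def absolute_error_hz(nominal_hz: float, fractional_accuracy: float) -> float:
    """Absolute frequency error implied by a fractional accuracy."""
    return nominal_hz * fractional_accuracy

def fractional_error(measured_hz: float, true_hz: float) -> float:
    """Dimensionless error of a measured frequency against a reference."""
    return (measured_hz - true_hz) / true_hz

# A 10 MHz standard held to a part in 10^12 is within about 10 microhertz:
err = absolute_error_hz(10e6, 1e-12)

# A counter reading 10 000 001 Hz on a true 10 MHz source is off by
# about 1e-7, i.e. 0.1 parts per million:
frac = fractional_error(10_000_001.0, 10_000_000.0)
```

That last figure, a tenth of a part per million, is already better than
most off-the-shelf counter timebases hold without adjustment.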


On "Why do we do this?"
   Some folks, like John, KB6SCO, are responsible within an organization
for making sure things are correctly calibrated. The organization's goals
and customers require it, and he has a full time job making that happen.  Few
Amateur radio stations need such calibration.  Some Amateur radio folks 
WANT to know that their equipment is calibrated correctly.  If a frequency 
counter is a little bit off, they won't like it, but a customer's crucial 
communications links won't degrade or fail.  Some of us just like to mess 
with measurements and enjoy knowing our equipment is working right.  For 
example, a while ago I got a General Radio Precision Capacitance Bridge 
that joins a similar inductance bridge here.  I don't yet know if they are 
working right, but I expect to check them whenever I can.  I have no 
earthly reason to measure inductors or capacitors to 0.05 percent!

On "What can we do about all this?"
   For many of us, doing it at home is a fine thing.  Getting an instrument 
and an invoice for a big amount back from a Cal Lab is likely not in our 
future.

   At home we can check frequency counters and generators with very small 
errors.  Receiving the WWVB signals at 60 kc is the start of a system that 
gets you well within the specifications of most oscillators found in 
frequency counters and generators.  For a couple hundred dollars you can 
buy a standard oscillator (surplus from the cell phone industry, as I 
understand it) that is extremely accurate.
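As a sketch of how such a reference gets used: once you know the counter's
timebase error from reading a source of known frequency, you can correct
later readings arithmetically.  All the values below are hypothetical:

```python
# Hypothetical numbers: a counter reads a source known (via WWVB or a
# surplus disciplined oscillator) to be exactly 10 MHz, and we derive
# a multiplicative correction for the counter's timebase.

def timebase_correction(counter_reading_hz: float, true_hz: float) -> float:
    """Factor to multiply subsequent readings by."""
    return true_hz / counter_reading_hz

def offset_ppm(counter_reading_hz: float, true_hz: float) -> float:
    """Timebase offset in parts per million."""
    return (counter_reading_hz - true_hz) / true_hz * 1e6

reading_of_ref = 9_999_998.7        # what the counter displays (assumed)
k = timebase_correction(reading_of_ref, 10e6)

# The timebase here is about -0.13 ppm; a later reading of some other
# signal is corrected by multiplying by k:
ppm = offset_ppm(reading_of_ref, 10e6)
corrected = 5_000_000.2 * k
```

Of course the correction is only as good as the reference and only holds
until the oscillator drifts, which is why the labs monitor continuously.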

Measuring resistance, inductance, and capacitance is a bit more 
complicated, but you can find reference standards at hamfests or on the web 
that will make a start at a set of known values.  As you dig into the
methods of measuring these values, you will begin to understand how the
setup and methods you use affect the results.  It
takes a small table full of inductance standards to calibrate a high 
accuracy inductance bridge, and these things have sold at fests and auction 
sites for $300 apiece and more.

Measuring voltage is even tougher at home.  Systems to measure voltage to
high accuracies are very complicated, often involving shielded rooms,
calorimetric methods, quantum-physics-based references, and so on.  Voltage
reference instruments that would be quite useful for checking our DMMs can
be had on the used market, however.
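One way to use such a reference, sketched below with a made-up DMM spec of
+/-(0.05% of reading + 2 counts) on a range with 1 mV resolution.  Check
your own meter's manual for the real figures:

```python
# All figures hypothetical: checking a DMM against a certified
# voltage reference and deciding whether it is within its spec.

def dmm_tolerance_v(reading_v: float, pct_of_reading: float,
                    counts: int, volts_per_count: float) -> float:
    """Allowed error: +/-(percent of reading + fixed counts), in volts."""
    return reading_v * pct_of_reading / 100.0 + counts * volts_per_count

ref_v = 10.000       # certified reference output, volts (assumed)
meas_v = 10.004      # what the DMM displays (assumed)

tol = dmm_tolerance_v(meas_v, 0.05, 2, 0.001)   # about 0.007 V here
in_spec = abs(meas_v - ref_v) <= tol
```

Note the check only tells you the meter is inside its own spec; it says
nothing about the reference, which needs its own pedigree.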

Careful web searching will reveal a lot of good information about 
electronics calibration.  The Agilent web site has interesting reading. For 
example, I recently found a publication titled something like Tips for
Making Accurate Measurements.  Every setup diagram in it included at least
$50,000 worth of their equipment, so it wasn't all that useful.

The NIST website, www.nist.gov, has lots of papers and reports, but you
have to dig for them (the search engine is terrible!), and in many cases
you'll read about methods and techniques that are impossible anywhere but
a national calibration laboratory.  (How many of us have a cryogenic
system capable of cooling a Josephson junction array to four kelvin?)

I think there are many things we can do in our basement workshops to both 
check the accuracy of our equipment, and give us many happy hours of time 
at the bench.  If you want to know whether your R-390A is hearing 1 
microvolt or nothing lower than 50, you have a shot at it.  If you need to 
know that your signal generator is giving you one half microvolt to three 
decimal places, you probably have a ways to go.
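For level checks like those, the microvolts-to-dBm conversion in a 50-ohm
system is simple arithmetic.  This assumes the voltage is the RMS value
across the 50-ohm load, not the generator's open-circuit EMF, which reads
twice as high:

```python
import math

def uv_to_dbm(microvolts: float, r_ohms: float = 50.0) -> float:
    """Power in dBm of a given RMS voltage across a resistive load."""
    v = microvolts * 1e-6          # volts
    p_watts = v * v / r_ohms       # P = V^2 / R
    return 10.0 * math.log10(p_watts / 1e-3)

# The classic figure: 1 microvolt across 50 ohms is about -107 dBm,
# and half a microvolt is about -113 dBm.
level_1uv = uv_to_dbm(1.0)
level_half_uv = uv_to_dbm(0.5)
```

That is why a sensitivity check on a receiver like the R-390A demands a
generator whose attenuator is still honest a hundred dB below a milliwatt.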

In the meantime, do have fun, and tell the rest of us about what you are up to.

Roy

- Roy Morgan, K1LKY since 1959 - Keep 'em Glowing!
7130 Panorama Drive, Derwood MD 20855
Home: 301-330-8828 Cell 301-928-7794
Work: Voice: 301-975-3254,  Fax: 301-948-6213
roy.morgan at nist.gov --


