[R-390] Measuring sensitivity

Mark Huss mhuss1 at bellatlantic.net
Mon Mar 12 18:27:12 EST 2007


Thank you, Roy. Always wanted to work at NIST :) I used to monitor an 
'Atomic' Clock.
As for the Tangential Sensitivity test, see my message dated 3/2/2007.

Here is the test setup I use (from memory; you only have to do it once a 
year).

First perform all alignment steps as specified in the TM, including 
setting the IF Gain pot. The procedure below is a last, optional way of 
optimizing the noise floor of the R-390 and R-390A, which I performed in 
the Army when we did not have to worry about variability in receiver 
sensitivity.

Equipment Needed:

    * AN/URM-25 Signal Generator or equivalent with External AM
      Modulation Input.
    * Oscilloscope, 500 kHz bandwidth or greater, Single Channel,
      External trigger preferred.
    * Function Generator, Squarewave output, 100 Hz.
    * Twinax to BNC Adaptor.
    * 2 to 4 BNC Cables.

Set up your equipment as follows.

   1. Use BNC cable to connect RF Generator Output to Twinax Adapter.
      Connect Twinax adapter to Balanced Antenna Input of R-390/R-390A.
   2. Use BNC cable to connect RF Generator Ext. Mod. Input to output of
      Function Generator.
   3. Use a BNC cable to connect the Function Generator trigger output
      (or use a BNC T-connector on the Function Generator output) to
      the External Trigger input of the Oscilloscope.
   4. Connect Channel A Vertical Oscilloscope input to Diode Load jumper
      on rear of Receiver.
   5. Set up Function Generator as follows:
          * Frequency: 100 Hz.
          * Amplitude: 1 volt P-P.
          * Waveshape: Squarewave.
   6. Set up RF Signal Generator as follows:
          * Frequency: 1500 kHz.
          * Amplitude: 4 uV.
          * Modulation: External.
          * Percent Modulation: 100%.
          * Monitor the High Level output on the Oscilloscope to ensure
            that modulation is 100% (carrier wave cut off for half a
            cycle of the modulation waveform). Adjust Function
            Generator output and Ext. Modulation Level input
            accordingly.
   7. Set up Oscilloscope as follows:
          * Vertical Gain: 1 V/div.
          * Vertical Coupling: AC.
          * Horizontal Timebase: 0.005 seconds/div.
          * Horizontal Trigger: Normal/External.
   8. Set up the receiver as follows:
          * Power: On.
          * Receiver Frequency: 1500 kHz, tuned for maximum output level.
          * Bandwidth: 8 kc.
          * AGC: MGC.
          * RF Gain: Maximum.
          * BFO: Off.
          * Audio Filter: Wide.
          * Line Level Meter: Off.
          * Audio Level: comfortable.
          * Antenna Trim: peaked for maximum level.
   9. You should see displayed on the Oscilloscope a 100 Hz squarewave
      with some noise riding on the top and bottom of the waveform.
      Adjust the oscilloscope vertical gain and trigger level to
      display the full waveform and hold it stationary.
  10. Decrease the amplitude of the RF Generator output until the
      negative peaks of the noise riding on the top of the 100 Hz
      waveform are at the same level as the positive peaks of the noise
      riding on the bottom of the waveform.
  11. Adjust the IF Gain pot counterclockwise to reduce the IF Gain
      while observing the oscilloscope. You should see the amplitude of
      the 100 Hz waveform decrease. At the same time, the amplitude of
      the noise will decrease at a greater rate than the modulation
      waveform, opening a gap between the negative noise peaks on top
      of the waveform and the positive noise peaks on the bottom of the
      waveform. Adjust the oscilloscope vertical gain as necessary to
      observe this.
  12. Repeat steps 10 and 11 until the gap no longer appears.
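The 100% modulation check in step 6 comes down to simple arithmetic on the envelope maxima and minima seen on the scope. Here is a minimal sketch (Python, with made-up voltage readings, not actual measurements from this setup):

```python
# AM modulation depth from the envelope extremes on the oscilloscope:
#   m = (Vmax - Vmin) / (Vmax + Vmin)
# At 100% modulation the carrier envelope just reaches zero (Vmin = 0),
# which is the "carrier cut off for half a cycle" condition in step 6.

def modulation_depth(v_max, v_min):
    """Return AM modulation depth (0.0 to 1.0) from envelope peak readings."""
    return (v_max - v_min) / (v_max + v_min)

# Hypothetical readings: envelope swinging between 2.0 V and 0.0 V peak
print(modulation_depth(2.0, 0.0))   # 1.0 -> 100% modulation
print(modulation_depth(2.0, 0.5))   # 0.6 -> only 60%; raise Ext. Mod. level
```

If the computed depth is below 1.0, increase the Function Generator output or the generator's Ext. Modulation level, as step 6 says.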

What you are seeing in steps 9 and 10 is the S+N/N displayed on the 
Oscilloscope. The noise on the top of the 100 Hz waveform is noise 
without the carrier (if memory serves); the noise on the bottom of the 
waveform is the signal plus the noise. Normally the standard way of 
setting the IF Gain causes the IF amplifier to generate too much noise. 
When you turn the IF Gain pot down, the IF amp generates less noise as 
well as less gain. For a while, though, the decrease in noise is 
greater than the decrease in amplification, so the gap reappears. You 
will reach a point where either the generator level needed exceeds 
4 uV, or the gap no longer reappears when you decrease the generator 
and IF gain. This is the point where you have maximum gain for minimum 
noise through the receiver, and where the RF amplifier again sets the 
noise floor of the receiver. As a final check, redo the standard 
sensitivity test to ensure you got an improvement, and that the 
receiver sensitivity still meets specifications. I have seen R-390As 
that barely met the sensitivity specification after the standard IF 
Gain setting improve so much after this IF Gain setting technique that 
you could no longer perform the 10 dB S+N/N test because of generator 
attenuator bleedthrough and cable bleedthrough, so you had to go to 
20 dB S+N/N. Other receivers would keep getting improvements in the 
gap, then not pass the 4 uV sensitivity test. I guess they had very 
quiet RF amp tubes.


Roy Morgan wrote:
> At 12:30 PM 3/12/2007, Mark Huss wrote:
>> ...I have seen traceable measurements down around 0.2 uV (probably 
>> 2khz bandwidth or less, it was not specified). Of course, the problem 
>> here is that most of the readings you see on the Internet are not 
>> traceable (to NIST standards), because calibration is expensive.
>
> Mark and all,
>
> May I comment on the "NIST Traceable" idea that comes up from time to 
> time?
>
> Note: I do work at NIST, but not in RF or Electrical calibration. I 
> simply have a bit of information and some ideas on the topic of 
> calibrations.
>
> If someone tells you that an instrument has been calibrated "traceable 
> to NIST" here is what they likely mean: The calibrations have been 
> made in a laboratory which in turn has calibrated its instruments by 
> means of materials or instruments that have been in turn calibrated 
> against something that was actually AT NIST for calibration.
>
> Let's see how this works: Let's say you want to know that your RF 
> Millivoltmeter is accurate.
>
> The NIST website is
> http://www.nist.gov/
> Click on
> NIST Laboratories: provide measurements and standards for U.S. industry.
> ...
> Electronics and electrical engineering
> and get to: http://www.eeel.nist.gov/
>
> The Electronics and Electrical Engineering Laboratory provides the 
> fundamental basis for all electrical measurements in the United States.
> Then find Calibrations: 
> http://ts.nist.gov/MeasurementServices/Calibrations/
> Then find the AC and DC current and voltage calibrations information at:
> http://ts.nist.gov/MeasurementServices/Calibrations/Voltage.cfm
>
> Among the many many calibrations discussed, one is called
> "Special 25–Point Test of Digital Multimeters (DMMs), by 
> Prearrangement (53202S–53203S) This is a special reduced cost, 
> 25–point test covering all five functions (ac and dc voltage and 
> current, and dc resistance) of most precision DMMs. ..."
> Under this test, we learn that the calibration point of 0.1 volt at 1 
> mc has a minimum uncertainty of 1000 parts per million. That is 
> 0.0001 volts. This calibration costs $1600.
>
> Reading further, we can see that the uncertainty involved in higher 
> precision measurements (done with thermal conversion methods) is about 
> one third of that. Calibrations of this sort may cost $2000 or more.
>
> So a commercial calibration lab can send its very best millivoltmeter 
> or thermal conversion cell to NIST with a couple grand and be 
> reasonably sure that it is correct within a very small amount. Then 
> the lab standards person compares the millivolt meter to be used for 
> routine calibrations to the NIST calibrated standard to see if IT is 
> right, and the accuracy drops an order of magnitude. Then when you add 
> in the uncertainties associated with calibrating some one else's meter 
> in the setup to be used, the accuracy may drop another order of 
> magnitude.
>
> You can see that if someone reports the sensitivity of an R-390A to be 
> 0.354 micro volts, that person is being quite cavalier with the numbers.
>
> Mark goes on to point out that there are uncertainties in our home 
> test setups, and also "... there are at least three methods to measure 
> S+N/N ratio. All valid. All giving different results everything being 
> equal."
>
> It seems to me that we should pay attention to the setup we have, ask 
> reasonable questions about how to get reasonable measurements, and pay 
> attention to such things as leakage and what methods we use.
>
> THEN, we might be able to say that we measured a radio's sensitivity as 
> "0.5 microvolt, but it might be from 0.3 to 0.9" or some such.
>
> There's a lot I don't know about all this. For example, Mark says: 
> "Then we would finish off by adjusting the IF Gain using Tangential 
> Sensitivity readings."
>
> I have no idea what that is, and would like to hear more about it.
>
> Roy
>
>
> - Roy Morgan, K1LKY since 1959 - Keep 'em Glowing
> 13033 Downey Mill Road, Lovettsville, VA 20180
> Phone 540-822-5911 Cell 301-928-7794
> Work: Voice: 301-975-3254, Fax: 301-975-6097
> roy.morgan at nist.gov --
>


-- 
----------------------------------------------------------------------
Some people are like a Slinky .. not really good for anything,
but you still can’t help but smile when you shove them down the stairs.
----------------------------------------------------------------------

