[1000mp] Inrad Front-end Mod
Ian White, G3SEK
[email protected]
Tue, 20 Jan 2004 07:58:22 +0000
George, W5YR wrote:
>It is required that the output of the 606A be terminated in a 50 ohm
>resistive load in order for its output level and attenuation
>calibration, etc. to be accurate. If it isn't, the voltage level of its
>output will not be correctly related to the metering and the attenuator
>calibration.
[...]
>If your MK V input is also 50 ohms, then whatever you read with the
>606A should be as accurate as the calibration of the 606A. If it is
>not, then the error will be proportional to the difference between the
>actual input resistance and 50 ohms.
Although both of these statements are true, no corrections are needed in
practice.
The industry standard for signal generators is that when you dial up an
output signal level, it represents the power that would be delivered
into a 50-ohm load. A 50-ohm output impedance is also standard.
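A quick sketch of what that convention implies (my own illustration, not
from the 606A manual): a generator setting in dBm corresponds to a
definite RMS voltage across a matched 50-ohm load, and because the
source impedance is also 50 ohms, the open-circuit EMF is twice that.

```python
import math

Z0 = 50.0  # ohms, the standard system impedance

def dbm_to_vrms(dbm, z=Z0):
    """RMS voltage across a matched load for a given generator setting in dBm."""
    p_watts = 10 ** (dbm / 10) / 1000  # dBm -> watts
    return math.sqrt(p_watts * z)

v_load = dbm_to_vrms(-73)   # voltage across a matched 50-ohm load
v_emf = 2 * v_load          # open-circuit EMF of the 50-ohm source
print(f"{v_load * 1e6:.1f} uV across 50 ohms, EMF {v_emf * 1e6:.1f} uV")
```

So -73 dBm works out to about 50 uV across 50 ohms, which is the
familiar S9 reference level.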
The industry standard for specifying receiver sensitivity is that you
connect the input directly to the output of that signal generator. The
power level that you report is simply what the signal generator says.
You specifically do *not* make a correction for the receiver's actual
input impedance.
So when Yaesu say "-73dBm should read S9", they simply dialed up -73dBm
on the signal generator and connected it straight into the transceiver.
To check the sensitivity, you do exactly the same.
Likewise, when converting receiver sensitivity from uV into dBm, you
simply assume a 50-ohm system impedance, so power = (Vrms)^2/50.
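As a quick sketch of that conversion (my own example, assuming the
50-ohm convention above):

```python
import math

def uv_to_dbm(v_uv, z=50.0):
    """Convert an RMS level in microvolts to dBm, assuming a 50-ohm system."""
    p_watts = (v_uv * 1e-6) ** 2 / z  # power = (Vrms)^2 / 50
    return 10 * math.log10(p_watts * 1000)  # watts -> dBm

print(round(uv_to_dbm(50), 1))  # 50 uV comes out at about -73.0 dBm
```

Which confirms the round numbers: 50 uV in a 50-ohm system is -73 dBm,
i.e. the standard S9 level quoted above.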
--
73 from Ian G3SEK 'In Practice' columnist for RadCom (RSGB)
Editor, 'The VHF/UHF DX Book'
http://www.ifwtech.co.uk/g3sek