[1000mp] Re: Rig Comparisons - ARRL Lab Test Results
Tom Rauch
[email protected]
Wed, 13 Feb 2002 17:31:24 -0500
> I have had them all and didn't notice any real difference in real
> world operation.
Depends on your site noise level, and how you operate.
Here at my QTH on 160 meters, the noise floor is typically somewhere
around -120 dBm to -135 dBm. The strongest signals are typically
-40 to -50 dBm. To be safe I need an IMD dynamic range (IMDR) of at
least 85 dB even on days when the band isn't crowded, and preferably
more!
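The arithmetic behind that number, as a minimal Python sketch using
my figures and assuming an IMD product stays harmless only if it sits
at or below the band noise:

  noise_floor_dbm = -135.0    # quiet-site noise floor here on 160 m
  strongest_dbm = -50.0       # typical strong-signal level here

  # The receiver must span the whole gap so products generated from
  # the strong signals don't rise above the band noise.
  imdr_needed_db = strongest_dbm - noise_floor_dbm
  print(imdr_needed_db)       # 85.0 dB, and more margin is better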
Even ONE -50 dBm signal can cause problems, as can a bunch of
weaker signals. Remember, the accumulated power in the "window"
is the problem, not the level of one or two signals! Several weaker
signals can cause the same problems as one or two stronger ones.
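To see how the power adds up, here is a short Python sketch with
hypothetical levels: ten equal -60 dBm carriers in the window carry
the same total power as a single -50 dBm signal.

  import math

  levels_dbm = [-60.0] * 10   # ten weaker signals (hypothetical)
  total_mw = sum(10 ** (p / 10) for p in levels_dbm)
  total_dbm = 10 * math.log10(total_mw)
  print(round(total_dbm, 1))  # -50.0 dBm, same as one stronger signal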
Now if my site background noise were -100 dBm, or if I never worked
weak-signal DX on CW near other people, I'd likely be very happy
with many rigs.
I think many manufacturers assume people have high noise levels,
and design with that thought in mind. It also seems everyone tests
as if only two or three signals are on the band, and the strong ones
are 20 kHz apart.
That's useless for me or anyone else in a quiet location, although it
might be fine for someone in an urban or noisy location, or for
someone who uses wide-pattern receiving antennas.
Consider what these tests are doing. With a 20 kHz spaced blocking
or two-tone IMD test on 1.8 MHz, I might use a receive (desired)
signal frequency of 1820 kHz. The test signals would be at 1840 and
1860 kHz. What kind of test is that???
Strong signals sitting 20 and 40 kHz away from the receive frequency
say little about how a rig handles strong signals a few kHz away, so
that test is useless for me and probably for MOST other people.
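For reference, the third-order products (2f1-f2 and 2f2-f1) from that
widely spaced pair do land on the receive frequency, as a quick
Python check shows:

  f1, f2 = 1840.0, 1860.0     # 20 kHz spaced test tones
  print(2 * f1 - f2)          # 1820.0 kHz, on the receive frequency
  print(2 * f2 - f1)          # 1880.0 kHz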
A more reasonable test for CW is with the receiver set at 1820 or
1826 kHz and the two unwanted signals at 1822 and 1824 kHz. For
SSB, the receiver should be set at 1820 in LSB or 1830 in USB, with
the undesired signals at 1823 and 1827 kHz (generating third-order
IMD products at 1819 and 1831 kHz).
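The same quick check confirms where those closer-spaced pairs put
their third-order products:

  def imd3(f1, f2):
      # Third-order products of a two-tone pair: 2f1-f2 and 2f2-f1
      return 2 * f1 - f2, 2 * f2 - f1

  print(imd3(1822.0, 1824.0)) # (1820.0, 1826.0) -> CW receive spots
  print(imd3(1823.0, 1827.0)) # (1819.0, 1831.0) -> LSB 1820 / USB 1830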
73, Tom W8JI
[email protected]