[Laser] Scintillation and Adaptive Optics
James Whitfield
n5gui at cox.net
Tue Aug 21 23:09:14 EDT 2007
I am puzzled by recent comments by Dieter, dl7udp, and Terry, W5TDM, about
the possibility of adaptive optics being able to reduce scintillation. I
know that my concept of optics is more geometry and Newtonian than waves
with diffraction and phase characteristics, but I just don't "see" a
mechanism how adaptive optics can help, much less be affordable for the
experimenter or practical communications.
Since I usually ramble a bit trying to explain myself, I will try to put
forth the three scenarios that I have come up with, then try to explain two
of them. The first, and I believe the most responsible for scintillation,
is a change in the photon flow into the receiver aperture. For this I see
absolutely no mechanism for adaptive optics to work. The second, I will
call "image dance", does have a mechanism that I believe would cause
amplitude variance. The problem with it is that it should not affect a
practical optical receiver system, and even if it did, the effect should not
be large enough to account for the scintillation that is observed. The third is a
random phase distortion of the wavefront that results in time-varying
cancellation/reinforcement of the intensity on the sensor. This third
scenario is so fuzzy in my head I will not even try to describe it, much
less suggest what adaptive optics might do about it.
Now for the ramblings: In layman's terms the twinkling of the stars is
scintillation, just at a speed that we can see rather than hear. Stars are
so far away that we can treat them as point sources. To model the flow of
light to the eye, imagine a long isosceles triangle with the star at the
"point" ( the angle between the two long equal sides ) and the "base" ( the
short side ) equal to the diameter of the iris. In this model we can
represent the flow of photons from the source as an arc moving with time
away from the source. The photons that are inside the triangle can be seen,
those outside the triangle cannot. Now to model the atmosphere we place a
transparent object so that at least some of the object is inside and some is
outside the triangle. This object changes randomly with time in shape and /
or density so that the photons that pass through it change direction. Some
of the photons will change direction enough that they will exit the triangle
if they were previously in it or will enter it if they were previously
outside. The more dense the atmosphere, the longer the path through the
atmosphere, or the more turbulent it is, the more likely the photons will
shift into or out of the triangle. ( Stars twinkle less on mountain tops
than at sea level, more close to the horizon than overhead. ) If you
increase the aperture ( the base of the triangle ) in this model you will
increase the number of photons received. ( The model is two dimensional
instead of three so it would only increase in proportion to the aperture
with no change in the number of in/out photons. In the real world the
capture area is proportional to the square of the diameter and the in/out
photons would increase proportional to the perimeter, that is linearly. )
The end result is that the twinkle becomes a smaller fraction of the average
light received. ( For this reason children and the elderly probably see
more twinkling of the stars. Children have smaller eyes. The ability of
the eye to dilate decreases with age. )
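The scaling argument above can be put in a few lines of code. This is only a toy model of aperture averaging, not real atmospheric statistics; the constant setting the edge-photon rate is arbitrary and only the trend with diameter matters:

```python
# Toy model of aperture averaging (illustrative only).  Captured
# photons grow with the aperture area ( ~ D^2 ), while the photons
# wandering into or out of the edge grow with the perimeter ( ~ D ),
# so the twinkle, as a fraction of the average light, falls as ~ 1/D.

def twinkle_fraction(diameter_mm, k_edge=1.0):
    """Relative fluctuation for an aperture of the given diameter.

    k_edge is an arbitrary constant for the edge-photon rate; only
    the scaling with diameter is meaningful in this sketch.
    """
    area = diameter_mm ** 2          # captured photons ~ D^2
    edge = k_edge * diameter_mm      # in/out photons ~ perimeter ~ D
    return edge / area               # fluctuation as fraction of mean

# Pupil sizes from a child's eye up to a small telescope:
for d in (2, 7, 50, 200):
    print(f"D = {d:4d} mm -> twinkle fraction ~ {twinkle_fraction(d):.4f}")
```

The larger the aperture, the smaller the twinkle fraction, which is the same reason large instruments scintillate less than small eyes.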
From this model, which I believe accounts for most of the scintillation
effect, I cannot see how adaptive optics would have any effect on the number
of photons entering the aperture of an optical instrument.
The second scenario presumes that the distortion from the atmosphere does
not change the photon flux reaching the instrument aperture, but rather
changes the direction that the image enters the instrument. If you could
watch the image of the source, it would change the location where it falls
on the instrument's focal surface ( in a camera it would be the focal
plane ). I think of it as the spot on the photo sensor "dancing"
around. Now for our purposes the optical sensor ( a one-pixel camera )
does not need to be, and I suggest there are valid reasons that it should
not be, at the focal surface of the instrument. Further, the image of the
source we are trying to detect can be a very fuzzy patch, though it would be
nice if it did fall entirely on the sensor. In this scenario the varying
amplitude detected could be caused by the fuzzy patch dancing to, and partly
over, the edge of the sensor. The fraction of the light within the
overshoot of the spot would be the amount of lost signal.
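One way to picture the lost fraction is to model the fuzzy patch as a Gaussian spot and ask how much of it lies beyond the sensor edge as the spot dances. The spot shape and all the numbers here are my own assumptions for illustration:

```python
import math

# Sketch of the "image dance" loss: model the fuzzy patch as a 1-D
# Gaussian spot of width sigma on the sensor.  When turbulence shifts
# the spot toward the sensor edge, the light beyond the edge is lost.
# The Gaussian shape and the offsets below are assumed, not measured.

def lost_fraction(offset_from_edge, sigma):
    """Fraction of a Gaussian spot lying beyond the sensor edge.

    offset_from_edge: distance from the spot centre to the edge
    ( positive means the centre is still inside the sensor ).
    """
    # Gaussian tail probability beyond the edge.
    return 0.5 * (1.0 - math.erf(offset_from_edge / (sigma * math.sqrt(2))))

# A spot centred 2 sigma inside the edge loses only a few percent;
# a spot centred right on the edge loses half its light.
for off in (2.0, 1.0, 0.0):
    print(f"centre {off:.1f} sigma from edge -> lost {lost_fraction(off, 1.0):.3f}")
```

The point of the sketch is that the amplitude only varies when the dance actually carries part of the spot off the sensor; a spot well inside a large sensor barely flickers at all.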
I can readily see how adaptive optics would be able to stabilize the dance
of the spot, which in an imaging camera would reduce the distortion caused
by turbulence, but I do not see it as having useful value on real
scintillation.
To carry the analogy further, in the experiment by KD0IF and N6IZW the range
was 21 miles or 1.33 million inches with a transmit aperture of four inches.
The source would then subtend three microradians. I forget what the receive
instrument was, but assume that it had a 1000 mm focal length and a 1 mm
sensor, yielding a field of view of 1000 microradians. That is an awful
lot of dancefloor for the received spot. I do not see adaptive optics being
of any real benefit for communication.
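The arithmetic in that paragraph checks out in a few lines. Note the 1000 mm focal length and 1 mm sensor are the assumed numbers stated above, not the actual KD0IF/N6IZW hardware:

```python
# Check the link-geometry arithmetic.  The focal length and sensor
# size are the assumed values from the text, not measured hardware.

INCHES_PER_MILE = 63360

range_in = 21 * INCHES_PER_MILE        # 21 miles in inches
tx_aperture_in = 4.0                   # transmit aperture, inches

source_angle = tx_aperture_in / range_in   # angle the source subtends, radians
fov = 1.0 / 1000.0                         # 1 mm sensor / 1000 mm focal length

print(f"range: {range_in / 1e6:.2f} million inches")
print(f"source subtends: {source_angle * 1e6:.1f} microradians")
print(f"field of view:   {fov * 1e6:.0f} microradians")
```

The received spot subtends roughly three microradians inside a thousand-microradian field of view, which is the "dancefloor" comparison above.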
James
n5gui