[Laser] variable field of view for noise elimination

TWOSIG at aol.com
Tue Jul 18 20:55:57 EDT 2006


 
Glenn and all,


I think that I misunderstood the placement of the iris.  In any case, please
forgive my poor communication.  I used the term "a device similar to a
camera iris" to suggest what might work for the adjustable field stop in
the previous paragraph.  I think that you may have thought I meant to
place the iris near the lens.  The intent is to place a mask at, or near,
the focal plane of the optical system.  The image formed by the optical
system, to use a comparison to a camera, will fall on the mask, with the
signal and noise sources forming blobs in a pattern (rotated upside down)
that matches what we would "see" if we looked in the direction the system
is pointed.  The idea is to move the mask relative to the pattern so that
the signal "falls" through the hole in the mask onto the sensor, much the
same way you describe in your later paragraph.
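
To put a number on the geometry: with a field stop at the focal plane, the
field of view is roughly the stop diameter divided by the focal length.  A
minimal sketch in Python, where the focal length and hole size are made-up
illustration values, not numbers from this discussion:

    # Rough field of view for a field stop at the focal plane.
    # f and d are illustration values, not from an actual system.
    import math

    f = 200.0   # focal length of the optical system, mm (assumed)
    d = 2.0     # diameter of the hole in the mask, mm (assumed)

    fov_rad = d / f   # small-angle approximation
    print("field of view ~ %.2f degrees" % math.degrees(fov_rad))  # ~0.57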
 
As I understand your suggestion, it is to place the sensor at the focal
plane with the mask close to it.  The arrangement has merit and will
result, as I tried to explain, in the maximum full signal field of view and
the minimum ultimate (or zero signal) FOV, hence the sharp edged FOV.  A
close-spaced iris in effect matches the field of view to its open area, but
then it will also do so in the more general case where the sensor is
mounted aft.
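
The sharp edge can also be put in numbers: with the sensor at the focal
plane, the full-signal FOV (whole blob passes the hole) and the zero-signal
FOV (whole blob blocked) differ only by the blob diameter.  A sketch under
the same assumed values as above, plus an assumed blob size:

    # Full-signal vs. zero-signal FOV for a stop at the focal plane.
    # All numbers are illustrative assumptions.
    import math

    f      = 200.0   # focal length, mm (assumed)
    d_stop = 2.0     # field-stop (hole) diameter, mm (assumed)
    d_blob = 0.2     # signal blob diameter at the focal plane, mm (assumed)

    full_signal = math.degrees((d_stop - d_blob) / f)  # whole blob passes
    zero_signal = math.degrees((d_stop + d_blob) / f)  # whole blob blocked
    print("full-signal FOV ~ %.2f deg" % full_signal)  # ~0.52
    print("zero-signal FOV ~ %.2f deg" % zero_signal)  # ~0.63
    # The smaller the blob relative to the hole, the sharper the FOV edge.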
 
I am not sure if you intended to suggest that the mask and sensor be
moveable relative to the optical axis of the instrument.  Since the
instrument as a whole could be pointed in the desired direction,
independent motion would seem unnecessary, at least at first thought.
However, I can see that there would be an advantage if the whole instrument
is difficult to control for fine adjustment.  In that case, the instrument
would be pointed in the general area, then the sensor/mask assembly would
be moved in small increments to the proper alignment.
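
If anyone builds such a stage, the travel needed is just the focal length
times the pointing error, for small angles.  A quick sketch, again with
assumed numbers:

    # Lateral travel of the sensor/mask assembly to correct pointing error.
    # Small-angle approximation; numbers are illustrative assumptions.
    import math

    f         = 200.0   # focal length, mm (assumed)
    error_deg = 0.25    # residual pointing error, degrees (assumed)

    travel_mm = f * math.radians(error_deg)  # image shift at the focal plane
    print("move the assembly ~ %.2f mm" % travel_mm)  # ~0.87 mm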

The tree-leaf shadow effect that you describe is not caused by diffraction.
It is simply a collection of pinhole cameras, and it can easily be
duplicated with a piece of aluminum foil with several small holes punched
in it, mounted in a picture frame so that you can hold it.  A fitting
subject would be the Moon at about half full.  The shape is obvious and it
is bright enough to see the images.  The larger the holes, the larger the
circle of confusion, but lenses are not needed.  I have a basement window
that is at least 6 inches high and 18 inches wide.  With that I can see the
landscape in my front yard and even see cars driving along the street.  Not
very good resolution, but certainly not a diffraction effect.
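
For a sense of scale, a pinhole image is about the source's angular size
times the hole-to-screen distance, smeared by the hole diameter.  A sketch
for the half-Moon demonstration, with an assumed screen distance and hole
size:

    # Pinhole image of the Moon: geometric size plus circle of confusion.
    # Screen distance and hole size are illustration values.
    import math

    moon_deg = 0.5    # Moon's angular diameter, degrees
    screen_m = 2.0    # foil-to-screen distance, m (assumed)
    hole_mm  = 2.0    # punched hole diameter, mm (assumed)

    image_mm = math.radians(moon_deg) * screen_m * 1000.0  # geometric image
    print("Moon image ~ %.1f mm across, smeared ~ %.1f mm" %
          (image_mm, hole_mm))  # ~17.5 mm disk, ~2 mm blur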
 
Perhaps I do not understand your point, but I do not see that diffraction
effects would be significant.  Any noise blob (and I like the way you used
the term blob) that falls on the mask would be completely blocked.  Ideally
the hole would be large enough to accept all of the signal, but even if it
were not, the interaction with the hole's edge (diffraction effects) should
not degrade the signal much, since the number of photons passing through
the hole should vastly outnumber the photons in the diffraction region.  A
noise source that falls partly within the hole does concern me, but again
it is the photons that pass through the hole that are the problem.  I have
therefore assumed that diffraction photons are not a problem.
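
One way to sanity-check that assumption: the angular spread of light
diffracted at the hole's edge is on the order of the wavelength divided by
the hole diameter, which is tiny for any millimeter-scale stop at visible
wavelengths.  A rough sketch, values assumed:

    # Order-of-magnitude diffraction spread from the field-stop edge.
    # Wavelength and hole size are illustration values.
    wavelength_mm = 650e-6   # 650 nm red light, expressed in mm (assumed)
    hole_mm       = 2.0      # field-stop diameter, mm (assumed)

    theta_rad = wavelength_mm / hole_mm  # characteristic diffraction angle
    print("diffraction spread ~ %.0f microradians" % (theta_rad * 1e6))
    # ~325 microradians: a few percent of a 10-milliradian field of view,
    # consistent with treating diffraction photons as negligible.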

I think that a sensor that is about the size of the signal blob at the
focal plane would be a bad design.  As I described in the original post,
that represents nearly the minimum field of view for the instrument, so it
would be degraded by small vibrations, not to mention that it would be very
difficult to acquire the signal.  If you make the signal blob larger than
the sensor, perhaps by moving the sensor off the focal plane, you can
expand the field of view, but you lose signal gain.  If you have the margin
to do that, it will work.  At the very least, sensors that small should be
part of an array with a much larger area.
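
That trade can be roughed out too: pulling the sensor a distance delta off
the focal plane grows the spot to about delta times the
aperture-to-focal-length ratio, and the signal landing on a fixed small
sensor falls off roughly as the area ratio.  A crude sketch, all values
assumed:

    # Defocus trade: wider acceptance blob vs. lost signal gain.
    # Aperture, focal length, defocus, and sensor size are assumptions.
    f_mm     = 200.0   # focal length (assumed)
    D_mm     = 50.0    # aperture diameter (assumed)
    delta_mm = 8.0     # sensor moved this far off the focal plane (assumed)
    sens_mm  = 1.0     # sensor side length (assumed)

    spot_mm  = delta_mm * (D_mm / f_mm)           # defocused spot diameter
    captured = min(1.0, (sens_mm / spot_mm) ** 2) # crude captured fraction
    print("spot ~ %.1f mm, sensor captures ~ %.0f%%" %
          (spot_mm, captured * 100))  # ~2.0 mm spot, ~25% of the light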
 
As a practical matter, the optics available create images much smaller than
the sensors we can get.  I might be tempted to cite plastic Fresnel lenses
used with a 1 mm square sensor as a counterexample, but to be fair, such
systems can and do work.  I am trying to suggest that there are better
optics, even if home made using techniques familiar to amateur telescope
makers, that if analyzed can produce much better results, especially when
compared to a plastic Fresnel lens mounted in the front of a cardboard box.

I have read a lot about amateur telescope making.  The quality of optics
available with humble materials and more perseverance than talent is pretty
amazing.  For the type of light communication systems being discussed in
this group, the optics needed would disgust any visual astronomical
observer.  They have instruments suitable for cameras and eyeballs.  We
need only one pixel.  Granted, we need that pixel sampled at rates from
tens to millions of times per second, where they may sample hundreds of
pixels for hundreds of seconds, or a few million pixels perhaps a hundred
times in a second.  The goals may be different, but the science behind the
optics is the same.  There are things each interest group can learn from
the other.
 
I hope this is useful.
 
James
n5gui
 
 




In a message dated 7/17/2006 6:57:57 PM Central Standard Time,
glennt at charter.net writes:

Interesting idea. However I think you will get some diffraction
effects from your iris, especially when it is made small. Small holes
act like lenses, as you can see if you've ever used or visited a
"camera obscura". A more common demonstration is available by looking
at the shadow created by sunshine through the leaves of a tree. In
many places the leaves combine to make a very small hole which
appears on the ground as a round spot of light. Usually there are
lots of them. These are real images of the sun. It is a lot more
obvious if you do this experiment during a partial solar eclipse and
can watch all of the fuzzy balls of light in the shadow have a bite
taken out of them!

A scheme that is similar in that it uses an iris and moves the sensor
might be considered. The light collected by the optical system
(lenses & mirrors & whatever) will create a true image of the scene.
This is exactly what a camera or your eye does. Consider for a moment
the entire image. Your desired signal will appear in that image
wherever the transmitter would appear to be. Further, even the best
optics aren't perfect, so even a point source (a star for example)
will appear as a blob. The size of the blob is a function of how big
(in steradians) the source is and how good your optics are.

If the signal blob in the image is larger than the detector, then it's
simply a matter of physically moving the detector to the place in the
image where the transmitter appears. If the blob is smaller than the
sensor (lucky you!), then place the sensor so that the entire blob is
on it and use an iris with a blob-sized and blob-shaped orifice
directly in front of the sensor to block noise sources that are
outside the blob. Very little in the way of diffraction effects here
because the iris is right next to the sensor.

In the event that the blob is bigger than the sensor, no doubt
additional optical processing can be done to further collimate the
image so it will be sensor sized or smaller. The only problem with
this is the need for additional specialized optics and their
associated losses. In any case, bring money - lots of it!

73 de Glenn wb6w






