Light field microscopy is a useful tool that allows microscopists to collect three-dimensional information about a scene in a single snapshot. Let us first look at what comprises a light field microscopy system and what design choices are involved. Next, we'll walk through the required materials and step-by-step instructions on a specific light field microscope system. Lastly, we'll look at how to calibrate that particular design, along with notes on how to calibrate light field microscope systems in general.
A light field microscopy system consists of a standard optical microscope imaging system augmented with a microlens array and postprocessing software. An optical diagram is shown in Figure 1.1 denoting the various lenses, stops and sensors in the system. The microlens array is inserted at the original intermediate image plane, where the camera would focus for standard microscopy. The camera itself is moved back so that it focuses on the plane one focal length behind the microlens array (the back focal plane). The camera produces raw light field images consisting of a lattice of small circles, which are then processed by postprocessing software either to allow interactive exploration of a specimen or to create focal stacks for further analysis.
The central design choice in a light field microscopy system is the microlens array, although other decisions, such as choosing a camera, designing a housing for the microlens array and choosing an objective, are also important.
The first thing you should ask yourself when designing a light field microscopy system is: what is the smallest feature size I need to be able to resolve in the final output? For example, you might want to be able to resolve neurons that are five microns across. Remember that you cannot push this feature size too small, or you will lose the ability to refocus due to the diffraction limit, as explained in the previous section.
Next, you should determine what objective you would like to use. The main consideration here is the field of view rather than the smallest resolvable spot. Determine how much of the specimen you need to image at once, and then choose an objective with the highest NA that is both appropriate for your specimen and able to image a large enough field.
There's often not too much choice involved in obtaining a camera, but if you do have a choice, you should opt for a camera that has as many pixels on the sensor as possible, assuming it fulfills your sensitivity requirements (and framerate requirements for streaming applications). This is because the actual output pixel count in a rendered focused image from a light field is the sensor pixel count divided by the number of pixels that fit inside a single microlens.
As a caveat, the camera and objective choices do affect one another. For example, if the camera sensor doesn't image the entire intermediate image plane, then you might want a lower magnification objective in order to fulfill the field of view requirement. Also, ideally we want the sensor pixel size to be roughly half the size of the diffraction spot size of the objective we chose; if the pixel size is too small, we don't get too much extra information from the light, and if the pixel size is too large, we waste information from the captured light, as we can't resolve the finer angular content in the light field image.
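The camera and objective trade-offs above amount to a few lines of arithmetic. The following sketch uses hypothetical numbers (a 2048x2048 sensor with 7.4 micron pixels, in the ballpark of the Retiga 4000R, a 40X/1.3NA objective, and a 100 micron pitch microlens array); it is an illustration of the reasoning, not a prescribed configuration:

```python
# Hypothetical numbers for illustration only.
sensor_px = 2048           # pixels per sensor side
pixel_um = 7.4             # pixel pitch in microns
wavelength_um = 0.55       # green light
na, mag = 1.3, 40.0        # objective NA and magnification

# Diffraction-limited spot size at the intermediate image plane:
# the Airy radius at the specimen, magnified by the objective.
# Ideally the camera pixel is roughly half this size.
spot_um = 0.61 * wavelength_um / na * mag

# Pixels that fit under one lenslet, and the resulting output
# resolution per side of a rendered focused image.
lenslet_pitch_um = 100.0
px_per_lenslet = lenslet_pitch_um / pixel_um
output_px = sensor_px / px_per_lenslet
```

With these numbers, roughly 13.5 pixels fit under each lenslet, so the 2048-pixel sensor yields a rendered image only about 150 pixels across; this is why a high sensor pixel count matters.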
Now that we have chosen an objective, a camera and a feature size, we are ready to look at microlens designs. There are two main parameters involved in the design: the pitch and the focal length. As for the shape of the microlenses, we should use microlens arrays composed of abutting square-aperture lenses in a lattice.
The microlens pitch should be roughly equal to or smaller than our feature size multiplied by the magnification of the objective. For example, if we want to image five-micron neurons using a 40X objective, we should have a microlens pitch of 200 microns at the largest; 100 micron pitch microlenses would be preferable. If you are flexible with feature sizes, then go for a slightly larger microlens pitch to achieve better angular resolution (allowing more distinct focal planes from a refocusing operation).
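The pitch rule can be stated as a one-line calculation, using the running example of five-micron neurons and a 40X objective:

```python
feature_um = 5.0    # smallest feature we need to resolve (e.g. a neuron)
mag = 40.0          # objective magnification

# The microlens pitch should not exceed the feature size as it appears
# at the intermediate image plane: feature size times magnification.
max_pitch_um = feature_um * mag    # 200 microns in this example
```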
Given a microlens pitch and objective, we can calculate what would be an ideal focal length for the microlenses. Recall that for each microlens in the array, we get an image of the aperture of the objective (back focal plane image), which is usually a circle. Shorter focal lengths yield smaller circles, and if the focal length is too short, you waste pixels on the sensor. Longer focal lengths yield larger circles, but if the focal length is too long, your circles overlap in the light field and the overlapping parts cannot be used in light field rendering. Ideally, we want these circles to almost touch, but still have somewhat of a gap between them (to be resilient to diffraction effects).
Now let's look at how to calculate the focal length that makes the circles exactly touch. The sine of the maximum half angle of rays passing through the intermediate image plane is equal to the NA of the objective divided by the magnification. For example, a 40X/1.3NA oil objective will create rays that at most hit the intermediate image plane at an angle of about 1.86 degrees (the arcsine of 1.3/40). Let us suppose we went with 100 micron microlenses. In order to have the circles abut, we want a ray from the center of the microlens to stray 50 microns from the optical axis at a distance from the microlens equal to the focal length. Thus, we can solve for the focal length: 50 microns divided by the tangent of 1.86 degrees, or approximately 1540 microns. This is the maximum focal length we would want to use with this objective and microlens array pitch. A somewhat shorter focal length would be a safer bet.
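The focal length derivation above can be checked numerically; this sketch just restates the geometry of the running example (40X/1.3NA objective, 100 micron pitch):

```python
import math

na, mag = 1.3, 40.0
pitch_um = 100.0

# Maximum half-angle of rays at the intermediate image plane.
half_angle = math.asin(na / mag)    # about 1.86 degrees

# For adjacent lenslet images to just touch, a marginal ray must stray
# half a pitch from the axis over a distance of one focal length.
f_max_um = (pitch_um / 2) / math.tan(half_angle)
```

Since the half angle is tiny, the small-angle approximation `f_max = (pitch / 2) * mag / na` gives nearly the same answer.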
To summarize, in this example, we would want to use a 40X/1.3NA oil objective with a 100 micron pitch microlens array with at most a 1540 micron focal length in a light field microscopy system to image 5 micron neurons. However, keep in mind that practical constraints will often prevent the creation of an ideal setup; as long as the design specifications are in the right ballpark and err on the right side, you should be okay. For example, it's okay to have the focal length be shorter than the ideal.
One of the first prototypes of a light field microscopy system was constructed from the parts described below.
In order to insert the microlens array at the proper height, a microlens array holder was custom machined. The holder for the Nikon microscope consists of the yellow and green sections of the design in Figure 1.2 and the blue section of the design in Figure 1.3. The microlens array holder consists of three pieces.
Since the Retiga 4000R's sensor plane is inside a sealed chamber, relay optics are required for the Retiga to be able to focus on the back focal plane of the microlens array. Therefore, two Nikon 50mm f/1.4 prime lenses were connected nose-to-nose using a specialized 52mm filter ring that attaches to a lens on each end. The F-mount on one end attaches to the Retiga and the F-mount on the other end attaches to the microlens array holder. The finished construction can be seen in Figure 1.5.
A Dell XPS 700 workstation was used for this setup. It had a 2.67GHz Intel Core 2 Duo processor, 2GB of RAM and an NVIDIA GeForce 7900 GTX graphics card. The real-time light field imaging application, LFDisplay, and other Python modules for computing focal stacks were developed and installed on this computer. The GeForce 7900 GTX offered decent enough performance, although it was replaced by a GeForce GTX 260 Core 216 later after a hardware failure.
Calibration of the light field microscopy system essentially involves two parts: setting the correct distance between the microlens array and the camera, and adjusting the parfocality of the setup.
If the camera does not correctly image the back focal plane of the microlens array, the resulting captured light field image will be blurry if the misfocus is small (resulting in loss of angular resolution) or downright unusable if the misfocus is large. The following is one possible way to make sure the distance between the microlens array and the camera is correct.
Since the image behind each microlens is an image of the angular spread of light, the best way to check for focus is to have each microlens image a very narrow set of rays. This is accomplished by stopping down the condenser aperture so that the light rays hitting the specimen are roughly parallel to the optical axis. Ideally, we should see a lattice of very small, almost point-like plus signs, as shown in image (b) of Figure 1.6. This is the diffraction pattern induced by the microlens array. However, if this is not the case, let us continue with the calibration procedure.
If possible, insert a color filter into the imaging system so that all images are roughly monochromatic. This will help with spotting diffraction patterns later. A red filter is best, as it produces the largest diffraction patterns. The color filter can either be inserted into the trans-illumination path or can be an excitation filter in a filter cube.
What we have done is move the camera's conjugate plane to the microlens array plane (rather than its back focal plane). The microlens edges will cause this grid pattern to appear. This step is not strictly necessary, but it makes it easier to locate the point of best focus by starting the camera at a known distance from the microlens array.
The total amount of movement required here should be approximately equal to the focal length of the microlens array. Once the image you see from the camera resembles image (b) of Figure 1.6, you are now at the proper position. Note that if your camera pixels are rather large, or if you have not inserted an optional color filter into the microscope, you may not see clear diffraction ridges or the plus sign shape at all. In that case, aim for the smallest sized spot, or adjust until the center pixel of each spot has the highest brightness.
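When the plus-sign pattern is hard to see, the fallback criterion above (maximize the brightness of each spot's center pixel) can be automated. The following is a minimal sketch of such a focus metric; `center_brightness` is a hypothetical helper, not part of LFDisplay, and it assumes a raw image as a NumPy array with an axis-aligned lenslet grid of integer pixel pitch whose first lenslet center sits at half a pitch from the image origin:

```python
import numpy as np

def center_brightness(raw, pitch):
    """Mean brightness at the lenslet centers of a raw light field image.
    With the condenser stopped down, this peaks when the camera sits one
    microlens focal length behind the array, since each lenslet image
    collapses to a small central spot."""
    h, w = raw.shape
    # Sample one pixel at the center of every lenslet cell.
    centers = raw[pitch // 2 : h : pitch, pitch // 2 : w : pitch]
    return float(centers.mean())
```

In use, one would capture a frame at each camera position and keep the position where the metric is largest.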
This is not strictly necessary, but it helps with the rectification of light fields (aligning the light field so that each lenslet image can be easily extracted).
Once we have calibrated the microlens-to-camera distance, let's adjust the parfocality of the setup. This is not as crucial as the previous section, but it will help increase image quality somewhat (due to objectives being corrected for a specific intermediate image plane) and will make it easy to switch between the eyepiece and the light field.
If your specimen wasn't entirely flat, remember which parts of the specimen were in focus.
If an object is in focus, then the object is conjugate with (some) focal plane. In that case, we generally expect the rays emanating from a single point on the object to share the same intensity/color. For a light field, this means that if an object is conjugate to the light field's original plane of focus, then each ``circle'' you see in the raw light field image should be of uniform intensity/color.
Note that if there are angular nonuniformities in the illumination, or if there is a mismatch in index of refraction between the objective and the specimen, then it is possible for the circles to be nonuniform. In that case, simply adjust them to be as uniform as you can.
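The uniformity criterion can also be quantified. This sketch computes the standard deviation inside each lenslet cell of a rectified raw light field; `lenslet_uniformity` is a hypothetical helper (not part of LFDisplay) that assumes an axis-aligned grid with an integer pixel pitch:

```python
import numpy as np

def lenslet_uniformity(raw, pitch):
    """Per-lenslet standard deviation of a raw light field image.
    Values near zero mean the circles are uniform, i.e. that part of
    the specimen is conjugate to the original plane of focus."""
    h, w = raw.shape
    ny, nx = h // pitch, w // pitch
    # Tile the image into (lenslet row, pixel row, lenslet col, pixel col).
    tiles = raw[: ny * pitch, : nx * pitch].reshape(ny, pitch, nx, pitch)
    return tiles.std(axis=(1, 3))
```

Minimizing the mean of this map over the in-focus region is one way to fine-tune parfocality.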
This is more of a convenience step to aid in switching back and forth between an eyepiece and a light field display.
Now let us look at a few use cases of the light field microscopy system along with how to operate the software. We will go through installation of the software as well as a few tutorials on some standard uses of the system.
Postprocessing software is an integral part of the light field microscopy system. One such software package is a real-time light field viewer/recorder called LFDisplay. LFDisplay can be used to monitor light field data coming from a QImaging camera and has been tested with the Retiga 4000R cooled CCD camera. LFDisplay can also record and/or playback light field image sequences. For the latter use case, the images can be captured from other camera software and imported into LFDisplay, although this will no longer be real-time. The other software package is a set of Python modules that allow a more computer-savvy user to process light field images and compute focal stacks offline. Both software packages are open source and can be obtained from the same place as this document.
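The offline Python modules are not documented in detail here, but the core focal stack computation is shift-and-add refocusing, which can be sketched in a few lines. This is a simplified illustration under stated assumptions, not the actual module API: it assumes the raw image has already been rectified into a hypothetical 4D array `lf[u, v, s, t]`, uses integer shifts and no subpixel interpolation:

```python
import numpy as np

def refocus(lf, alpha):
    """Shift-and-add refocusing of a rectified 4D light field lf[u, v, s, t],
    where (u, v) index the pixel under each lenslet (ray direction) and
    (s, t) index the lenslet (position). alpha selects the synthetic focal
    plane; alpha = 0 reproduces the original plane of focus."""
    nu, nv, ns, nt = lf.shape
    out = np.zeros((ns, nt))
    for u in range(nu):
        for v in range(nv):
            # Shift each sub-aperture image in proportion to its
            # distance from the center of the aperture, then sum.
            du = int(round(alpha * (u - nu // 2)))
            dv = int(round(alpha * (v - nv // 2)))
            out += np.roll(lf[u, v], (du, dv), axis=(0, 1))
    return out / (nu * nv)
```

A focal stack is then simply a list of refocused images over a range of `alpha` values, e.g. `[refocus(lf, a) for a in np.linspace(-2, 2, 9)]`.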
The minimum system requirements for LFDisplay are (keep in mind that LFDisplay is research-quality code and may not work with certain configurations):
Installation of LFDisplay is quite simple. For Windows, download the ZIP archive and extract its contents. The folder inside whose name starts with ``LFDisplay'' is where the program executable will be found. To run the program, simply run the file LFDisplay.exe in that folder. For Mac, download the DMG archive and open it. Then, drag the LFDisplay.app program from the DMG archive into your Applications folder. LFDisplay is now ready to run. Please skip to the next section to get started using LFDisplay.
Let us now use LFDisplay to interactively explore a slide placed into the light field microscope system. This requires a graphics card that supports OpenGL 2.0 and a QImaging camera (a Retiga 4000R is recommended).
Preferably, this should be a relatively thick specimen. Feel free to use either brightfield trans-illumination or fluorescent excitation, depending on which brings out the most contrast in the specimen.
Your screen should look somewhat like the screen in Figure 2.1. You should see five tabs labeled ``Input'', ``Output'', ``Display'', ``Lenslet'' and ``Optics'', as well as a control bar that has play, pause and record buttons as well as a display mode selector. If any of these are missing, select ``View'' in the menu bar and enable them.
You should see a black square on screen. You can pan the square by dragging with the right mouse button (or control-dragging with the left mouse button on Mac). You can zoom in and out by using the scroll wheel on the mouse or a two-finger scroll on a Mac trackpad, or by pressing Ctrl and + or Ctrl and - (use the Command key on Mac instead of Ctrl).
This connects LFDisplay to the camera. You should now see an image coming from the camera. If no camera is connected, or if all cameras are in use, LFDisplay will show an error message. Sometimes, if camera software did not close cleanly, a camera might be stuck in a ``used'' state. Please switch the camera off and on to see if that fixes it.
There should be a row of tab selectors along the bottom of the panels.
This ensures that we get an accurate picture from the camera. Your display panel should look somewhat like what is shown in Figure 2.2.
Ideally, in order for the information captured to be correct, we want none of the pixels to be anywhere close to being at top brightness (white).
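A quick way to verify this is to measure the fraction of saturated pixels in a captured frame. The helper below is a hypothetical sketch (not part of LFDisplay), assuming a 12-bit camera with a white level of 4095; adjust for your camera's bit depth:

```python
import numpy as np

def fraction_saturated(raw, white_level=4095):
    """Fraction of pixels at or above the white level. For trustworthy
    light field rendering this should be essentially zero."""
    return float((raw >= white_level).mean())
```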
You may need to scroll down to locate all the fields. LFDisplay needs information on the microlens array pitch, focal length, the current objective magnification, NA and the specimen index of refraction. If any values don't make sense, such as the NA exceeding the specimen index of refraction, the panel will warn you with the color red. Make sure ``Paraxial approximation'' is unchecked.
You may wish to do so in order to quickly bring up parameters next time using the ``Load'' button.
You should see a green grid (lenslet centers) and a red grid (lenslet boundaries) overlaid on top of the camera image. The main idea here is to set the grid orientation and size such that the circles in the raw light field image are neatly contained inside one red box, with green crosshairs through their middle.
Generally, the microlens array may not be perfectly aligned to the imaging system, so some amount of manual rectification is required. Furthermore, bumping the light field microscope setup may jitter the camera with respect to the microlens array. In these cases, readjustment of these values may be necessary, but a ``Reset'' isn't necessary.
You may need to zoom in or zoom out to see the red circle clearly. The circle should be roughly the size shown in Figure 2.5.
For the number controls in this panel, you can either type in values, click the up/down buttons, or use the up/down arrow keys. Mouse wheel scroll is also an option. Page Up and Page Down will change these values faster, as will Ctrl-scroll (Command-scroll on Mac).
Click on ``Save'' to save the lenslet configuration. The light field image has now been manually rectified. As a sanity check, inspect random parts of the image to see if the lenslet images are still inside the red grid boxes. Sometimes, with geometric distortion or field curvature, the registration/alignment may not be perfect, so just get it as close as you can. Also, check to make sure the center lenslet image is neatly outlined by the red circle. This should be the case if the NA, magnification and microlens parameters were specified correctly. If the lenslet image is too small compared to the red circle, either the microlens array isn't at the correct focal distance from the camera focal plane, or there's an extra stop in the system somewhere.
After all checks are done, we are now ready to explore the specimen.
You should now see a pinhole image rendering of the scene. The depth of field will be much greater than the depth of field as seen through the eyepiece.
By dragging the left mouse button, we are ``panning'' the sequence. To be more precise, it is as if we are viewing the scene through a pinhole in the telecentric stop of the objective, and moving this pinhole around. This should give you a sense of the depth of the specimen.
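Extracting such a pinhole (sub-aperture) view from a rectified raw light field image is a one-liner. This is a hypothetical sketch assuming an axis-aligned lenslet grid with an integer pixel pitch, not LFDisplay's actual implementation:

```python
import numpy as np

def pinhole_view(raw, pitch, u, v):
    """Sub-aperture ('pinhole') image: take the pixel at offset (u, v)
    inside every lenslet of a rectified raw light field image. Panning
    corresponds to varying (u, v), i.e. moving a pinhole around the
    telecentric stop of the objective."""
    return raw[u::pitch, v::pitch]
```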
Now, we should be able to refocus on different depths in the scene. This is equivalent to opening up the aperture stop in the objective, as opposed to using a pinhole.
If your graphics card isn't fast enough, reduce the number of aperture samples for non-pinhole rendering.
Keep in mind that negative values mean we're focusing further away from the objective (deeper into the specimen) and positive values mean we're focusing closer to the objective (shallower parts of the specimen).
Here we can save the raw image data from the camera so that we can explore these images later.
The output name will turn red if the file already exists. Make sure the image type is ``TIFF''.
Remember the filename and location, as we'll go back to it in a later step.
This starts the movie recording mode of LFDisplay and appends a frame number to each recorded TIFF file.
Now that we have captured some images, we can view them in LFDisplay.
Now, we can explore this specimen as captured at that specific time.
Note that none of the light field operations (panning/refocusing) are recorded. In fact, LFDisplay simply records raw frames from the camera in order to retain the most information from the scene. Remember to save lenslet rectification settings when recording files if you wish to revisit captured data later in LFDisplay.
This causes LFDisplay to display a light field movie looping continuously, simulating a live, animated scene.
Now, the animation should be playing at 4 frames per second.
The animation should now be stopped.
This allows you to view a specific light field in the sequence.
You can load this information later to get a specific sequence of frames.
This document was generated using the LaTeX2HTML translator Version 2002-2-1 (1.71)
The translation was initiated by Zhengyun Zhang on 2010-03-22