

Chromatic shift origins measurement and correction

242 bytes added, 05:28, 10 January 2017
add clarity
For fluorescence images from optical microscopes, from a colocalization perspective:
*Objective lenses are never perfect, not even really expensive ones that claim multi-colour chromatic correction. When it comes to imaging diffraction-limited objects with, e.g., a 1.4 NA lens on a widefield/confocal/SPIM/super-resolution system, there will still be a unique colour shift in all three spatial dimensions (xyz) for that particular lens/optics combination, and even for the angle at which the lens is screwed into the stand.
*Also, the alignment of the fluorescence filters in different cubes might bend different channels to slightly different places on the detector. On the API OMX you can even adjust this... on most systems it is fixed, but it might also be tweakable.
*Olympus does all the chromatic correction in the lens, but other manufacturers do some of it in the tube lens.
**This is why it's not a good idea to screw a Zeiss C-Apochromat 100x 1.46 oil objective into an Olympus stand.
*So we are left with a systematic error that can be measured, to a precision of tens of nm, by Gaussian fitting of bead images or of some other calibration sample (as is done in PALM/STORM-type super-resolution and single-molecule tracking).
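To illustrate the idea (a minimal sketch using numpy/scipy rather than any particular microscopy package, with made-up bead parameters): a bead image can be localized to a small fraction of a pixel by least-squares fitting a 2D Gaussian, which is what gives the tens-of-nm precision mentioned above.

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss2d(coords, amp, x0, y0, sigma, offset):
    """2D symmetric Gaussian, flattened for curve_fit."""
    x, y = coords
    return (amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))
            + offset).ravel()

# Simulate a bead at a known sub-pixel position (x=15.3, y=14.7)
yy, xx = np.mgrid[0:31, 0:31]
img = gauss2d((xx, yy), 1000.0, 15.3, 14.7, 2.0, 100.0).reshape(31, 31)

# Fit, starting from a rough guess near the brightest pixel
p0 = (img.max() - img.min(), 15, 15, 2.0, img.min())
popt, _ = curve_fit(gauss2d, (xx, yy), img.ravel(), p0=p0)
fit_x, fit_y = popt[1], popt[2]
print(f"fitted centre: x={fit_x:.3f}, y={fit_y:.3f}")
```

With real, noisy bead images the achievable precision depends on photon counts, but the fitted centre still lands well below one pixel of uncertainty.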
*On a single-point scanning confocal the matter can be made worse by the more complicated optics.
**The Zeiss 510 has a different pinhole per channel, and depending on their positions and correct setup, the images of a bead in different channels are also affected by the pinhole position settings.
*So it's best to get it as close as is reasonably possible with hardware adjustments, then measure the residual error, then correct for it.
*There will always remain measurable errors (unless you were really, really lucky and happen to have a perfect lens, which is a 1 in 1000 chance at best, plus perfectly aligned fluorescence mirrors/filters).
*OK, so at the simple level, we can assume that the whole field of view is shifted by some 3D shift per channel compared to the reference channel, usually green (ignoring any geometrical distortions, magnification differences, or image rotations between colours).
**So if we measure a few beads of around 0.5 or 1 micron (no need to use tiny beads here) near the centre of the field of view (where the optics are best), we can calculate or guesstimate the centre of mass of the bead image in each colour channel, and work out the shift vectors needed.
**These will typically be between 10 and 1000 nm in xy, and up to 500 or even 2000 nm in z.
**BUT, they will never be whole xy pixels or whole z-slice shifts... forget that idea, it's too crude.
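The centre-of-mass measurement above can be sketched like this (a toy example with synthetic 3D bead stacks and made-up shift values; `make_bead` is a hypothetical helper, not part of any plugin):

```python
import numpy as np
from scipy import ndimage

def make_bead(shape, centre, sigma=2.0):
    """Synthetic 3D bead image: a Gaussian blob at `centre` (z, y, x)."""
    grids = np.meshgrid(*[np.arange(s) for s in shape], indexing="ij")
    r2 = sum((g - c) ** 2 for g, c in zip(grids, centre))
    return np.exp(-r2 / (2 * sigma ** 2))

shape = (32, 64, 64)
green = make_bead(shape, (16.0, 32.0, 32.0))  # reference channel
red = make_bead(shape, (16.4, 31.2, 32.7))    # shifted by fractions of a pixel/slice

# Per-channel shift vector relative to the green reference
shift = (np.array(ndimage.center_of_mass(red))
         - np.array(ndimage.center_of_mass(green)))
print("red-channel shift (z, y, x) in pixels:", np.round(shift, 2))
```

Note that the recovered shift is fractional in every axis, which is exactly why whole-pixel corrections are too crude.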
*Now we can use Erik Meijering's TransformJ Translate plugin in Fiji/ImageJ to do a sub-pixel resolution shift for each channel, since we have measured those shifts: [TransformJ Translate]
**Use a good interpolation method to avoid smashing the information in your images, e.g. quintic B-spline interpolation.
**Or one could imagine a Fourier-based method (using a phase shift?). The Fiji [[Stitching_2D/3D]] plugin contains code that might do this, but it's not exposed in the GUIs... so scripting would be required there?
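Both correction routes can be sketched with scipy stand-ins (TransformJ itself runs inside Fiji; the shift values here are made-up examples): a real-space shift with spline interpolation, and a Fourier-domain phase shift.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
img = ndimage.gaussian_filter(rng.random((64, 64)), 3)  # smooth test image
dy, dx = -0.6, 0.35  # measured chromatic shift of this channel, in pixels

# Route 1: real-space sub-pixel shift with spline interpolation
# (order=5 is roughly the quintic B-spline option in TransformJ)
corrected_spline = ndimage.shift(img, shift=(dy, dx), order=5, mode="nearest")

# Route 2: Fourier-space phase shift (exact for band-limited, periodic data)
corrected_fourier = np.fft.ifftn(
    ndimage.fourier_shift(np.fft.fftn(img), shift=(dy, dx))
).real
```

Away from the image borders (where boundary handling differs) the two routes agree closely on smooth data.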
*Moving the images by a whole-pixel or whole-z-plane shift will not be precise enough for high-resolution colocalization analysis.
*At a higher level there may also be a magnification difference between the different colour channels.
**Further, on multi-camera systems like the OMX or some HCS systems, there WILL be a rotation angle difference between the channel images.
**The [TransformJ Affine] plugin might do the job, since it can also expand or shrink the image to accommodate different magnifications and rotations.
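A combined translation + rotation + magnification correction can be sketched with `scipy.ndimage.affine_transform` as a stand-in for TransformJ Affine (the angle, scale, and shift values here are made-up examples, not measured ones):

```python
import numpy as np
from scipy import ndimage

theta = np.deg2rad(0.3)        # small rotation between camera channels
scale = 1.002                  # slight magnification difference
shift = np.array([1.4, -0.8])  # residual translation (y, x), in pixels

# 2x2 linear part: rotation combined with isotropic scaling
lin = scale * np.array([[np.cos(theta), -np.sin(theta)],
                        [np.sin(theta),  np.cos(theta)]])

img = ndimage.gaussian_filter(np.random.default_rng(1).random((128, 128)), 3)

# affine_transform maps output coords o to input coords: i = lin @ o + offset.
# Rotate/scale about the image centre, then apply the translation.
centre = (np.array(img.shape) - 1) / 2
offset = centre - lin @ centre + shift
corrected = ndimage.affine_transform(img, lin, offset=offset, order=3)
```

Applying the inverse affine transform recovers the original image (away from the borders), which is a useful sanity check that the matrix and offset were set up correctly.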