Image analysis – Image enhancement or restoration – Artifact removal or suppression
Reexamination Certificate
2001-09-20
2004-12-07
Mehta, Bhavesh M. (Department: 2625)
C382S260000, C382S274000, C382S252000, C358S003260, C358S003270, C358S463000
Reexamination Certificate: active
06829393
ABSTRACT:
BACKGROUND OF THE INVENTION
The present invention is directed to a method, program and apparatus for removing stray-flux effects in an image.
All imaging systems are victimized by phenomena that misdirect a small portion of the entering flux to undesired locations in the image. The term “imaging systems” as used herein includes, for example, optical systems, x-ray systems, and computerized tomography systems. Other systems that do not employ the directional passage of flux, such as magnetic resonance imaging systems, also fall within the term “imaging systems”. Depending on the type of imaging system, the “image” may have a single dimension (a linear image), two dimensions (a planar image), three dimensions (a volume image), or more dimensions (a hyper image). In the most general sense the misdirected flux may be termed “stray-flux”, although in the context of different systems, it is known by different terms.
Optical imaging systems are victimized by phenomena that misdirect a small portion of the entering light flux to undesired locations in the image plane. Among other possible causes, these phenomena include: 1) Fresnel reflection from optical-element surfaces, 2) diffraction at aperture edges, 3) scattering from air bubbles in transparent glass or plastic lens elements, 4) scattering from surface imperfections on lens elements, and 5) scattering from dust or other particles. The misdirected flux, which is called “stray light”, “lens flare”, “veiling glare”, and other names, degrades both the contrast and the photometric accuracy of the image. For example, in photography, back-lighted scenes, such as portraits that contain a darker foreground object, suffer from poor contrast and reduced detail in the foreground object.
Perhaps less known and appreciated is the effect that “stray light” has on color accuracy of an image. P. A. Jansson and R. P. Breault, in “Correcting Color Measurement Error Caused by Stray Light in Image Scanners” published in Proceedings of The Sixth Color Imaging Conference: Color Science Systems and Applications, 1998, Society of Image Science and Technology, referred to a traditional simple manipulation of offset and gain, or contrast, that can lessen subjective objections to stray-light contaminated images. These manipulations, however, do not correct the underlying flaw in the image and do, in fact, introduce additional error.
U.S. Pat. No. 5,153,926 (Jansson et al.), assigned to the assignee of the present invention, describes various embodiments of a method for removing the stray-light effect from images. This method demands significant amounts of computation, which might inhibit its application within an image-acquisition apparatus such as a digital camera.
The method of the present invention, however, may be implemented in relatively simple apparatus, such as one or more digital-logic integrated circuits within a digital camera, and can quickly and inexpensively correct images that have been degraded by stray light.
Magnitudes of the Stray-Flux Effect
The importance of the stray-flux effect in general, and the stray-light effect in particular, may be appreciated from the following discussion of the magnitude of the effect of stray light on an optical imaging system. In an image arrangement 10 as shown in FIG. 1, a transparent mask 2 having an opaque spot 3 may be placed on a light-box 1 having a source of uniform diffuse illumination 4. A lens 5 is then used to image this back-illuminated spot 3 and its surround onto a target 6 comprised of a charge-coupled-device (CCD) array of photodetector elements 6E, such as that now found in a digital camera. This arrangement creates an opportunity for light from the surrounding diffuse source 4 to stray into the center of the spot image 7 formed on the target 6. An opaque spot 3 having a lateral dimension of 12 percent of the surrounding field of view thus covers about 1.44% of the image area. An example of an implementation of the image arrangement 10 has shown that the center of such an opaque spot image 7 on the target plane 6, which should receive no light flux, actually receives light flux that corresponds to about 2% of the adjacent surround. An idealized densitometer should indicate infinite (or indeterminate) optical density at the center of the spot in the absence of the stray flux. However, the 2% stray-contaminated data obtained from this imaging system results in an erroneous optical density value of D = log(1.0/0.02) ≈ 1.7.
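The density error above follows directly from the definition D = log10(incident flux / transmitted flux). A minimal sketch in Python, using the 2% stray-flux figure from the example (the function name is illustrative, not from the patent):

```python
import math

def optical_density(transmitted_fraction):
    # Optical density: D = log10(incident flux / transmitted flux),
    # with the incident flux normalized to 1.0.
    return math.log10(1.0 / transmitted_fraction)

# An ideally opaque spot transmits nothing, so D should be infinite;
# 2% stray flux makes the spot appear to transmit 2% instead:
D = optical_density(0.02)
print(round(D, 2))  # → 1.7
```

This is why even a small stray-flux fraction places a hard ceiling (here about D = 1.7) on the densities an area-imaging system can measure.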
Area-imaging systems collect light from all points in the image in parallel, in contrast to scanning imaging systems that collect light from each point in the image serially. From the above example, it may be appreciated that area-imaging optical systems have limited use when applied to the measurement of optical densities. Best practice in optical density measurement calls for physically restricting the number of paths accessible to stray flux by using apertures, masks, baffles, special coatings and other techniques known to optical designers.
Scanning methods that image a scene in a point-by-point manner can overcome the inaccuracy of single-exposure entire-image parallel detection. The serial acquisition of the image by mechanical scanning methods, however, is impractical for many applications, especially photography. The need to place an aperture or mask close to the object plane imposes another unacceptable physical limitation for photography. The greatly increased exposure time owing to loss of parallel detection further imposes an unacceptable time limitation for photography. A photography camera requires a degree of separation between the object and the optics/detector system and requires parallel image detection to be practical.
Color Accuracy in Photography and Imaging Colorimetry
P. A. Jansson and R. P. Breault, in the earlier-cited publication, consider the image arrangement of FIG. 1 in a situation where a colored image spot is surrounded by a region having a different color, to determine how the color at the center of such a spot is affected by stray light from its surround. The perceptually uniform CIE color space denoted by the L*a*b* coordinate system is employed. In this system a difference value of about 1.0 is considered a just-noticeable difference to a typical human observer. A blue spot having 5% contamination of green from a surround is shown to exhibit a color error of ΔE_L*a*b* = 8.94. Here the blue was specified to have the L*a*b* coordinates (10, 10, 30) and the green surround had L*a*b* coordinates (10, 30, 10). This shift of nearly nine just-noticeable-difference units, however, was small compared to the result computed when more saturated values of blue, having L*a*b* coordinates (6.5, 2.5, 40), and green, having L*a*b* coordinates (6.5, 40, 2.5), were used, even with a much lower contamination level of 2%. In this case the color error was gross, giving ΔE_L*a*b* = 15.07. Similarly, in a realistic black-and-white image, they determined that a 4% contamination of a black spot having 5% reflectance resulted in an error of ΔE_L*a*b* = 9.09.
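The ΔE values quoted above are based on the CIE76 color-difference formula, the Euclidean distance in L*a*b* space. A minimal sketch of that formula follows; note that reproducing the paper's exact figures would additionally require modeling the stray-light contamination in linear flux space, which is not attempted here:

```python
import math

def delta_e_cie76(lab1, lab2):
    # CIE76 color difference: Euclidean distance in L*a*b* space.
    # A value near 1.0 is roughly one just-noticeable difference.
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

blue  = (10.0, 10.0, 30.0)   # spot color from the example above
green = (10.0, 30.0, 10.0)   # surround color
# Distance between the two uncontaminated colors themselves:
print(round(delta_e_cie76(blue, green), 2))  # → 28.28
```

Against that full separation of about 28 units, the reported errors of 8.94 and 15.07 ΔE units represent a substantial fraction of the total color difference leaking into the spot.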
Earlier-cited U.S. Pat. No. 5,153,926 may be applied to the removal of stray-light effects from scanned images acquired by densitometers, cameras, microscopes, telescopes and other optical imaging systems. The method of this patent contemplates computations in both the Fourier and spatial domains. These computations require the concept of a full point-spread function (PSF) that characterizes the redistribution of light flux in the image plane occasioned by the presence of scattering phenomena and other stray-light-producing effects in the optical system. Characterization of stray light by a point-spread function is not typically done, because it describes the spreading of flux over large distances of image coordinate. This contrasts with common usage, in which the point-spread function incorporates primarily diffraction and geometrical-optics aberrations. Only the central core of a camera's full point-spread function would correspond to the more traditional point-spread function.
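The distinction between a conventional PSF and a full PSF with a stray-light skirt can be illustrated with a toy one-dimensional model. The sketch below (NumPy) is purely illustrative: the uniform skirt shape and 2% stray fraction are assumptions chosen to mirror the FIG. 1 example, not the characterization method of U.S. Pat. No. 5,153,926:

```python
import numpy as np

def full_psf(size, stray_fraction=0.02):
    # Sharp core: a delta function carrying most of the flux
    # (stands in for the conventional diffraction/aberration PSF).
    core = np.zeros(size)
    core[size // 2] = 1.0 - stray_fraction
    # Broad skirt: a small fraction of the flux spread over the
    # whole image, modeling stray light.
    skirt = np.full(size, stray_fraction / size)
    return core + skirt

def contaminate(signal, psf):
    # Circular convolution via FFT models the flux redistribution.
    return np.real(np.fft.ifft(np.fft.fft(signal) *
                               np.fft.fft(np.fft.ifftshift(psf))))

# A back-illuminated uniform field with an opaque spot in the middle.
field = np.ones(256)
field[120:136] = 0.0
observed = contaminate(field, full_psf(256))
print(observed[128])  # spot center: ≈ 0.019, ~2% of the surround
```

The skirt is what a conventional PSF omits; removing its effect amounts to inverting this convolution, which is why the patent's method works in the Fourier as well as the spatial domain.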
Kassa Yosef