Determining a depth

Image analysis – Image transformation or preprocessing – Mapping 2-d image onto a 3-d surface


Details

US classification codes: C382S106000, C356S370000

Type: Reexamination Certificate

Status: active

Patent number: 06269197

ABSTRACT:

BACKGROUND OF THE INVENTION
The invention relates to determining a depth.
Determining a depth or a distance, also known as range sensing, is important in industrial applications such as measurement of solder paste volume in manufacturing surface-mounted electronic assemblies, digitization of three-dimensional (“3-D”) clay models, and inspection of semiconductor packages for lead coplanarity.
Electronic devices that have traditional electrical terminals (e.g., dual in-line package leads) may be inspected in two dimensions using backlight. However, some devices that have other types of electrical terminals, such as ball grid array balls, cannot be effectively inspected using backlight. Instead, these devices are inspected by an imaging system that can view the terminals in three dimensions to check for compliance with specifications for, e.g., height, volume, and shape.
Optical systems have been devised that allow 3-D images to be derived from two-dimensional (“2-D”) images, by exploiting optical principles that relate the extent to which a surface is out-of-focus to a distance between the surface and an in-focus point. With such systems, resulting depth information in each of the derived 3-D images has only a fraction of the resolution of each of the 2-D images (e.g., only 512×512 points of depth information from 1024×1024 2-D images). Thus, to derive depth information having only moderate resolution, such systems are compelled to use costly high-resolution cameras that can produce the necessary high-resolution 2-D images.
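The resolution loss described above follows from computing each focus value over a neighborhood of pixels rather than per pixel. The sketch below is not the patented method; the block size, the variance-based focus measure, and the function names are illustrative assumptions. It only shows how a window-based focus measure turns a 1024×1024 image into a 512×512 grid of depth-related values.

```python
# A minimal sketch (not the patent's method) of why window-based focus
# measures reduce depth resolution: each value is computed from a
# non-overlapping block of pixels, so a 1024x1024 image yields only a
# 512x512 grid of focus values.
import numpy as np

def block_focus_measure(image: np.ndarray, block: int = 2) -> np.ndarray:
    """Local intensity variance over non-overlapping block x block windows."""
    h, w = image.shape
    tiles = image[: h - h % block, : w - w % block]
    tiles = tiles.reshape(h // block, block, w // block, block)
    return tiles.var(axis=(1, 3))  # one focus value per block, not per pixel

image_2d = np.random.rand(1024, 1024)        # stand-in for a captured 2-D image
focus = block_focus_measure(image_2d, block=2)
print(focus.shape)                           # (512, 512): half the resolution
```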
SUMMARY OF THE INVENTION
In general, in one aspect, the invention features a method for deriving a three-dimensional image from two-dimensional images, at least one of the two-dimensional images having a predetermined number of pixels. The method includes: deriving focus-based depth measurements from the two-dimensional images, the number of derived focus-based depth measurements being substantially equal to the predetermined number of pixels; and from the two-dimensional digital images and the depth measurements, deriving the three-dimensional image.
Implementations of the invention may include one or more of the following features. The method may further include deriving a focus measurement for each pixel in the predetermined number of pixels. The method may further include deriving each focus-based depth measurement from the focus measurements. The three-dimensional image may include information about a subject and each of the two-dimensional images may include different information about the subject. The method may further include imposing structured illumination on the subject and producing the two-dimensional images from the subject under the structured illumination. The method may further include, for each of the two-dimensional images, imposing different structured illumination on the subject. Each instance of imposed structured illumination may include a same pattern having a spatial period, and, for the instances of imposed structured illumination, the respective positions of the pattern relative to the subject may differ by a fraction of the spatial period. The fraction may include a multiple of a quarter of the spatial period. The subject may bear a contrast pattern, and the method may further include, in the derivation of the focus-based depth measurements, excluding at least some information about the contrast pattern.
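As one way to picture the structured illumination described above, the sketch below simulates three exposures of a flat subject under a sinusoidal pattern shifted by multiples of a quarter of its spatial period. The pattern shape, the numeric values, and all function names are assumptions chosen for illustration, not details taken from the patent.

```python
# A minimal sketch, assuming a one-dimensional sinusoidal pattern, of three
# exposures with the pattern shifted by multiples of a quarter of its
# spatial period.  All names here are illustrative, not from the patent.
import numpy as np

def sinusoidal_pattern(width: int, period_px: float, shift_px: float) -> np.ndarray:
    """Illumination intensity in [0, 1] varying sinusoidally along x."""
    x = np.arange(width)
    return 0.5 + 0.5 * np.sin(2.0 * np.pi * (x - shift_px) / period_px)

def expose(reflectance: np.ndarray, period_px: float, shift_px: float) -> np.ndarray:
    """Image of a flat subject under the shifted pattern (no defocus modeled)."""
    pattern = sinusoidal_pattern(reflectance.shape[1], period_px, shift_px)
    return reflectance * pattern[np.newaxis, :]

reflectance = np.full((8, 16), 0.8)            # stand-in for the subject
period = 8.0                                   # spatial period in pixels (>= 4 px)
shifts = [k * period / 4.0 for k in range(3)]  # 0, P/4, P/2
images = [expose(reflectance, period, s) for s in shifts]
```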
In general, in another aspect, the invention features a method for use in determining a depth. The method includes taking three sample values of a characteristic of an area defined by a pixel, each sample value corresponding to a different position of structured illumination relative to the area.
Implementations of the invention may include one or more of the following features. The structured illumination may include a pattern having a spatial period, and each of the different positions may differ by a fraction of the spatial period. The method may further include, from the three sample values, deriving a computed value representing a result of illuminating the area with unstructured illumination. The method may further include, from the three sample values and the computed value, deriving normalized sample values representing respective results of illuminating the area with the structured illumination at positions differing by the fraction of the spatial period. The method may further include, from the normalized sample values, deriving a focus measure representing a relative distance. The spatial period may be at least as long as four times a length defined by the pixel.
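Assuming a sinusoidal pattern and quarter-period shifts (both mentioned elsewhere in this summary), the three samples at one pixel can be modeled as s0 = A + B*sin(phi), s1 = A + B*cos(phi), and s2 = A - B*sin(phi), where A is the value the pixel would have under unstructured illumination and B is the pattern modulation, which drops as the surface moves out of focus. The sketch below works through that model; the patent's exact formulas are not given in this summary, so this is one consistent reading rather than the claimed computation.

```python
# A minimal per-pixel sketch of the three-sample computation, assuming
# quarter-period shifts of a sinusoidal pattern, so the samples are
#   s0 = A + B*sin(phi),  s1 = A + B*cos(phi),  s2 = A - B*sin(phi).
import math

def focus_from_samples(s0: float, s1: float, s2: float) -> tuple[float, float]:
    """Return (unstructured value A, focus measure B/A) for one pixel."""
    a = 0.5 * (s0 + s2)              # computed unstructured-illumination value
    n0, n1 = s0 - a, s1 - a          # normalized samples: B*sin(phi), B*cos(phi)
    modulation = math.hypot(n0, n1)  # recovers B regardless of the phase phi
    return a, modulation / a if a > 0 else 0.0

# Worked example: s0=120, s1=150, s2=80  ->  A = 100, focus measure ~= 0.539
print(focus_from_samples(120.0, 150.0, 80.0))
```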
In general, in another aspect, the invention features a method for use in processing a digital image. The method includes: imposing structured illumination on a subject; producing two-dimensional images from the subject under the structured illumination, at least one of the two-dimensional images having a predetermined resolution, each of the two-dimensional images including different information about the subject; from the two-dimensional digital images, deriving focus measurements; from the focus measurements, deriving depth measurements; and from the two-dimensional digital images and the depth measurements, deriving a three-dimensional image having the predetermined resolution.
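One way to assemble these steps into a pipeline is sketched below: at each height of a focal sweep, three pattern-shifted images are captured, a per-pixel focus measure is computed, and each pixel keeps the height at which it is sharpest, yielding a depth value for every sensor pixel. The focal sweep, the peak-picking rule, and all names here are illustrative assumptions, not the patent's implementation.

```python
# A minimal end-to-end sketch (illustrative, not the patent's implementation):
# at each height step, compute the per-pixel focus measure from three
# pattern-shifted images and keep the height where each pixel is sharpest.
import numpy as np

def focus_map(s0: np.ndarray, s1: np.ndarray, s2: np.ndarray) -> np.ndarray:
    """Per-pixel modulation-based focus measure (see the previous sketch)."""
    a = 0.5 * (s0 + s2)
    return np.hypot(s0 - a, s1 - a) / np.maximum(a, 1e-6)

def depth_from_sweep(stacks: list, heights: list) -> np.ndarray:
    """stacks[i] holds the three pattern-shifted images captured at heights[i]."""
    focus_stack = np.stack([focus_map(*s) for s in stacks])  # (n_heights, H, W)
    best = np.argmax(focus_stack, axis=0)                    # sharpest height index
    return np.asarray(heights)[best]                         # one depth per pixel

# Usage with dummy data: two height steps, 512x512 sensor -> 512x512 depth image.
rng = np.random.default_rng(0)
stacks = [tuple(rng.random((512, 512)) for _ in range(3)) for _ in range(2)]
depth = depth_from_sweep(stacks, heights=[0.0, 0.1])
print(depth.shape)                                           # (512, 512)
```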
In general, in another aspect, the invention features a method for use in determining a depth. The method includes: taking three sample values of a characteristic of an area defined by a pixel, each sample value corresponding to a different position of structured illumination relative to the area, the structured illumination including a pattern having a spatial period, each of the different positions differing by a fraction of the spatial period; from the three sample values, deriving a computed value representing a result of illuminating the area with unstructured illumination; from the three sample values and the computed value, deriving normalized sample values representing respective results of illuminating the area with the structured illumination at positions differing by the fraction of the spatial period; and from the normalized sample values, deriving a focus measure representing a relative distance.
Among the advantages of the invention are one or more of the following. A CCD sensor having a 512×512 pixel array can be used to produce a 512×512 depth image (i.e., 512×512 points of depth information). With little or no modification to hardware, some existing systems for producing depth images can be upgraded to produce higher-resolution depth images. A depth image can be created by using a simple patterned mask that changes sinusoidally from dark to light. Large groups of electrical terminals can be effectively inspected by using conventional CCD sensors.
Other advantages and features will become apparent from the following description and from the claims.


REFERENCES:
patent: 4472056 (1984-09-01), Nakagawa et al.
patent: 4640620 (1987-02-01), Schmidt
patent: 4689480 (1987-08-01), Stern
patent: 4876455 (1989-10-01), Sanderson et al.
patent: 4893183 (1990-01-01), Nayar
patent: 4912336 (1990-03-01), Nayar et al.
patent: 4984893 (1991-01-01), Lange
patent: 4988202 (1991-01-01), Nayar et al.
patent: 5151609 (1992-09-01), Nakagawa et al.
patent: 5239178 (1993-08-01), Derndinger et al.
patent: 5248876 (1993-09-01), Kerstens et al.
patent: 5424835 (1995-06-01), Cosnard et al.
patent: 5546189 (1996-08-01), Svetkoff et al.
patent: 5589942 (1996-12-01), Gordon
patent: 5617209 (1997-04-01), Svetkoff et al.
patent: 5642293 (1997-06-01), Manthey et al.
patent: 5659420 (1997-08-01), Wakai et al.
patent: 5879152 (1999-03-01), Sussman et al.
patent: 5900975 (1999-05-01), Sussman
patent: 5912768 (1999-06-01), Sissom et al.
patent: 5930383 (1999-02-01), Netzer
patent: 6025905 (2000-02-01), Sussman
patent: 3413605 A1 (1985-10-01), None
patent: 0 183 240 A2 (1986-06-01), None
patent: 0 300 164 A1 (1989-01-01), None
patent: 0 563 829 A2 (1993-10-01), None
patent: 0 627 610 A1 (1994-12-01), None
patent: 8-233544 (1985-02-01), None
patent: 3-63507 (1989-08-01), None
