Method for aligning two objects, method for detecting...

Image analysis – Applications – Manufacturing or product inspection


Details

U.S. Classification: 348/95, 250/491.1, 250/559.3
Type: Reexamination Certificate
Status: Active
Patent number: 06917698

ABSTRACT:
Disclosed is an application of the aligning method according to the present invention to a probe apparatus. Target probes are photographed by an upper CCD camera, and target electrode pads are photographed by a lower CCD camera. Second virtual images of the photographed probes and first virtual images of the photographed electrode pads are displayed in second and first image-data areas on a monitor screen, and the pixels of both virtual images are coded with dark and light brightness values. The second virtual images are then moved on the monitor screen so that they are superimposed on the first virtual images, and the total sum of the brightness (luminance) of all the first virtual images is calculated. On the basis of the calculated luminance value, the position at which the target probes are most successfully brought into contact with the target pads is detected.
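Stripped of apparatus detail, the abstract describes an overlap search: one image is shifted relative to the other, each candidate position is scored by a summed-luminance value, and the best-scoring position is taken as the contact position. Below is a minimal Python sketch of that idea, not the patented implementation: the names (`best_alignment`, `pads`, `probes`, `max_shift`) are hypothetical, the search is a brute-force scan over integer pixel shifts, and it assumes bright pixels mark pads and probe tips so that a higher overlap sum is better (the patent's dark/light coding could invert this).

```python
import numpy as np

def best_alignment(pads, probes, max_shift=20):
    """Slide the probe image over the pad image and score each integer
    offset by the summed luminance of the superimposed pixels.

    pads, probes : 2-D grayscale arrays of equal shape; bright ("light")
    pixels mark pad and probe-tip regions, the background is dark.
    Returns ((dy, dx), score) for the best-scoring offset.
    """
    best_score, best_offset = -np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # np.roll wraps pixels around the edges -- a simplification;
            # a real implementation would pad or crop instead.
            shifted = np.roll(probes, (dy, dx), axis=(0, 1))
            # Sum pad luminance only where a probe pixel lands on it,
            # so the score rewards probe-on-pad overlap.
            score = float((pads * (shifted > 0)).sum())
            if score > best_score:
                best_score, best_offset = score, (dy, dx)
    return best_offset, best_score

# Toy usage: a 2x2 probe tip offset from a 10x10 pad; the returned shift
# moves the tip onto the pad.
pads = np.zeros((64, 64)); pads[20:30, 20:30] = 255.0
probes = np.zeros((64, 64)); probes[40:42, 40:42] = 255.0
print(best_alignment(pads, probes))
```

In effect this is template matching by cross-correlation of brightness masks; the patent's monitor-screen superimposition of "virtual images" is one way of visualizing the same score.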

REFERENCES:
patent: US 6002426 (1999-12-01), Back et al.
patent: JP 59-17260 (1984-01-01), None
patent: JP 5-13518 (1993-01-01), None

