Integrated alignment and calibration of optical system

Optics: measuring and testing – Shape or surface configuration – By focus detection


Details

U.S. classification: 356/620, 356/401

Type: Reexamination Certificate

Status: active

Patent number: 06678058


FIELD OF THE INVENTION
The present invention relates to the calibration of optical instruments for high precision machine vision inspection applications.
BACKGROUND OF THE INVENTION
In industrial deployment of robotic systems, there is in general a need to precisely determine the position of an object, typically a workpiece or semiconductor, so that the robotic system can align the workpiece with a second object, for instance, a mating workpiece or a tool. In certain applications, for example, the fabrication and testing of microelectronic circuits, this alignment must be performed with extreme precision, for example, to less than 1 micron (1×10⁻⁶ meters). In these circumstances, an external alignment system is generally required. Typically, such a system is an optical, non-contact system known in the art as a machine vision system and, more specifically, as a machine vision alignment system.
In general, the alignment system will consist of a light source to illuminate the object, if it is not self-luminous, and a sensor to sense the emitted or reflected light. The captured information is either presented to a human operator or automatically analyzed by an associated processor.
In certain applications, there may be two (or more) such systems, with one system viewing the workpiece and one viewing the tool. In general, each system will include appropriate light sources to illuminate each object, lenses to focus the images, and sensors. Each system will typically be connected to a single processor and/or a single monitor. In practice, the systems may be combined, in whole or in part, into one assembly to conserve space or cost, or both, so long as the combined assembly is still able to adequately view the workpiece and the tool, either simultaneously or in turn.
In operation, the object, or certain features of the object apparent to the sensing system, are detected, and their locations are determined relative to the sensing system. These features are known in the art as fiducial marks, or fiducials. Fiducials may be defined by pre-existing features on the object or by marks artificially placed on the object.
The sensor may be a focal plane array sensor (for example, a CCD sensor or a CMOS sensor) comprising an array of picture elements (known in the art as pixels) and associated imaging optics. The location of the object, or the location of a fiducial on the object, may be determined as a function of the pixel location on the sensor. The location may be defined by a first number of pixels or fractional pixels from a first edge of the sensor, and a second number of pixels or fractional pixels from a second edge of the sensor, the second edge being non-parallel to the first edge and, in general, orthogonal to it.
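As a rough, hypothetical illustration of such a fractional-pixel measurement (not taken from the patent), the sub-pixel position of a fiducial on the sensor can be estimated as an intensity-weighted centroid of the pixels it covers; the function and parameter names below are invented for this sketch:

```python
import numpy as np

def fiducial_pixel_location(image, threshold):
    """Estimate a fiducial's location on the focal plane array as fractional
    pixel offsets from two orthogonal sensor edges (row 0 and column 0),
    using an intensity-weighted centroid of the thresholded blob."""
    rows, cols = np.indices(image.shape)
    mask = image > threshold               # pixels assumed to belong to the fiducial
    weights = image * mask
    total = weights.sum()
    if total == 0:
        raise ValueError("no fiducial found above threshold")
    p = (rows * weights).sum() / total     # fractional pixels from the first edge
    q = (cols * weights).sum() / total     # fractional pixels from the second edge
    return p, q
```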
It is well known in the art that for such a system to have merit, it is necessary to relate the measured parameters in image space, for example, a position measured by its pixel location on the sensor, to parameters in object, or real-world, space, for example, millimeters at the workpiece or millimeters at the tool. The parameters in object space may be used, for example, to guide the robotically manipulated tool. It is understood that the object coordinates are not necessarily in millimeters and may be based on an artificial measurement scheme native to the robotic system.
In theory, it is possible to create a mathematical transformation between measurements made on the sensor and the object, or world, coordinates. For example, one could characterize the dimensions of the pixels of the focal plane array sensor, the focal length of the lens, the image distance and object distance of the system, and the precise location of the sensor and lens relative to the workpiece, and from these determine an image-to-object coordinate transformation. However, the errors in characterizing each component of the system will, in general, be cumulative in determining the coordinate transformation of the system.
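A minimal sketch of such a characterization-based transform, assuming a simple thin-lens model, square pixels, and a sensor parallel to the object plane (all parameter names are hypothetical and not from the patent); note that every characterized quantity contributes its own error to the result:

```python
def theoretical_image_to_world(p, q, pixel_pitch_mm, image_dist_mm,
                               object_dist_mm, sensor_origin_world_mm,
                               principal_point_px=(0.0, 0.0)):
    """Map a pixel location (p, q) to object coordinates (x, y) in mm using
    only characterized component parameters (thin-lens model, image inversion
    ignored for simplicity)."""
    magnification = image_dist_mm / object_dist_mm     # thin-lens magnification
    dp = p - principal_point_px[0]
    dq = q - principal_point_px[1]
    # displacement on the sensor (mm), scaled back to the object plane
    x = sensor_origin_world_mm[0] + dp * pixel_pitch_mm / magnification
    y = sensor_origin_world_mm[1] + dq * pixel_pitch_mm / magnification
    return x, y
```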
In practice, it is generally more effective to calibrate the system empirically by measuring a known object with the system and determining a coordinate transformation that relates the known parameters of the object to the coordinates of the sensing system. With this method, all of the relevant parameters of the alignment system can be determined in one operation. In the art, the known object is called a calibration object if it is substantially three-dimensional in nature, or a calibration fiducial if it is substantially two-dimensional in nature, for example, a mark made on a suitable object.
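As a hedged illustration of such an empirical calibration (not the patent's own method), assuming the known object presents several fiducial points with known world coordinates and that an affine model (scale, rotation, shear, and translation) is adequate, the transform can be recovered in a single least-squares fit; the function names are invented:

```python
import numpy as np

def fit_image_to_world(pixel_pts, world_pts):
    """Fit an affine image-to-world transform from measurements of a known
    calibration fiducial pattern. pixel_pts and world_pts are (N, 2) arrays
    of matching (p, q) and (x, y) coordinates, N >= 3 and not collinear."""
    pixel_pts = np.asarray(pixel_pts, dtype=float)
    world_pts = np.asarray(world_pts, dtype=float)
    # Augment with a constant column so translation is solved together with
    # rotation, scale, and shear:  [x y] = [p q 1] @ coeffs
    design = np.hstack([pixel_pts, np.ones((len(pixel_pts), 1))])
    coeffs, *_ = np.linalg.lstsq(design, world_pts, rcond=None)
    return coeffs                           # 3x2 coefficient matrix

def image_to_world(coeffs, p, q):
    """Apply the fitted transform: (p, q) -> (x, y)."""
    return np.array([p, q, 1.0]) @ coeffs
```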
In the case where two sensing systems are used, the calibration object will present a target which can be viewed by each of the two systems, simultaneously or in turn, and which provides a unique point of reference by which the coordinate system of each sensing system may be correlated to the coordinate system of the object or objects and to one another. If it is not possible or not convenient to view a single unique target with each system, the calibration object will, in general, consist of two (or more) targets, one for each alignment system. The spatial relationship between the two (or more) targets will be precisely known, so that the coordinate systems of the alignment systems can be precisely correlated to one another.
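A small sketch of the two-target case, under the simplifying assumption that the two targets' frames differ only by a known translation (no relative rotation); it reuses the hypothetical image-to-world convention from the previous sketch, and all names are invented:

```python
import numpy as np

def cam2_pixel_to_cam1_world(coeffs_cam2, target_offset_xy):
    """Build a mapping from camera-2 pixel coordinates (p, q) into the
    camera-1 (e.g. workpiece-side) world frame, given camera-2's fitted
    image-to-world coefficients and the precisely known offset between the
    two calibration targets (assumed here to be a pure translation)."""
    offset = np.asarray(target_offset_xy, dtype=float)

    def to_cam1_world(p, q):
        xy_cam2_frame = np.array([p, q, 1.0]) @ coeffs_cam2   # camera-2 world frame
        return xy_cam2_frame + offset                         # shift into camera-1 frame
    return to_cam1_world
```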
In practice, a device employing a machine vision alignment system will be calibrated when it is first set up. It will be calibrated whenever any component of the system is adjusted or changed, for example, if the lens is changed, zoomed, or refocused. In general, it will be calibrated every time a new job is started. It is also standard practice to recalibrate the system periodically, such as every day or every shift, to correct for mechanical instabilities of the system, or thermally induced deformations. Typically, these calibrations are a non-productive phase of operation.
With reference to FIG. 1, there is shown the current state of the art, which involves measuring the position of a calibration fiducial 10 in object or real-world coordinates (x,y) with respect to a focal plane array sensor 12 in camera coordinates (p,q), the sensor being considered mechanically fixed in place with respect to the imaging optics 14. From the apparent position and size of the calibration fiducial 10, as imaged on the focal plane array sensor 12, the position of the camera and the magnification of the optics are inferred and stored within the computing device 16 portion of the machine vision system. This is possible because the size and position of the fiducial are known in the object coordinate system (x,y) of the workpiece 18. These values are used to transform the coordinates of a fiducial 20 on the workpiece 18, as measured on the focal plane array sensor 12 in camera coordinates (p,q), to object coordinates (x,y) in the space of the workpiece 18. This operation, (p,q)→(x,y), is known in the art as the image-to-world coordinate transformation.
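As an illustrative, end-to-end toy run of this image-to-world transformation, using synthetic numbers only and reusing the hypothetical fit_image_to_world and image_to_world sketches above (none of this is taken from the patent):

```python
import numpy as np

# Synthetic example: a 3x3 grid of calibration fiducials with known world
# coordinates (mm) is imaged at a pretend scale of 5 um per pixel with the
# optical axis at pixel (320, 240); the fitted transform then maps a measured
# workpiece fiducial from camera coordinates (p, q) to object coordinates (x, y).
scale_mm_per_px = 0.005
grid_world = np.array([[x, y] for x in (0.0, 1.0, 2.0) for y in (0.0, 1.0, 2.0)])
grid_pixels = grid_world / scale_mm_per_px + np.array([320.0, 240.0])

coeffs = fit_image_to_world(grid_pixels, grid_world)
print(image_to_world(coeffs, 520.0, 440.0))   # approximately [1.0, 1.0] mm
```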
If additional alignment systems are deployed, a similar operation may be performed on the additional focal plane array sensor(s) 22 and imaging optics 24 to measure the position of a fiducial 26 on a second object, for example, a robotic tool 28.
Several patents refine this basic technique, including several that deal with calibration issues. For example, Woodhouse (U.S. Pat. No. 5,537,204) shows a workpiece being temporarily replaced with a chrome-on-glass fiducial target for the purpose of calibration. Dautartas (U.S. Pat. No. 5,257,336) shows placement of fiducials directly on the workpiece for the purpose of alignment and, in particular, the workpiece being a light emitting diode package and the tool holding an optical fiber to be aligned with the light emitting diode. Everett (U.S. Pat. No. 5,298,988) shows a virtual image of a fiducial optically projected to a point in space in place of a physical fiducial for calibration purposes.
All of these references consider the alignment system to be a separate closed system, for which calibration is performed externally. This approach
