Pattern matching apparatus
Reexamination Certificate (active)
Patent number: 06804387
Filed: 2000-07-28
Issued: 2004-10-12
Examiner: Mehta, Bhavesh M. (Department: 2625)
Classification: Image analysis - Applications - Manufacturing or product inspection
Classification code: C382S209000
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a shading pattern matching apparatus for inspecting marks printed illustratively on the surface of semiconductor device packages and, more particularly, to an apparatus that adjusts the maximum score value of pattern matching and reaches that maximum score value at high speed.
2. Description of the Background Art
Shading pattern matching through the use of TV camera pictures is a well-known method for illustratively inspecting printed marks on semiconductor packages or like objects. This pattern matching method is used in diverse applications. A typical use of the method is outlined below.
FIG. 1 is a schematic view of devices configured to implement this type of shading pattern matching.
(1) In FIG. 1, a TV camera 2 takes pictures of a printed face of an object, illustratively the top of a semiconductor device 1. In the setup of FIG. 1, the field of view of the TV camera 2 is adjusted so that a minimum printed line width is composed of at least three pixels.
(2) A video signal from the TV camera 2 is converted to digital format before being written to a memory of a personal computer 3. Digital data are written to the memory in such a manner that they may be easily retrieved by the personal computer 3. Typically, a TV screen is defined by Y and X coordinates representing the longitudinal and lateral directions respectively, and the brightness of each pixel is read out in terms of a two-dimensional array expressed illustratively as br[y][x] (br=bright). When read out, memory contents may be converted back to an analog signal for display on a TV monitor 5. The personal computer 3 is furnished with a mouse 6 or like means for interfacing with an operator.
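Step (2) amounts to grabbing one digitized frame and exposing it as the two-dimensional brightness array br[y][x]. A minimal sketch follows, assuming Python with OpenCV in place of the original digitizer hardware; the device index and variable names are illustrative only.

import cv2  # assumption: OpenCV stands in for the original frame digitizer

cap = cv2.VideoCapture(0)                      # TV camera 2 (device index is illustrative)
ok, frame = cap.read()                         # one digitized video frame
assert ok, "no frame was captured"
br = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)   # 2-D brightness array, indexed as br[y][x]
print(br.shape, int(br[0][0]))                 # picture size and brightness of pixel (y=0, x=0)
cap.release()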
(3) A picture of the first of a series of devices 1 with the same characters printed thereon is initially taken. The operator drags the mouse 6 to circumscribe an area that comprises the printed characters. Image processing software on the personal computer 3 identifies the individual characters in the area circumscribed by dragging the mouse. Steps (3.1) through (3.3) below are carried out to identify the characters; an illustrative code sketch follows step (3.3).
(3.1) A maximum, a minimum, and an intermediate level of brightness in the circumscribed area are obtained.
(3.2) Within the circumscribed area, a maximum level of brightness is obtained on each of the rows, which extend laterally and are followed from top to bottom. Any row whose maximum brightness level exceeds the intermediate level acquired in step (3.1) is judged to have character parts. Accomplishing this step reveals the beginning and end (i.e., the limits of a range) of the character string as viewed longitudinally.
(3.3) Within the character string thus detected, a maximum level of brightness is obtained on each of the columns, which extend longitudinally and are followed from left to right. Any column whose maximum brightness level exceeds the intermediate level is judged to have a character. Accomplishing these steps defines the vertical and horizontal limits of each character.
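The following is a minimal sketch of steps (3.1) through (3.3), assuming Python with NumPy and that the circumscribed area is available as the two-dimensional brightness array br[y][x] of step (2), with bright characters on a darker background; the function name segment_characters and the box format are illustrative, not the patent's own implementation.

import numpy as np

def segment_characters(br):
    # (3.1) maximum, minimum and intermediate brightness of the circumscribed area
    hi, lo = int(br.max()), int(br.min())
    mid = (hi + lo) // 2

    # (3.2) rows whose maximum brightness exceeds the intermediate level contain
    # character parts; their extent gives the longitudinal limits of the string
    rows = np.where(br.max(axis=1) > mid)[0]
    if rows.size == 0:
        return []                              # no character parts found
    y0, y1 = rows[0], rows[-1]

    # (3.3) within that band, columns whose maximum brightness exceeds the
    # intermediate level belong to a character; gaps separate the characters
    col_has_char = br[y0:y1 + 1, :].max(axis=0) > mid
    boxes, x_start = [], None
    for x, flag in enumerate(col_has_char):
        if flag and x_start is None:
            x_start = x                        # a character begins at this column
        elif not flag and x_start is not None:
            boxes.append((y0, y1, x_start, x - 1))
            x_start = None                     # the character ended at the previous column
    if x_start is not None:
        boxes.append((y0, y1, x_start, col_has_char.size - 1))
    return boxes                               # one (top, bottom, left, right) box per character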
(4) The brightness level within the range of each character on a two-dimensional plane is recorded in the form of a template. The recording is made for all characters constituting the character string.
(5) A picture of the next device is taken.
(6) Taking into account the repeatability of the device position and of the printed character position on the devices, a maximum allowance is set for the predictable shift in character position between the first package and the next package.
(7) The template is applied to each of the input pictures within the maximum predictable allowance range, whereby the most likely position of coincidence between the picture and the template is obtained. Specific steps constituting the processing of this pattern matching method are discussed below; an illustrative code sketch follows step (7.6).
FIG. 8 is a flowchart of the steps in question.
(7.1) The maximum score value is initialized to zero. Then, a maximum allowance range is set around the position in which a first character has appeared in the picture of the first device. The longitudinal and lateral limits are determined.
(7.2) In the first position, a matching score is computed using the expression (1) below. The procedure is generally known as the shading pattern matching method based on normalized correlation:
r² = {nΣfg − ΣfΣg}² / [{nΣf² − (Σf)²}{nΣg² − (Σg)²}]   (1)
where r² stands for a score (degree of coincidence), “f” for the brightness of each pixel in an input picture, “g” for the brightness of each pixel in the template, and “n” for the number of effective pixels in the template.
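Expression (1) translates directly into a few array operations. The sketch below, assuming Python with NumPy, computes r² for one candidate position; the function name match_score is illustrative only.

import numpy as np

def match_score(f, g):
    # f: brightness of the input-picture pixels under the template
    # g: brightness of the corresponding template pixels
    f = np.asarray(f, dtype=np.float64).ravel()
    g = np.asarray(g, dtype=np.float64).ravel()
    n = g.size                                 # number of effective template pixels
    num = (n * np.sum(f * g) - np.sum(f) * np.sum(g)) ** 2
    den = (n * np.sum(f * f) - np.sum(f) ** 2) * (n * np.sum(g * g) - np.sum(g) ** 2)
    return num / den if den != 0 else 0.0      # r² of expression (1), in the range 0..1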
(7.3) If the score obtained by the computations above is judged greater than the initially established maximum score value, then the maximum score value is updated to reflect the acquired score. The longitudinal and lateral coordinates of the position in effect at that point are recorded.
(7.4) Within the range of lateral scan, the current position is advanced right one pixel, and the computations of (7.2) and (7.3) above are repeated.
(7.5) If the range of lateral scan has been exceeded, a check is made to see whether the current position is still within the range of longitudinal scan. If the current position is judged to be within the longitudinal scan range, the position is moved down one pixel, and the lateral scan is repeated from the leftmost position toward the rightmost position.
(7.6) When the longitudinal scan range is judged to be exceeded, the processing is brought to an end.
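Steps (7.1) through (7.6) amount to an exhaustive scan of the allowance range that keeps the highest score and its position. A compact sketch follows, assuming Python with NumPy; it uses np.corrcoef, whose square equals expression (1), so that the block stands alone. The names and range arguments are illustrative only.

import numpy as np

def best_match(picture, template, y_range, x_range):
    # picture, template: 2-D brightness arrays; y_range, x_range: candidate
    # top-left template positions within the allowance of step (6)
    th, tw = template.shape
    g = template.astype(np.float64).ravel()
    best_score, best_pos = 0.0, None                      # (7.1) initialize the maximum score
    for y in y_range:                                     # (7.5)/(7.6) longitudinal scan
        for x in x_range:                                 # (7.4) lateral scan
            f = picture[y:y + th, x:x + tw].astype(np.float64).ravel()
            r = np.corrcoef(f, g)[0, 1]                   # (7.2) normalized correlation
            if np.isnan(r):
                continue                                  # flat window: correlation undefined
            score = r * r                                 # expression (1) equals r squared
            if score > best_score:                        # (7.3) update maximum score and position
                best_score, best_pos = score, (y, x)
    return best_score, best_pos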
(8) The maximum score value eventually obtained above represents the degree of coincidence of the first character between the first device and the second device. The higher the degree of coincidence, the greater the score value.
(9) The second character on the first device is then used as a template for matching the next character. This step is the same as the process (7) above.
(10) Likewise, the third and subsequent characters are used as templates for matching the respective characters. If the maximum score obtained for any character is judged lower than a predetermined threshold value, the second device is judged to be faulty.
(11) The same inspection is carried out on a third device. The processing above is continued until the devices having the same characters have been exhausted.
There are major problems, outlined in (A) and (B) below, in conducting the above-described shading pattern matching method.
(A) FIGS. 9a through 9c show a template, an example of a template picture coinciding with a picked-up picture, and an example of a template picture diverging from a picked-up picture. FIG. 9d depicts part of an alphabetic character “F” coinciding with the template. The character line is three pixels wide as described earlier. More specifically, FIG. 9a illustrates part of a template, and FIG. 9b indicates part of a character in which the corresponding template coincides exactly with the picked-up picture. In the picture of FIG. 9c taken by the TV camera, the template is shown diverging laterally by half a pixel from the picture being inspected. The densely shaded rectangular portions in FIG. 9c represent pixels that are 100 percent bright, while the portions thinly shaded by diagonal lines denote pixels that are 50 percent bright.
FIGS. 10A and 10B are graphic representations showing how the score typically changes when a template is shifted laterally one pixel at a time. As illustrated, FIG. 10A includes a high score at the point where the template coincides with the picture 100 percent, whereas FIG. 10B indicates points of mismatch with low scores. The phenomenon is an error resulting from varying conditions under which pictures are taken by the TV camera.
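The half-pixel effect of FIGS. 9a through 9c, and the lowered peak of FIGS. 10A and 10B, can be reproduced numerically. In the sketch below (Python with NumPy, brightness values chosen only for illustration) a three-pixel-wide stroke sampled half a pixel off its template produces roughly 50-percent-bright edge pixels and a score below 1.

import numpy as np

def score(f, g):
    r = np.corrcoef(np.asarray(f, float), np.asarray(g, float))[0, 1]
    return r * r                           # expression (1)

template = [0, 0, 255, 255, 255, 0, 0]     # one row of a 3-pixel-wide stroke
aligned  = [0, 0, 255, 255, 255, 0, 0]     # FIG. 9b: template coincides exactly
shifted  = [0, 0, 128, 255, 255, 128, 0]   # FIG. 9c: half-pixel lateral offset,
                                           # edge pixels only about 50 percent bright
print(score(template, aligned))            # 1.0: the high peak of FIG. 10A
print(score(template, shifted))            # below 1.0: the peak is lowered despite correct position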
Even simple checks of relatively distinct uppercase alphabetic characters and numeric characters are not immune to the above kind of inaccuracy. Distinction of characters with limited differences therebetween, such as “Q”
Inventors: Ijichi, Toshiya; Sakaue, Yoshikazu
Examiners: Chawan, Sheela; Mehta, Bhavesh M.
Attorney, Agent, or Firm: McDermott Will & Emery LLP
Assignee: Renesas Technology Corp.