Method and system for generating a three-dimensional object
Type: Reexamination Certificate
Filed: 2000-04-28
Issued: 2004-06-01
Examiner: Patel, Jayanti K. (Department: 2625)
Classification: Image analysis – Applications – 3-d or stereo imaging analysis
Cross-reference classes: C382S285000, C345S419000, C356S012000, C348S066000
Status: active
Patent number: 06744914
ABSTRACT:
FIELD OF THE INVENTION
The present invention relates generally to the mapping of objects, and more specifically to creating three-dimensional models of objects.
BACKGROUND OF THE INVENTION
The use of scanning techniques to map surfaces of objects is well known. Prior art FIG. 1 illustrates an object 100 having visible surfaces 101-104. Generally, the visible surfaces 101-103 form a rectangular shape residing on top of a generally planar surface 104.
Projected onto the object 100 is an image, which includes the line 110. In operation, the image of line 110 is received by a viewing device, such as a camera (not shown), and processed in order to determine the shape of that portion of object 100 where the line 110 resides. By moving the line 110 across the object 100, it is possible to map the entire object 100. Limitations associated with using an image comprising a single line 110 are that a significant amount of time is needed to scan the object 100 to provide an accurate map, and that a fixed reference point is needed at either the scanner or the object.
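The time cost of the single-line approach can be made concrete with a short sketch. This is only an illustration, not the patent's method: project_line_at() and capture_profile() are hypothetical stand-ins for whatever hardware steers the line and measures the depth profile along it; the point is simply that one camera frame is needed for every line position.

    def scan_with_single_line(project_line_at, capture_profile, num_positions):
        """Sweep a single projected line across the object (hypothetical interfaces).

        project_line_at(i) -- steer the projected line to position i on the object
        capture_profile()  -- grab a frame and return the depth profile measured
                              along the currently lit line
        """
        depth_map = []
        for i in range(num_positions):   # one frame, and one capture delay, per position
            project_line_at(i)
            depth_map.append(capture_profile())
        return depth_map                 # num_positions frames to map one object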
FIG. 2 illustrates a prior art solution to reduce the amount of time taken to scan an object. Specifically, FIG. 2 illustrates an image including lines 121 through 125. By providing multiple lines, it is possible to scan a greater surface area at once, thus allowing for more efficient processing of data associated with the object 100. Limitations of using patterns such as the one illustrated in FIG. 2 include the need for a fixed reference point, and the fact that the surface resolution capable of being mapped can be reduced because of the potential for improper processing of data due to overlapping of the discrete portions of the image.
In order to better understand the concept of overlapping, it is helpful to understand the scanning process. Prior art FIG. 3 illustrates the shapes of FIGS. 1 and 2 from a side view such that only surface 102 is visible. For discussion purposes, the projection device (not illustrated) projects a pattern in a direction perpendicular to the surface 101, which forms the top edge of surface 102 in FIG. 3. The imaginary line from the center of the projection lens to the surface is referred to as the projection axis, the rotational axis of the projection lens, or the centerline of the projection lens. Likewise, an imaginary line from a center point of the viewing device (not shown), referred to as the view axis, the rotational axis of the view device, or the centerline of the view device, extends in the direction in which the viewing device is oriented.
The physical relationship of the projection axis and the view axis with respect to each other is generally known. In the specific illustration of FIG. 3, the projection axis and the view axis reside in a common plane. The relationship between the projection system and the view system is physically calibrated, such that the relationship between the projector and the view device is known. Note that the term "point of reference" is used to describe the reference from which a third person, such as the reader, is viewing an object.
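Because the projection axis and the view axis are calibrated relative to one another, the depth of a lit point can be recovered by triangulation. The following sketch is a generic two-ray intersection under an assumed geometry (a known baseline between projector and camera, both rays measured from the depth axis); the patent does not spell out this formulation, and the names and numbers are illustrative only.

    import math

    def triangulate_depth(baseline_mm, proj_angle_rad, view_angle_rad):
        """Depth of a lit point from the baseline joining projector and camera.

        baseline_mm    -- assumed known separation of projector and camera centers
        proj_angle_rad -- angle of the projected ray, measured from the depth axis
                          toward the camera
        view_angle_rad -- angle of the viewing ray, measured from the depth axis
                          toward the projector
        """
        denom = math.tan(proj_angle_rad) + math.tan(view_angle_rad)
        if abs(denom) < 1e-9:
            raise ValueError("rays are parallel; depth cannot be recovered")
        return baseline_mm / denom

    # Illustrative numbers: a 300 mm baseline, projector ray 10 degrees and
    # viewing ray 25 degrees off the depth axis.
    print(round(triangulate_depth(300.0, math.radians(10), math.radians(25)), 1))  # 466.8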
FIG. 4 illustrates the object 100 with the image of FIG. 2 projected upon it, where the point of reference coincides with the projection axis. When the point of reference coincides with the projection axis, no discontinuities appear in the projected image. In other words, the lines 121-125 appear to be straight lines upon the object 100. However, where the point of reference coincides with the projection axis, no useful data for mapping objects is obtained, because the lines appear to be undistorted.
FIG. 5 illustrates the object 100, with the image of FIG. 2 projected upon it, from a point of reference along the view axis. In FIG. 5, the surfaces 104, 103 and 101 are visible because the view axis is substantially perpendicular to the line formed by surfaces 101 and 103, and is to the right of the plane formed by surface 102 (see FIG. 2), which is therefore not illustrated in FIG. 5. Because of the angle at which the image is being viewed, or received by the viewing device, the lines 121 and 122 appear to be a single continuous straight line. Likewise, line pairs 122 and 123, and 123 and 124, coincide to give the impression that they are single continuous lines. Because line 125 is projected upon a surface of a single elevation, surface 104, line 125 appears as a single continuous line.
When the pattern of FIG. 5 is received by a processing device to perform a mapping function, the line pairs 121 and 122, 122 and 123, and 123 and 124 will be improperly interpreted as single lines. As a result, the two-tiered object illustrated in FIG. 2 may actually be mapped as a single-level surface, or otherwise inaccurately displayed, because the processing steps cannot distinguish between the line pairs.
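The ambiguity can be reduced to numbers. In the sketch below (illustrative values only, not taken from the patent), identically drawn stripes are imaged 10 pixels apart on a flat reference plane, and a calibrated constant converts sideways stripe shift into surface height. The same observed stripe position yields two very different heights depending on which projected line the processor believes it is looking at, which is exactly the misinterpretation described for FIG. 5.

    STRIPE_SPACING_PX = 10   # image spacing of the projected lines on a flat plane (assumed)
    MM_PER_PX_SHIFT = 0.5    # assumed calibration: 1 px of sideways shift = 0.5 mm of height

    def height_from_stripe(observed_col, assumed_stripe_index):
        """Surface height implied by one observed stripe, GIVEN an assumed identity.

        On a flat plane, stripe k would be imaged at column (k + 1) * STRIPE_SPACING_PX;
        any shift away from that column is attributed to surface height.
        """
        flat_col = (assumed_stripe_index + 1) * STRIPE_SPACING_PX
        return (observed_col - flat_col) * MM_PER_PX_SHIFT

    # A stripe is observed at column 20.  If it is really stripe 0, displaced by a
    # step, the surface there is 5 mm high; if the processor mistakes it for
    # stripe 1 sitting on a flat surface, the step disappears entirely.
    print(height_from_stripe(20, assumed_stripe_index=0))  # 5.0
    print(height_from_stripe(20, assumed_stripe_index=1))  # 0.0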
FIG. 6 illustrates a prior art solution for overcoming the problem described in FIG. 5. Specifically, FIG. 6 illustrates the shape 100 having an image projected upon it whereby a plurality of lines having different line widths, or thicknesses, are used. FIG. 7 illustrates the pattern of FIG. 6 from the same point of reference as that of FIG. 5.
As illustrated in FIG. 7, it is now possible for a processing element analyzing the received data to distinguish between the previously indistinguishable line pairs. Referring to FIG. 7, line 421 is still lined up with line 422 to form what appears to be a continuous line. However, because line 421 and line 425 have different thicknesses, it is now possible for an analysis of the image to determine the correct identity of the specific line segments. In other words, the analysis of the received image can now determine that line 422 projected on surface 104 and line 422 projected on surface 101 are actually a common line. Utilizing this information, the analysis of the received image can determine that a step-type feature occurs on the object being scanned, resulting in the incongruity between the two segments of line 422.
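A minimal sketch of the width-based identification just described, with made-up widths and pixel positions (the patent assigns no numbers to lines 421-425): each observed segment is matched to the projected line whose known thickness it most resembles, and once two collinear but offset segments are found to belong to the same line, the offset between them can be attributed to a step in the object.

    # Hypothetical thicknesses for the projected lines of FIG. 6.
    PROJECTED_WIDTHS = {421: 2, 422: 4, 423: 6, 424: 8, 425: 10}

    def identify_line(observed_width):
        """Return the projected line whose known width best matches a measured segment."""
        return min(PROJECTED_WIDTHS, key=lambda line: abs(PROJECTED_WIDTHS[line] - observed_width))

    # Two segments that appear in the image, one on surface 104 and one on
    # surface 101 (columns and widths are invented for the example):
    seg_on_104 = {"col": 40, "width": 4.1}
    seg_on_101 = {"col": 48, "width": 3.8}

    if identify_line(seg_on_104["width"]) == identify_line(seg_on_101["width"]):
        # Both segments match line 422, so the 8-pixel offset between them is read
        # as a step in the object rather than as two different projected lines.
        print("same line; lateral offset:", seg_on_101["col"] - seg_on_104["col"])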
While the use of varying line thicknesses, as illustrated in FIG. 7, assists in identifying line segments, objects that have varying features of the type illustrated can still result in errors during the analysis of the received image.
FIG. 8 illustrates, from a side point of reference, a structure having a surface 710 with sharply varying features. The surface 710 is illustrated to be substantially perpendicular to the point of reference of FIG. 8. In addition, the object 700 has side surfaces 713 and 715, and top surfaces 711 and 712. From the point of reference of FIG. 8, the actual surfaces 711, 712, 713 and 715 are not viewed; only their edges are represented. The surface 711 is a relatively steeply sloped surface, while the surface 712 is a relatively gently sloped surface.
Further illustrated in FIG. 8 are three projected lines 721 through 723 having various widths. A first line 721 has a width of four. A second projected line 722 has a width of one. A third projected line 723 has a width of eight.
The line 721, having a width of four, is projected onto a relatively flat surface 714. Because of the angle between the projection axis and the view axis, the actual line 721 width viewed at the flat surface 714 is approximately two. If the lines 722 and 723 were also projected upon the relatively flat surface 714, their respective widths would vary by approximately the same proportion as that of line 721, such that the thickness could be detected during the analysis steps of mapping the surface. However, because line 722 is projected onto the angled surface 711, the perspective from the viewing device along the viewing axis is such that the line 722 has a viewed width of two.
Line 722 appears to have a width of two because the steep angle of the surface 710 allows a greater portion of the projected line 722 to be projected onto a greater area of the surface 711. It is this greater area of the surface 711 that is viewed, giving the perception of a wider line.
Rubbert Rudger
Sporbert Peer
Weise Thomas
Carter Aaron
McDonnell & Boehnen Hulbert & Berghoff
OraMetrix Inc.
Patel Jayanti K.