Method and system for tracking vantage points from which...
Classification: Image analysis – Applications – 3-d or stereo imaging analysis
Type: Reexamination Certificate
Status: active
Patent number: 06222937
Filed: 1997-06-30
Issued: 2001-04-24
Examiner: Boudreau, Leo H. (Department: 2721)
U.S. Classes: C345S111000, C345S182000, C345S215000
ABSTRACT:
TECHNICAL FIELD
The invention relates to a computer imaging system and, more particularly, to a method and system for representing a complete 3-dimensional image of an object.
BACKGROUND OF THE INVENTION
Current computer graphics systems create images of graphics objects by first modeling the geometric and surface attributes of the objects in a virtual environment, along with any light sources. The graphics systems can then render images of the objects in the virtual environment from the vantage point of a virtual camera. Great effort has been expended to develop graphics systems that allow objects with complex geometries and material attributes to be modeled, and to produce systems that simulate the propagation of light through virtual environments to create realistic images.
Such modeling of objects typically involves generating a 3-dimensional representation of the objects using, for example, a polygon mesh to represent the surface of the object. Each polygon in a polygon mesh has vertices in space that define the perimeter of that polygon; each polygon thus corresponds to a small portion of the surface of the object being modeled. To increase the resolution of the model, the number of polygons can be increased to represent minute nuances in the shape of the object.
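For concreteness, such a mesh can be pictured as a shared list of vertex positions together with, for each polygon, the indices of the vertices that define its perimeter. The following is a minimal sketch in Python; the class name and fields are illustrative assumptions and do not correspond to any particular graphics system's format.

    # Illustrative sketch of a polygon mesh: shared vertex positions plus,
    # for each polygon, the indices of the vertices on its perimeter.
    from dataclasses import dataclass, field

    @dataclass
    class PolygonMesh:
        vertices: list = field(default_factory=list)   # [(x, y, z), ...]
        polygons: list = field(default_factory=list)   # [[i0, i1, i2, ...], ...]

    # One triangle modeling a small portion of an object's surface:
    mesh = PolygonMesh(
        vertices=[(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)],
        polygons=[[0, 1, 2]],
    )
    # Increasing the resolution of the model means adding more, smaller polygons.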
The rendering of an image from such a polygon mesh can be computationally intensive and may not accurately represent the image of the real object in a real environment. The rendering is computationally intensive because, for each polygon that corresponds to a portion of the object visible in the image, the graphics system must first identify that polygon as visible, determine the effects of the light sources, and then apply those effects to the image; the processing at each of these steps can be costly. Moreover, such graphics systems have had very little success in determining the effects of a light source on the object: a complex object can be made of many different materials, and the reflective characteristics of those materials have proved difficult to model accurately.
Therefore, it has remained difficult or impossible to recreate much of the complex geometry and subtle lighting effects found in the real world. To bypass the modeling problem, recently there has been interest in capturing the geometry, material properties, and motion of objects directly from the real world. This approach typically involves some combination of cameras, structured light, range finders, and mechanical sensing devices such as 3-dimensional digitizers and associated software. When successful, the results can be fed into a rendering program to create images of real objects and scenes. Unfortunately, these systems are still unable to completely capture small details in geometry and material properties. Existing rendering methods also continue to be limited in their capability to faithfully reproduce real world illumination, even if given accurate geometric models.
In certain systems, the traditional modeling/rendering process has been skipped. Instead, these systems capture a series of environment images and allow a user to look around an object from fixed vantage points. Although these captured images accurately represent the object from the fixed vantage points, the usefulness of such systems is limited because the objects can only be viewed from those fixed vantage points. Attempts to interpolate an image of the objects from other vantage points have generally proved unsuccessful. It would be desirable to have a system for efficiently representing the complete appearance of an object so that accurate images of the object can be rendered from arbitrary vantage points.
SUMMARY OF THE INVENTION
The present invention provides a method and system for 3-dimensional imaging of an object. The system collects images of the object from a plurality of vantage points. For each of a plurality of points on a surface surrounding the object, and for each of a plurality of directions, the system determines, based on the collected images, the intensity value of the light that emanates from the object and passes through that point in that direction. The system then stores each determined intensity value in a data structure indexed by the point on the surface and by the direction. From this 4-dimensional representation of the appearance of the object, the system can render images of the object from arbitrary vantage points. For each pixel of an image to be rendered, the system interpolates the intensity value of that pixel from the stored intensity values.
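As an illustration of the kind of data structure and interpolation described above, the following Python sketch stores intensity values indexed by a surface point (s, t) and a direction (u, v), and blends stored samples when queried at fractional coordinates. The class name, grid resolutions, and quadrilinear interpolation scheme are assumptions made for the sketch, not a statement of the patented implementation.

    import numpy as np

    class LumigraphSketch:
        """Illustrative 4-D table of intensity values: intensity[s, t, u, v, c]
        is the light of color channel c leaving the object through surface
        point (s, t) in direction (u, v)."""

        def __init__(self, surface_res=32, direction_res=16, channels=3):
            self.intensity = np.zeros(
                (surface_res, surface_res, direction_res, direction_res, channels))

        def store(self, s, t, u, v, value):
            # Record an intensity value determined from the collected images.
            self.intensity[s, t, u, v] = value

        def lookup(self, s, t, u, v):
            # Estimate the intensity at fractional (s, t, u, v) coordinates by
            # blending the 16 surrounding stored samples (quadrilinear
            # interpolation over the four index dimensions).
            idx = np.array([s, t, u, v], dtype=float)
            lo = np.floor(idx).astype(int)
            hi = np.minimum(lo + 1, np.array(self.intensity.shape[:4]) - 1)
            frac = idx - lo
            result = np.zeros(self.intensity.shape[-1])
            for corner in range(16):                     # 2**4 corner samples
                bits = [(corner >> k) & 1 for k in range(4)]
                coord = tuple(hi[k] if bits[k] else lo[k] for k in range(4))
                weight = np.prod([frac[k] if bits[k] else 1.0 - frac[k]
                                  for k in range(4)])
                result += weight * self.intensity[coord]
            return result

Rendering a pixel of an image from a new vantage point then amounts to working out which (s, t, u, v) coordinates the pixel's viewing ray corresponds to and calling lookup.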
In another aspect, the system generates coefficients for a plenoptic function describing the light that passes through each point on a closed surface. The system selects an overall direction of interest, inward or outward, to indicate whether an object inside the surface or a scene outside the surface is to be imaged. The system then selects a plurality of points on the surface. For each selected point and for each of a plurality of directions consistent with the selected overall direction, the system estimates the intensity value of the light ray that passes through that point in that direction.
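Continuing the sketch above, the generation step can be pictured as a loop over the selected points and directions. The estimate_ray callback below is a hypothetical stand-in for however the intensity of a ray is estimated from the collected images; it is not taken from the patent.

    def build_lumigraph(lumigraph, estimate_ray, surface_res=32,
                        direction_res=16, inward=True):
        # `inward` records the selected overall direction of interest: True to
        # image an object inside the surface, False to image the scene outside.
        # `estimate_ray(s, t, u, v, inward)` is an assumed callback returning an
        # intensity value for the ray through surface point (s, t) in direction
        # (u, v) on the selected side of the surface.
        for s in range(surface_res):
            for t in range(surface_res):
                for u in range(direction_res):
                    for v in range(direction_res):
                        lumigraph.store(s, t, u, v,
                                        estimate_ray(s, t, u, v, inward))

For example, build_lumigraph(LumigraphSketch(), lambda s, t, u, v, inward: np.zeros(3)) fills the table with black; a real estimator would instead resample the captured camera images.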
In yet another aspect, the invention comprises a computer-readable medium that contains a data structure for representing the appearance of an object. The data structure has, for each of a plurality of points surrounding the object and for each of a plurality of directions, an intensity value representing light that emanates from the object in that direction and passes through the point.
In the following, the term Lumigraph function refers to a 4-dimensional plenoptic function that represents the complete appearance of an object. The term Lumigraph data structure refers to a data structure that contains the coefficients of the Lumigraph function. The term Lumigraph point refers to a pre-defined vantage point on a surface surrounding an object, and the term Lumigraph direction refers to a pre-defined direction from a Lumigraph point.
The file of this patent contains at least one drawing executed in color. Copies of this patent with the color drawings will be provided by the Patent and Trademark Office upon request and payment of the necessary fee.
REFERENCES:
patent: 3613539 (1971-10-01), Dudley
patent: 3958078 (1976-05-01), Fowler et al.
patent: 4202008 (1980-05-01), King
patent: 4238828 (1980-12-01), Hay et al.
patent: 4672564 (1987-06-01), Egli et al.
patent: 4678329 (1987-07-01), Lukowski, Jr. et al.
patent: 4753569 (1988-06-01), Pryor
patent: 4831659 (1989-05-01), Miyaoka et al.
patent: 4834531 (1989-05-01), Ward
patent: 4878735 (1989-11-01), Vilums
patent: 4928250 (1990-05-01), Greenberg et al.
patent: 4933864 (1990-06-01), Evans, Jr. et al.
patent: 4988202 (1991-01-01), Nayar et al.
patent: 5228098 (1993-07-01), Crinon et al.
patent: 5283560 (1994-02-01), Bartlett
patent: 5422987 (1995-06-01), Yamada
patent: 5467404 (1995-11-01), Vuylsteke et al.
patent: 5488700 (1996-01-01), Glassner
patent: 5502482 (1996-03-01), Graham
patent: 5515447 (1996-05-01), Zheng et al.
patent: 5521724 (1996-05-01), Shires
patent: 5550758 (1996-08-01), Corby, Jr. et al.
patent: 5600368 (1997-02-01), Matthews, III
patent: 5613048 (1997-03-01), Chen et al.
patent: 5649032 (1997-07-01), Burt et al.
patent: 5659630 (1997-08-01), Forslund
patent: 5675377 (1997-10-01), Gibas
patent: 5696552 (1997-12-01), Aritake et al.
patent: 5717782 (1998-02-01), Denneau, Jr.
patent: 5724743 (1998-03-01), Jackson
patent: 5727078 (1998-03-01), Chupeau
patent: 5731902 (1998-03-01), Williams et al.
patent: 5748505 (1998-05-01), Greer
patent: 5751843 (1998-05-01), Maggioni et al.
patent: 5754317 (1998-05-01), Nakagawa et al.
patent: 5768443 (1998-06-01), Michael et al.
patent: 5825483 (1998-10-01), Michael et al.
patent: 5825666 (1998-10-01), Freifeld
patent: 5831619 (1998-11-01), Nakagawa et al.
patent: 5831735 (1998-11-01), Corby, Jr.
patent: 5832139 (1998-11-01), Batterman et al.
patent: 5850352 (1998-12-01), Moezzi et al.
patent: 5850469 (1998-12-01), Martin et al.
Cohen Michael F.
Grzeszczuk Radek
Boudreau Leo H.
Microsoft Corporation
Werner Brian P.
Woodcock Washburn Kurtz Mackiewicz & Norris LLP