Image processing method and apparatus

Computer graphics processing and selective visual display system – Computer graphics processing – Three-dimension

Reexamination Certificate


Details

Classification: C345S426000
Type: Reexamination Certificate
Status: active
Number: 06256035

ABSTRACT:

BACKGROUND OF THE INVENTION
1. Field of the Invention
The invention relates to an image processing method, apparatus, and storage medium for forming an image at an arbitrary viewpoint position by using a group of images obtained by photographing an object from different viewpoint positions.
2. Related Background Art
When generating an image at an arbitrary viewpoint position from a group of actual picture images photographed at a plurality of viewpoint positions, there is a method whereby the data of the photographed image group is first converted into light space data, and the data is then sampled from the light space, thereby generating an arbitrary viewpoint image.
First, the concept of the light space will be explained. In a three-dimensional space, light is emitted by light sources and reflected by objects. The light crossing a certain point in the three-dimensional space is uniquely determined by five variables indicating its position (x, y, z) and direction (θ, φ). When a function giving the intensity of the light is defined as f, the light group data in the three-dimensional space is expressed by f(x, y, z, θ, φ). When a time change of the light group data is also considered, the light group data is expressed by f(x, y, z, θ, φ; t), and the light group in the three-dimensional space is described as a six-dimensional space. This space is called a light space. Since an ordinary two-dimensional image can be regarded as a record of the light group gathering at a single viewpoint, the two-dimensional image can be considered a record of the two-dimensional data
f(θ, φ)|x=x0, y=y0, z=z0, t=t0
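As a concrete illustration (not taken from the patent itself), the reading of an ordinary image as samples of f(θ, φ) at a fixed viewpoint can be sketched as follows, assuming a hypothetical pinhole camera whose focal length `f_len` is given in pixel units and whose principal point is the image center:

```python
import numpy as np

# Illustrative sketch: a pinhole image is a set of samples of f(theta, phi)
# taken at one fixed viewpoint (x0, y0, z0) and time t0. Each pixel
# corresponds to one ray direction. The camera model here (focal length
# `f_len`, centered principal point) is an assumption for illustration.

def pixel_to_angles(px, py, width, height, f_len):
    """Convert pixel coordinates to ray direction angles (theta, phi), in radians."""
    theta = np.arctan((px - width / 2) / f_len)   # horizontal direction
    phi = np.arctan((py - height / 2) / f_len)    # vertical direction
    return theta, phi

# The center pixel of a 640x480 image looks straight ahead: theta = phi = 0.
theta, phi = pixel_to_angles(px=320, py=240, width=640, height=480, f_len=500.0)
```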
In this instance, attention is paid to the light group that passes through the plane Z = z at t = 0. This plane is called the reference plane. Now, assuming that a horizontal plane (the X-Z plane) perpendicular to the Y axis is considered and that parallax in the vertical direction is ignored (y = 0, φ = 0), the real space is as shown in FIG. 11. The light group emitted from the reference plane is described as f(x, θ) using two variables: a position x and an angle θ. Therefore, for the light group passing through a certain point (X, Z) in the real space, the relation expressed by the following equation is satisfied.
X = x + Z·tan θ  (1)
When a variable u = tan θ is defined, equation (1) becomes:
X = x + uZ  (2)
When the light group observed at a viewpoint position (X, Z) is projected into the (x-u) space of the light space, a straight locus is obtained, as shown in FIG. 12. If the (x-u) plane of the light space is filled with such straight loci on the basis of images photographed at many viewpoint positions, an image at an arbitrary viewpoint position can be generated by sampling the light space data from the (x-u) plane along the straight line given by equation (2).
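The mapping of equation (2) can be sketched numerically as follows (an illustrative example, not the patent's implementation): every ray through a fixed viewpoint (X, Z) lands on the straight line x = X − uZ in the (x-u) plane.

```python
import numpy as np

# Sketch of equation (2): a ray through viewpoint (X, Z) with direction
# theta maps to the light-space point (x, u) where u = tan(theta) and
# x = X - u*Z. All rays of one viewpoint therefore lie on one straight line.

def ray_to_light_space(X, Z, theta):
    """Map a ray through viewpoint (X, Z) with direction theta to (x, u)."""
    u = np.tan(theta)
    x = X - u * Z          # equation (2) rearranged for x
    return x, u

# Rays observed by a camera at (X, Z) = (2, 5) over a fan of directions.
thetas = np.linspace(-0.4, 0.4, 9)
x, u = ray_to_light_space(X=2.0, Z=5.0, theta=thetas)

# The samples satisfy x + u*Z = X exactly: a straight locus in the (x-u) plane.
assert np.allclose(x + u * 5.0, 2.0)
```

Generating a new view then amounts to reading the light space back along the line x = X′ − uZ′ for the new viewpoint (X′, Z′).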
In the prior art, however, an arithmetic operation converting every picture element of the photographed images into a light group has been performed. That is, assuming that there are E photographed images and each image has m×n picture elements, the conversion into a light group requires E×m×n calculations, so the amount of calculation is extremely large.
Further, when the light group is projected into the light space so as to preserve the resolution of the input images and the light space data is discretized, the amount of discretized data becomes enormous.
Moreover, since the discretized light space data includes cells in which no value exists (undefined cells), even when the light space data is resampled in order to generate an arbitrary viewpoint image, a desired image cannot always be generated. To avoid this problem, a method has also been considered whereby, when a picture element whose value is undefined is sampled, the nearest cell having a value is found and the undefined value is substituted by that data. When this method is used, however, extra time is required to generate the arbitrary viewpoint image.
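The nearest-value substitution mentioned above can be sketched as follows (an illustrative 1-D version, not the patent's procedure): for each undefined cell, the search radius is widened until a defined neighbor is found, which is exactly why the fallback costs extra time.

```python
# Illustrative sketch of nearest-value substitution for undefined cells.
# `None` marks a light-space cell with no value; for each such cell we
# scan outward for the closest cell that holds a value. The per-cell
# linear search is the surplus cost described in the text.

def fill_nearest(row):
    """Replace None entries with the nearest defined value in a 1-D row of cells."""
    filled = list(row)
    for i, v in enumerate(filled):
        if v is None:
            # widen the search radius until a defined neighbor is found
            for d in range(1, len(row)):
                left, right = i - d, i + d
                if left >= 0 and row[left] is not None:
                    filled[i] = row[left]
                    break
                if right < len(row) and row[right] is not None:
                    filled[i] = row[right]
                    break
    return filled

print(fill_nearest([7, None, None, 3]))  # [7, 7, 3, 3]
```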
SUMMARY OF THE INVENTION
The invention has been made in consideration of the above problems.
According to the invention, there is provided an image processing method comprising: an image inputting step of inputting an image group photographed at a plurality of viewpoint positions; a projecting step of projecting a part of the inputted image group to a light space; a correspondence relation forming step of forming a correspondence relation between the projected image and the light space; an image generating step of generating an image at an arbitrary viewpoint position on the basis of the formed correspondence relation; and an image displaying step of displaying the generated image.
According to the invention, there is also provided an image processing apparatus comprising: image input means for inputting an image group photographed at a plurality of viewpoint positions; projecting means for projecting a part of the inputted image group to a light space; correspondence relation forming means for forming a correspondence relation between the projected image and the light space; image generating means for generating an image at an arbitrary viewpoint position on the basis of the formed correspondence relation; and image display means for displaying the generated image.
According to the invention, there is further provided a computer-readable storage medium storing a computer program for realizing an image processing method comprising an image inputting step of inputting an image group photographed at a plurality of viewpoint positions, a projecting step of projecting a part of the inputted image group to a light space, a correspondence relation forming step of forming a correspondence relation between the projected image and the light space, an image generating step of generating an image at an arbitrary viewpoint position on the basis of the formed correspondence relation, and an image displaying step of displaying the generated image.
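A minimal end-to-end sketch of the claimed steps, under simplifying assumptions of my own (1-D images, the y = 0 plane, a reference plane at Z = 0, hypothetical focal length and bin counts): rather than converting every pixel of every input image as in the prior art, each light-space cell stores a correspondence (image index, pixel index) back into the input images.

```python
import numpy as np

# Sketch of the claimed pipeline under illustrative assumptions.
# Each (x, u) cell of the discretized light space records which input
# image and pixel supplied it (the "correspondence relation"), instead
# of a copied pixel value.

def project_to_light_space(viewpoints, n_pixels, f_len, x_bins, u_bins):
    """Projecting + correspondence-forming steps: fill an (x, u) lookup table."""
    table = {}
    for img_idx, (X, Z) in enumerate(viewpoints):
        for px in range(n_pixels):
            u = (px - n_pixels / 2) / f_len        # u = tan(theta) for this pixel
            x = X - u * Z                          # equation (2)
            key = (int(np.digitize(x, x_bins)), int(np.digitize(u, u_bins)))
            table[key] = (img_idx, px)             # correspondence relation
    return table

def generate_view(X, Z, n_pixels, f_len, table, x_bins, u_bins):
    """Image generating step: sample along the line x = X - u*Z."""
    out = []
    for px in range(n_pixels):
        u = (px - n_pixels / 2) / f_len
        x = X - u * Z
        key = (int(np.digitize(x, x_bins)), int(np.digitize(u, u_bins)))
        out.append(table.get(key))                 # may be None (undefined cell)
    return out

x_bins = np.linspace(-4, 4, 64)
u_bins = np.linspace(-0.5, 0.5, 64)
# Two input cameras on the line Z = 5.
table = project_to_light_space([(0.0, 5.0), (1.0, 5.0)], 32, 64.0, x_bins, u_bins)
# Querying an input viewpoint reproduces its own correspondences exactly.
view = generate_view(0.0, 5.0, 32, 64.0, table, x_bins, u_bins)
```

Displaying the result would then dereference each (image index, pixel index) pair into an actual pixel value; None entries are where the undefined-cell problem discussed in the background arises.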


REFERENCES:
U.S. Patent 5,805,782 (Sep. 1998), Foran.
U.S. Patent 5,886,704 (Mar. 1999), Kang et al.
U.S. Patent 5,894,309 (Apr. 1999), Freeman et al.
U.S. Patent 5,933,146 (Aug. 1999), Wrigley.
M. Levoy et al., "Light Field Rendering", Computer Graphics Proceedings, Annual Conference Series, pp. 31-42 (1996).
S. Gortler et al., "The Lumigraph", Computer Graphics Proceedings, Annual Conference Series, pp. 43-54 (1996).
A. Katayama et al., "A Viewpoint Dependent Stereoscopic Display Using Interpolation of Multi-Viewpoint Images", SPIE, vol. 2409, pp. 11-20 (1995).
"Epipolar-Plane Image Analysis: An Approach To Determining Structure From Motion", International Journal of Computer Vision, pp. 7-55 (1987).
J. McVeigh et al., "Intermediate View Synthesis Considering Occluded And Ambiguously Referenced Image Regions", Signal Processing: Image Communication, vol. 9, pp. 21-28 (1996).
