Image processing method, image processing apparatus, and...

Image analysis – Image transformation or preprocessing – Changing the image coordinates

Reexamination Certificate


Details

Classification codes: C382S299000, C345S582000, C358S001200
Type: Reexamination Certificate
Status: active
Patent number: 06674922

ABSTRACT:

FIELD OF THE INVENTION
The present invention relates to an image processing apparatus and method for reconstructing, in a virtual space, an image that records image information appended with quantized space information, such as image information expressed by a ray space theory, and more particularly to improving the resolution of such reconstruction.
BACKGROUND OF THE INVENTION
Attempts to describe a virtual space on the basis of a ray space theory have been proposed. See, for example, “Implementation of Virtual Environment by Mixing CG model and Ray Space Data”, IEICE Journal D-II, Vol. J80-D-II, No. 11, pp. 3048-3057, November 1997, or “Mutual Conversion between Hologram and Ray Space Aiming at 3D Integrated Image Communication”, 3D Image Conference, and the like.
A recording method of ray space data will be explained below.
As shown in FIG. 1, a coordinate system O-X-Y-Z is defined in the real space. A light ray that passes through a reference plane P (Z = z), which is perpendicular to the Z-axis, is defined by the position (x, y) where the light ray crosses P and by the variables θ and φ that indicate the direction of the light ray. More specifically, a single light ray is uniquely defined by the five variables (x, y, z, θ, φ). If a function that represents the light intensity of this light ray is defined as f, light ray group data in this space can be expressed by f(x, y, z, θ, φ). This five-dimensional space is called a “ray space”.
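By way of illustration only, the five-variable parameterization can be held in a small record; the type and field names below are illustrative, not part of the original description, and the intensity field simply stands in for the value of f for one ray.

from dataclasses import dataclass

@dataclass
class Ray:
    """One light ray of the five-dimensional ray space f(x, y, z, theta, phi).

    (x, y) is where the ray crosses the reference plane P (Z = z);
    theta and phi give its direction; intensity stands in for the value of f
    for this ray (a scalar here, although it could be an RGB triple)."""
    x: float
    y: float
    z: float
    theta: float   # direction in the X-Z plane, in radians
    phi: float     # vertical direction, in radians
    intensity: float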
If the reference plane P is set at z = 0 and the disparity information of a light ray in the vertical direction, i.e., the degree of freedom in the φ direction, is omitted, the degree of freedom of the light ray can be reduced to two dimensions (x, θ). This x-θ two-dimensional space is a partial space of the ray space. As shown in FIG. 3, if u = tan θ, a light ray (FIG. 2) which passes through a point (X, Z) in the real space is mapped onto a line in the x-u space, which is given by:

X = x + uZ  (1)
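As a rough sketch of equation (1) (the function name and the sample values are illustrative, not taken from the patent), the following code maps a ray that leaves a real-space point (X, Z) at an angle θ to its x-u coordinates and checks that equation (1) holds:

import math

def ray_to_xu(X, Z, theta):
    """Map a light ray through the real-space point (X, Z) with direction
    theta to its (x, u) coordinates on the reference plane (z = 0)."""
    u = math.tan(theta)   # u = tan(theta)
    x = X - u * Z         # rearranged form of equation (1): X = x + uZ
    return x, u

# All rays through one real-space point fall on a single line in the x-u space.
X, Z = 2.0, 5.0
for theta in (-0.3, 0.0, 0.2, 0.4):
    x, u = ray_to_xu(X, Z, theta)
    assert abs((x + u * Z) - X) < 1e-9   # equation (1) holds

Sweeping θ while holding (X, Z) fixed traces out exactly the line X = x + uZ, which is why a single real-space point corresponds to a line in the x-u space.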
Image sensing by a camera amounts to receiving the light rays that pass through the focal point of the camera lens onto an image sensing surface, and converting their brightness levels and colors into an image signal. In other words, the light ray group which passes through one point in the real space, i.e., the focal point, is captured as an image represented by a number of pixels. Since the degree of freedom in the φ direction is omitted and the behavior of a light ray is examined only in the X-Z plane, only the pixels on a line segment that intersects a plane perpendicular to the Y-axis need be considered. In this manner, by sensing an image, light rays that pass through one point can be collected, and data on a single line segment in the x-u space can be captured by a single image sensing operation.
When this image sensing is done a large number of times by changing the view point position, light ray groups which pass through a large number of points can be captured. When the real space is sensed using N cameras, as shown in FIG. 4, data on a line given by:

x + Z_n u = X_n  (2)

can be input in correspondence with a focal point position (X_n, Z_n) of the n-th camera (n = 1, 2, . . . , N), as shown in FIG. 5. In this way, when an image is sensed from a sufficiently large number of view points, the x-u space can be densely filled with data.
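The following sketch accumulates the contribution of each camera; the discretisation of the x-u plane, the nearest-cell rounding, and all names are assumptions made for illustration, and a real system would store the captured pixel colours rather than a sample count.

import numpy as np

def fill_xu_space(focal_points, x_axis, u_axis):
    """For each camera focal point (X_n, Z_n), mark the x-u cells lying on
    the line x + Z_n u = X_n of equation (2); only sample counts are kept."""
    xu = np.zeros((len(u_axis), len(x_axis)))
    for X_n, Z_n in focal_points:
        for j, u in enumerate(u_axis):
            x = X_n - Z_n * u                          # solve equation (2) for x
            if x_axis[0] <= x <= x_axis[-1]:
                i = int(np.argmin(np.abs(x_axis - x)))   # nearest x cell
                xu[j, i] += 1.0
    return xu

# Example: eight cameras spread along the X-axis at Z = 3.
cameras = [(X_n, 3.0) for X_n in np.linspace(-2.0, 2.0, 8)]
xu = fill_xu_space(cameras, np.linspace(-4, 4, 256), np.linspace(-1, 1, 128))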
Conversely, an observed image from a new arbitrary view point position can be generated (FIG. 7) from the data of the x-u space (FIG. 6). As shown in FIG. 7, an observed image from a new view point position E(X, Z), indicated by an eye mark, can be generated by reading out data on a line given by equation (1) from the x-u space.
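Generating the new view is the reverse read-out. In the hedged sketch below, nearest-neighbour sampling and the function name are illustrative choices; the text above only specifies reading data along the line of equation (1) for the viewpoint E(X, Z).

import numpy as np

def synthesize_view(xu_data, x_axis, u_axis, X, Z):
    """For the new viewpoint E(X, Z), read one stored value per direction u
    along the line x = X - uZ, i.e. the x-u line of equation (1)."""
    view = np.zeros(len(u_axis))
    for j, u in enumerate(u_axis):
        x = X - u * Z
        if x_axis[0] <= x <= x_axis[-1]:
            i = int(np.argmin(np.abs(x_axis - x)))   # nearest-neighbour read-out
            view[j] = xu_data[j, i]
    return view

An array produced by fill_xu_space above could be passed directly as xu_data.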
One major feature of ray space data is that it is defined for each pixel; that is, frame data for one scene is expressed by ray space data alone. Hence, the data size of ray space data does not depend on the complexity of a scene, but only on the size and resolution of the scene. For this reason, when a given scene is complicated, normal CG data cannot express that complexity unless the number of polygons is made large (and an increase in the number of polygons leads to an increase in computation volume), whereas ray space data does not require any increase in data size.
In other words, image data such as ray space data, in which the image information carries space information, requires an image process over the pixels of one frame in order to reconstruct an image.
However, when an enlarged image is reconstructed from ray space data, or when the field angle is decreased to be smaller than that of the original image sensing, a plurality of pixels must refer to the same ray space data because the data size of the ray space data remains the same. This is caused by the enlargement of an image from ray space data, which requires processes for the new pixels added as a result of the enlargement, as described above. Such repeated references to the same ray space data, although necessary, are a wasteful process.
SUMMARY OF THE INVENTION
The present invention has been proposed to solve the conventional problems, and has as its object to provide an image processing apparatus and method which can reconstruct an enlarged image at high speed, without any wasteful pixel references, when an enlarged image is reconstructed from image data such as ray space data in which image information includes space information, or when the field angle is decreased to be smaller than that at the time of image sensing.
In order to achieve the above object, an image processing apparatus of the present invention comprises:
recording means which has recorded, in units of pixels, image information appended with space information quantized at a predetermined first resolution;
setting means for setting a second resolution higher than the first resolution; and
sample/interpolation means for, upon reading out image information from the recording means at the second resolution, sampling the image information at the first resolution and assigning the sampled image information, and interpolating pixels at the empty pixel positions produced by the difference between the second and first resolutions.
Further, a method of the present invention that achieves the above object is an image processing method for reconstructing an image on the basis of image information read out from recording means which has recorded, in units of pixels, image information appended with space information quantized at a predetermined first resolution, the method comprising:
a setting step of setting a second resolution higher than the first resolution; and
a sample/interpolation step of, upon reading out image information from the recording means at the second resolution, sampling the image information at the first resolution and assigning the sampled image information, and interpolating pixels at the empty pixel positions produced by the difference between the second and first resolutions.
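A minimal one-dimensional sketch of this sample/interpolation step follows; the linear fill-in and all names are assumptions for illustration, since the description leaves the interpolation method open. Each recorded pixel at the first resolution is read once and assigned to its position at the second, higher resolution, and the empty pixel positions produced by the resolution difference are filled by interpolation instead of further references to the recorded data.

import numpy as np

def sample_and_interpolate(recorded, second_res):
    """Upsample 'recorded' (first resolution) to 'second_res' samples.

    Each recorded value is assigned once to its output position, and the
    empty positions produced by the resolution difference are filled by
    linear interpolation instead of extra references to the recorded data."""
    first_res = len(recorded)
    out = np.full(second_res, np.nan)
    # Positions of the first-resolution samples within the second resolution.
    positions = np.round(np.linspace(0, second_res - 1, first_res)).astype(int)
    out[positions] = recorded
    empty = np.isnan(out)
    out[empty] = np.interp(np.flatnonzero(empty),
                           np.flatnonzero(~empty), out[~empty])
    return out

print(sample_and_interpolate(np.array([0.0, 10.0, 20.0]), 7))
# [ 0.   3.33  6.67 10.  13.33 16.67 20. ]  (approximately)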
As a preferred aspect of the present invention, the image information is expressed by a ray space theory.
As a preferred aspect of the present invention, when the ray space data has different resolutions in different coordinate axis directions, the recorded ray space data is sampled at the lower of the resolutions.
As a preferred aspect of the present invention, the sample/interpolation means interpolates pixels by texture mapping.
As a preferred aspect of the present invention, the sample/interpolation means pastes texture data enlarged by a magnification equal to the ratio of the second resolution to the first resolution.
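For the texture-mapping aspect, the sketch below enlarges a texture by the magnification second resolution / first resolution; it is a plain bilinear resampler written for illustration, not the texture-mapping path the apparatus itself would use.

import numpy as np

def enlarge_texture(tex, magnification):
    """Enlarge a 2-D texture by 'magnification' (= second resolution / first
    resolution) using bilinear interpolation between the original texels."""
    tex = np.asarray(tex, dtype=float)
    h, w = tex.shape
    H, W = int(round(h * magnification)), int(round(w * magnification))
    ys = np.linspace(0, h - 1, H)   # source row coordinate of each output row
    xs = np.linspace(0, w - 1, W)   # source column coordinate of each output column
    y0, x0 = np.floor(ys).astype(int), np.floor(xs).astype(int)
    y1, x1 = np.minimum(y0 + 1, h - 1), np.minimum(x0 + 1, w - 1)
    wy, wx = (ys - y0)[:, None], (xs - x0)[None, :]
    top = tex[y0][:, x0] * (1 - wx) + tex[y0][:, x1] * wx
    bot = tex[y1][:, x0] * (1 - wx) + tex[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

enlarged = enlarge_texture(np.arange(9).reshape(3, 3), 2.0)   # 3x3 texture -> 6x6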
The above object can also be achieved by a program for implementing the image processing method stored in a storage medium.


