Camera calibration apparatus and method, image processing...

Television – Monitoring, testing, or measuring – Testing of camera


Details

US classification: C382S289000
Type: Reexamination Certificate
Status: active
Patent number: 06816187

ABSTRACT:

BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a camera calibration method and apparatus for calculating a parameter representative of a characteristic of a camera, and more particularly to a camera calibration method and apparatus for calculating a parameter of a camera which is of a type picking up an image of a subject to output electronic image data.
More specifically, the invention relates to a camera calibration method and apparatus capable of stably estimating a parameter with high accuracy on the basis of a single picked-up image.
2. Description of the Related Art
With the recent progress of image processing technology, general-purpose computer systems offering high-level functions and large arithmetic processing capability have come into widespread use, for example, among research organizations, enterprise offices and general homes. In addition, the field of computer applications has expanded, and not only computer data but also other data, including images and voices, are translated into an electronic form that computers can handle. For example, electronic image data captured through an image pickup means, such as a digital camera, and then read into a computer can be processed in diverse ways through the use of computer resources, for image composition, image deformation and the like.
Most existing cameras perform central projection based on a pinhole camera model. Central projection forms a projected image by putting the color density of a point P on the surface of a three-dimensional object at the intersection between a straight line (also referred to as a "line of sight") connecting the projection center C with the point P on the object surface and the projection screen of the camera. In central projection, for objects of identical size, an object projects a larger image as it approaches the projection center C of the camera, and a smaller image as it recedes from the projection center C.
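For orientation only, the following is a minimal numerical sketch of this central projection under a pinhole model, assuming the projection center C sits at the origin and the projection screen is the plane z = f; the function name and all numbers are illustrative and are not taken from the patent.

import numpy as np

def project_pinhole(P, f=1.0):
    """Project a 3-D point P = (X, Y, Z) onto the projection screen z = f."""
    X, Y, Z = P
    # The image point is where the line of sight from the projection center C
    # (the origin) to P crosses the projection screen.
    return np.array([f * X / Z, f * Y / Z])

# Two points with the same lateral offset: the nearer one projects farther from
# the image center, i.e. the nearer object yields the larger image.
print(project_pinhole(np.array([0.5, 0.0, 2.0])))    # close to C  -> [0.25, 0.0]
print(project_pinhole(np.array([0.5, 0.0, 10.0])))   # far from C  -> [0.05, 0.0]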
Furthermore, it is obvious from geometric optics that an image taken (photographed) from an oblique direction with respect to the front of a subject is a projection image obtainable by projective transformation of an image taken from a position directly facing the front. The fact that such a projection image is obtainable from the front image according to a projective transformation matrix H has been well known in the technical field of image processing. For example, if the front image is electronic image data captured through a digital camera, applying a projective transformation to it through the use of computer resources makes it possible to calculate, easily and at a relatively high speed, a projection image equivalent to one taken from an arbitrary direction (line of sight). For example, "Understanding of Image" (1990), written by Kenichi Kanaya and published by Morikita Shuppan, discloses that the original image is convertible into an image viewed from a different angle through a projective transformation matrix.
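As a rough illustration of such a conversion by computer resources, the sketch below warps a synthetic front image into an oblique-looking view with OpenCV's warpPerspective; the homography H and the image content are arbitrary examples, not values from the patent or the cited book.

import numpy as np
import cv2

# A synthetic "front" image: a white, generally square pattern on a dark background.
front = np.zeros((240, 320, 3), dtype=np.uint8)
cv2.rectangle(front, (110, 70), (210, 170), (255, 255, 255), -1)

# An arbitrary example of a 3x3 projective transformation matrix H.
H = np.array([[1.0, 0.15,   0.0],
              [0.0, 1.0,    0.0],
              [0.0, 0.0008, 1.0]])

# Warp the front view so that it appears as if taken from an oblique direction.
oblique = cv2.warpPerspective(front, H, (front.shape[1], front.shape[0]))
cv2.imwrite("oblique_view.png", oblique)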
This property of geometric optics, related to the projective transformation, also applies to, for example, a method of measuring the distance to an object according to the "stereo-method". Here, the "stereo-method" signifies a method of measuring the distances between points in a scene, that is, in a picked-up image, and the projection centers through the use of images taken from a plurality of station (view) points (projection centers) having a predetermined positional relation to each other, according to the so-called "triangulation" principle.
In this specification, for convenience of description, the stereo-method is conducted with two station points, that is, two cameras. One camera is used as a base camera, which picks up an image of a subject from a position directly facing its front to output a base image. The other camera is a reference camera, which captures an image of the subject from an oblique direction to output a reference image.
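To make the triangulation idea concrete, here is a minimal sketch assuming the 3x4 projection matrices of the base and reference cameras are known; the linear (DLT) solution and the toy geometry are illustrative choices, not the procedure claimed by the patent.

import numpy as np

def triangulate(P_b, P_d, n_b, n_d):
    """Linear (DLT) triangulation of one point observed by two cameras."""
    A = np.vstack([
        n_b[0] * P_b[2] - P_b[0],
        n_b[1] * P_b[2] - P_b[1],
        n_d[0] * P_d[2] - P_d[0],
        n_d[1] * P_d[2] - P_d[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

def project(P, X):
    """Central projection of a 3-D point X by a 3x4 camera matrix P."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Toy setup: base camera at the origin, reference camera shifted along X.
P_b = np.hstack([np.eye(3), np.zeros((3, 1))])
P_d = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.2, 0.1, 4.0])                       # a point P on the subject
n_b, n_d = project(P_b, X_true), project(P_d, X_true)    # corresponding image points
print(triangulate(P_b, P_d, n_b, n_d))                   # ~ [0.2, 0.1, 4.0]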
FIG. 10 illustratively shows the locations of a base camera and a reference camera with respect to a subject, and FIG. 11 illustratively shows a base image of a generally square pattern and a reference image thereof taken through the use of the base camera and the reference camera, respectively.
As FIG. 10 shows, a point P appears at the intersection n_b between a straight line connecting the point P on a plane forming the subject with the projection center C_b of the base camera and the projection screen S_b of the base camera. The straight line connecting the point P and the projection center C_b of the base camera is a line of sight of the base camera. Additionally, the point P appears at the intersection n_d between a straight line connecting the point P with the projection center C_d of the reference camera and the projection screen S_d of the reference camera. The straight line connecting the point P and the projection center C_d of the reference camera is a line of sight of the reference camera.
When subjected to a projective transformation, which is described by a projective transformation matrix H, the line of sight of the base camera becomes the line of sight of the reference camera. The line of sight of the base camera is observed as a straight line on the projection screen of the reference camera, and this straight line is called the "epipolar line".
Furthermore, as FIG. 11 shows, a picked-up image taken by the base camera, which directly faces the generally square pattern, becomes square. By contrast, an image taken by the reference camera, which views this pattern from an oblique direction, appears as a trapezoid because the side at a longer distance from the station point is reduced. This follows from the basic characteristic of central projection noted above: for objects of identical size, an object projects a larger image as it approaches the projection center C of a camera and a smaller image as it recedes from the projection center C.
As mentioned above, the picked-up image I_d by the reference camera equals an image resulting from the projective transformation of the picked-up image I_b by the base camera. That is, the relationship between a point n_b (x_b, y_b) in the picked-up image I_b by the base camera and the corresponding point n_d (x_d, y_d) in the picked-up image I_d by the reference camera is given by the following equation, where H represents a 3×3 projective transformation matrix.
[Equation 1]
n_d = H · n_b
The projective transformation matrix H is a matrix that implicitly contains the internal parameters and external parameters of the cameras as well as the plane equation, and it has eight degrees of freedom because it is determined only up to a scale factor. Incidentally, "Understanding of Image" (1990), written by Kenichi Kanaya (published by Morikita Shuppan), states that the corresponding points between a base image and a reference image are obtainable through the projective transformation.
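The scale-factor ambiguity behind the eight degrees of freedom can be checked numerically: in the sketch below, which uses an arbitrary example matrix, H and any nonzero multiple of H map a point n_b to the same n_d once the result is dehomogenized.

import numpy as np

# An arbitrary example of a 3x3 projective transformation matrix.
H = np.array([[0.9,    0.05,   12.0],
              [0.02,   1.1,    -3.0],
              [0.0005, 0.0002,  1.0]])

def map_point(H, n_b):
    """Compute n_d = H . n_b in homogeneous coordinates, then dehomogenize."""
    u, v, w = H @ np.array([n_b[0], n_b[1], 1.0])
    return np.array([u / w, v / w])

n_b = (40.0, 25.0)
print(map_point(H, n_b))          # n_d
print(map_point(3.7 * H, n_b))    # identical n_d: H is defined only up to scale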
The line of sight of the base camera appears as a straight line, called the "epipolar line", on the projection screen S_d of the reference camera (refer to the above description and FIG. 10). The point P existing on the line of sight of the base camera appears at the same observation point n_b on the projection screen S_b of the base camera, irrespective of the depth of the point P, that is, its distance from the base camera. On the other hand, the observation point n_d for the point P on the projection screen S_d of the reference camera appears at a position on the epipolar line that depends on the distance between the base camera and the point P.
FIG. 12 is an illustration of the state of the observation point n_d on the projection screen S_d of the reference camera. As illustrated in FIG. 12, as the position of the point P shifts from P_1 through P_2 to P_3, the observation point n_d shifts accordingly along the epipolar line on the projection screen of the reference camera.
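The behaviour illustrated by FIG. 12 can be reproduced with a toy configuration (the camera geometry below is made up and is not that of the figure): points taken at three depths along one line of sight of the base camera all project to the same n_b in the base image, while their projections n_d in the reference image slide along a single straight line, the epipolar line.

import numpy as np

# Base camera at the origin; reference camera displaced along X (toy geometry).
P_b = np.hstack([np.eye(3), np.zeros((3, 1))])
P_d = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

def project(P, X):
    """Central projection of a 3-D point X by a 3x4 camera matrix P."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# P1, P2, P3: three depths along a single line of sight of the base camera.
ray = np.array([0.1, 0.05, 1.0])
depths = (2.0, 4.0, 8.0)

print([project(P_b, d * ray) for d in depths])   # the same n_b every time
n_d = [project(P_d, d * ray) for d in depths]
print(n_d)                                       # shifts along one straight line

# Collinearity check: the signed area spanned by the three n_d is (near) zero.
a, b, c = n_d
print((b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0]))   # ~ 0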
