Optics: measuring and testing – Velocity or velocity/height measuring – With light detector
Reexamination Certificate
2002-03-18
2004-01-27
Tarcza, Thomas H. (Department: 3662)
C382S107000, C348S700000, C396S055000
active
06683677
ABSTRACT:
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to an apparatus for calculating optical flow and camera motion in a moving image using correlation matching and a system model, and more particularly, to such an apparatus in which the optical flow is first acquired from a sequence of moving images and the camera motion is then calculated from it.
2. Background of the Related Art
To date, three approaches have been proposed for calculating optical flow by extracting feature points from a sequence of moving images: a gradient-based approach, a frequency-based approach, and a correlation-based approach.
The Horn and Schunck algorithm is typical of the gradient-based approach.
In this approach, the flow at each pixel is found by minimizing a cost that combines the variation of the gray values of neighboring pixels with the variation of the gray value between image frames. Its greatest disadvantage is that the gray value present in the current frame is assumed to reappear in the next frame while the object may have moved, so the approach is difficult to apply to fast camera motion and hence to a real-time system.
The frequency-based approach calculates the optical flow by applying velocity-tuned band-pass filters, such as Gabor filters, to all of the pixel values in the image. It shares the same disadvantage as the gradient-based approach.
The correlation-based approach is used, for example, to search for a moving object in an MPEG image. It produces many errors when the image is rotated or the zoom level changes, so an auxiliary method is required.
Thus, the existing methods cannot calculate the optical flow in real time because they compute it for every pixel in the image, and optical flow calculation using correlation matching has the further drawback that errors occur whenever the image rotates or the zoom level changes.
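For reference, the correlation-based approach amounts to block matching: a block around each feature point in the previous frame is compared against candidate blocks in a search window of the current frame, and the best-matching displacement is taken as the flow vector. A minimal Python sketch, assuming grayscale NumPy images and a sum-of-absolute-differences cost; the cost function, block size, and search range are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def block_match(prev: np.ndarray, curr: np.ndarray,
                x: int, y: int, block: int = 8, search: int = 16):
    """Find the displacement (dx, dy) of the block centered at (x, y) in
    `prev` that best matches `curr`, by minimizing the sum of absolute
    differences (SAD) over a +/- `search` pixel window. Purely illustrative;
    assumes (x, y) lies well inside the image."""
    half = block // 2
    template = prev[y - half:y + half, x - half:x + half].astype(np.int32)
    best_cost, best_disp = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            candidate = curr[yy - half:yy + half, xx - half:xx + half]
            if candidate.shape != template.shape:  # skip out-of-bounds shifts
                continue
            cost = np.abs(candidate.astype(np.int32) - template).sum()
            if cost < best_cost:
                best_cost, best_disp = cost, (dx, dy)
    return best_disp  # optical-flow vector for this feature point
```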
SUMMARY OF THE INVENTION
Accordingly, the present invention is directed to an apparatus for calculating optical flow and camera motion using correlation matching and a system model in a moving image that substantially obviates one or more problems due to limitations and disadvantages of the related art.
An object of the present invention is to provide an apparatus for calculating optical flow and camera motion using correlation matching and a system model in a moving image, in which optical flow is calculated in real time, and may be calculated accurately even when an image rotation or zoom change occurs.
To achieve these objects and other advantages, the present invention calculates the optical flow on the basis of a correlation-based approach and, upon the occurrence of an image rotation or zoom level change, additionally uses an estimate, derived from the system model and the camera motion calculated in the previous frame, of where the optical flow calculated in the previous frame should be positioned in the current frame.
In addition, since real-time processing is impossible when the optical flow of every pixel is calculated, a SUSAN edge operation is used to keep the number of optical flows from exceeding a certain limit. When calculating the optical flow, the correlation matching value is combined with the estimated optical flow location, and the weighted value given to correlation matching is reduced according to the rotation and translation of the optical axis of the CCD camera, as sketched below.
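A minimal sketch of these two ideas in Python; the exponential decay of the correlation weight and the cap of 200 feature points are illustrative assumptions, since the patent text does not give concrete values here:

```python
import numpy as np

def correlation_weight(axis_motion: float, base: float = 1.0,
                       decay: float = 5.0) -> float:
    """Reduce the correlation-matching weight as the estimated rotation and
    translation of the camera's optical axis grow, so that the model-based
    location estimate dominates under fast camera motion. The exponential
    form and its constants are assumptions."""
    return base * np.exp(-decay * abs(axis_motion))

def cap_feature_points(points, responses, max_points: int = 200):
    """Keep at most `max_points` feature points, preferring the strongest
    SUSAN edge responses, so that the optical-flow set stays small enough
    for real-time processing. `max_points` is an assumed value."""
    order = np.argsort(responses)[::-1][:max_points]
    return [points[i] for i in order]
```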
According to one aspect of the present invention, there is provided an apparatus for calculating optical flow and camera motion using correlation matching and a system model in moving images. The apparatus includes a feature point extracting section for extracting a feature point of a sequence of input images (previous image and current image); an optical flow calculating section for calculating an optical flow by use of the feature points extracted by the feature point extracting section; a first camera motion calculating section for calculating a camera motion by use of the optical flow calculated by the optical flow calculating section; a second camera motion calculating section for eliminating an incorrect optical flow among the optical flows calculated by the optical flow calculating section and recalculating camera motion; an optical flow location estimating section for estimating a location of the optical flow by estimating a distance difference between an estimated feature point location and a current feature point location in the previous image according to the camera motion calculated by the second camera motion calculating section; and a weighted value calculating section for calculating a weighted value according to the camera motion calculated by the second camera motion calculating section and providing the weighted value to the optical flow calculating section.
The feature point extracting section comprises a SUSAN edge extracting portion for extracting an edge image from the sequence of input images, and a local max portion for selecting the largest value within a mask-sized region of the edge image extracted by the SUSAN edge extracting portion, so as to extract a certain number of feature points. The feature points are extracted by the local max portion in accordance with:
S = \left\{ \sum e^{-\left( \frac{I(x, y) - I(x + dx,\, y + dy)}{T} \right)^{6}} \right\} / G
wherein I is a gray value, T is a threshold value for the difference between the gray values, and G is an edge strength difference between feature point locations of the previous and current images.
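A minimal Python sketch of this feature point extraction, implementing the formula above directly; the threshold T, the mask radius, the normalizer G, and the local-max mask size are illustrative values, not taken from the patent:

```python
import numpy as np

def susan_response(img: np.ndarray, x: int, y: int,
                   T: float = 27.0, radius: int = 3, G: float = 1.0) -> float:
    """S = { sum over (dx, dy) of exp(-((I(x, y) - I(x+dx, y+dy)) / T)^6) } / G,
    as in the formula above. Assumes (x, y) is at least `radius` pixels away
    from the image border."""
    center = float(img[y, x])
    total = 0.0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dx == 0 and dy == 0:
                continue
            neighbor = float(img[y + dy, x + dx])
            total += np.exp(-(((center - neighbor) / T) ** 6))
    return total / G

def local_max_points(response: np.ndarray, mask: int = 7):
    """Select the location of the largest response in each mask-sized region
    (a simple grid-based non-maximum suppression), which bounds the number
    of extracted feature points."""
    points = []
    h, w = response.shape
    for y0 in range(0, h - mask + 1, mask):
        for x0 in range(0, w - mask + 1, mask):
            window = response[y0:y0 + mask, x0:x0 + mask]
            dy, dx = np.unravel_index(np.argmax(window), window.shape)
            points.append((x0 + dx, y0 + dy))
    return points
```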
The optical flow calculating section comprises a combining subsection for calculating a connecting strength of an edge by use of the information relating to the distance difference between the current feature point location and the estimated feature point location in the previous image provided from the optical flow location estimating section, the weighted values of the correlation, location incorrectness, and edge strength matching provided from the feature point extracting section, and the edge strength difference between the feature points of the previous and current images provided from the feature point extracting section; and a matched feature point extracting subsection for extracting the feature points having the largest connecting strength by use of the connecting strength value provided from the combining subsection and providing the extracted feature points to the first and second camera motion calculating sections. The edge connecting strength E is calculated by the combining subsection in accordance with:
E = W_G G + W_{Cl} Cl + W_S S

wherein W_G is a weighted value of correlation matching, W_{Cl} is a weighted value of location error matching, W_S is a weighted value of edge strength matching, G is an edge strength difference between the feature point locations in the previous and current images provided from the feature point extracting section, Cl is a distance difference between the current feature point location and the estimated feature point location in the previous image provided from the optical flow location estimating section, and S is a SUSAN edge strength difference between the feature points of the previous and current images provided from the feature point extracting section.
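A minimal Python sketch of the combining and matched feature point extracting subsections; it transcribes the formula above directly and then picks the candidate with the largest connecting strength. How each term is normalized or signed so that a better match yields a larger E is not specified in this excerpt, and the dictionary field names are illustrative:

```python
import numpy as np

def connecting_strength(G: float, Cl: float, S: float,
                        W_G: float, W_Cl: float, W_S: float) -> float:
    """E = W_G*G + W_Cl*Cl + W_S*S, a direct transcription of the formula
    above; normalization of the individual terms is assumed to be handled
    by the caller."""
    return W_G * G + W_Cl * Cl + W_S * S

def best_match(candidates, weights):
    """Among candidate current-frame feature points for one previous-frame
    feature point, return the candidate with the largest connecting strength.
    Each candidate is assumed to carry its per-pair terms G, Cl, and S."""
    W_G, W_Cl, W_S = weights
    scores = [connecting_strength(c["G"], c["Cl"], c["S"], W_G, W_Cl, W_S)
              for c in candidates]
    return candidates[int(np.argmax(scores))]
```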
The first camera motion calculating section comprises a pseudo inverse matrix calculating subsection for calculating a constant of a camera projection formula by use of the optical flow provided from the optical flow calculating section; and a first camera motion calculating subsection for deriving the camera motion by use of the constant of the camera projection formula provided from the pseudo inverse matrix calculating subsection, and providing the result to the second camera motion calculating section.
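Since the camera projection formula itself is not reproduced in this excerpt, the following Python sketch stands in with a simple 2-D motion model (translation, zoom, and rotation about the image center) and shows the pseudo-inverse step: the model parameters are fitted to the optical flow in the least-squares sense with the Moore-Penrose pseudo-inverse. The model and parameter names are assumptions, not the patent's formula:

```python
import numpy as np

def fit_motion_params(prev_pts: np.ndarray, flow: np.ndarray) -> np.ndarray:
    """Fit an assumed motion model to the optical flow by pseudo-inverse.

    Model per feature point (x, y), with parameters p = [tx, ty, s, r]:
        u = tx + s * x - r * y
        v = ty + s * y + r * x
    Stacking two equations per point gives A @ p = b, solved as p = pinv(A) @ b.
    `prev_pts` is an (N, 2) array of feature locations, `flow` an (N, 2)
    array of flow vectors (u, v)."""
    x, y = prev_pts[:, 0].astype(float), prev_pts[:, 1].astype(float)
    u, v = flow[:, 0].astype(float), flow[:, 1].astype(float)
    ones, zeros = np.ones_like(x), np.zeros_like(x)
    A = np.vstack([np.column_stack([ones, zeros, x, -y]),
                   np.column_stack([zeros, ones, y, x])])
    b = np.concatenate([u, v])
    return np.linalg.pinv(A) @ b  # [tx, ty, s, r]
```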
The second camera motion calculating section compri
Cho Seong Ik
Chon Jae Choon
Kim Kyung Ok
Lim Young Jae
Yang Young Kyu
Andrea Brian
Electronics and Telecommunications Research Institute
Jacobson & Holman PLLC
Tarcza Thomas H.