System and method for estimating the epipolar geometry...

Image analysis – Applications – 3-d or stereo imaging analysis


Details

US classification: C382S285000, C382S294000, C345S427000, C345S473000, C348S042000
Type: Reexamination Certificate
Status: active
Patent number: 06771810

ABSTRACT:

BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates in general to object modeling, and in particular to a system and method for estimating the epipolar geometry between images of an object, for use with image processing systems that require the epipolar geometry in order to model the object.
2. Related Art
Many stereo and computer vision applications that model three-dimensional (3D) objects base the modeling on several two-dimensional (2D) images and a fundamental matrix that embodies the epipolar geometry between those images. Estimating the fundamental matrix accurately is therefore a crucial step in object modeling.
For example, these systems typically assume that at least a pair of 2D images of the 3D object or environment are taken from two distinct viewpoints. In operation, such systems usually first obtain the two images and then estimate a fundamental matrix between them. In other words, the epipolar geometry of the two images is determined first, and corresponding points between the two images must satisfy an epipolar constraint before additional modeling steps are performed.
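For reference, the epipolar constraint mentioned above is the standard bilinear relation between corresponding homogeneous image points x (first image) and x′ (second image) and the 3×3, rank-2 fundamental matrix F. This is textbook material rather than language taken from the patent:

```latex
% Standard epipolar constraint (textbook form, not quoted from the patent):
% x and x' are corresponding points in homogeneous coordinates,
% F is the 3x3 fundamental matrix of rank 2.
\[
  \mathbf{x}'^{\top} \mathbf{F}\,\mathbf{x} = 0, \qquad \operatorname{rank}(\mathbf{F}) = 2 .
\]
```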
Current systems that estimate the fundamental matrix typically require consideration of 36 distinct parameterizations. These 36 parameterizations are needed because an epipole may be at infinity and an element of the epipolar transformation may be equal to zero, so 36 maps are required to parameterize the fundamental matrix consistently. As a result, these systems are time consuming and lead to cumbersome implementations of the optimization procedures.
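To make the count above concrete, the following breakdown is inferred from the parameterization commonly used in the prior art; it is consistent with the reasoning in the preceding paragraph but is not spelled out in the patent text. Each of the two epipoles requires a choice of which of its three homogeneous coordinates is normalized to one (to cover epipoles at infinity), and the 2×2 epipolar transformation requires a choice of which of its four elements is normalized (to cover zero elements):

```latex
% Inferred breakdown of the 36 parameterization maps (an assumption, not quoted from the patent):
\[
  \underbrace{3}_{\text{epipole 1}} \times
  \underbrace{3}_{\text{epipole 2}} \times
  \underbrace{4}_{\text{epipolar transformation}} = 36 \ \text{maps}.
\]
```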
Therefore, what is needed is a system and method for estimating the fundamental matrix without using 36 maps to parameterize it. What is also needed is a system and method for transforming the image points into projective space, so that nonlinear optimization can be performed with a single parameterization of the fundamental matrix. What is further needed is a system and method for estimating the fundamental matrix that preserves the characteristics of the data noise model of the original image space.
SUMMARY OF THE INVENTION
To overcome the limitations in the prior art described above, and to overcome other limitations that will become apparent upon reading and understanding the present specification, the present invention is embodied in a system and method for estimating epipolar geometry, in terms of a fundamental matrix, between multiple images for stereo vision processing. The fundamental matrix embodies the epipolar geometry between the images.
In general, the present invention includes a method for estimating the epipolar geometry between multiple images of an original space, given an initial estimate of the fundamental matrix found using a standard linear estimation technique. Namely, the system and method of the present invention estimate the fundamental matrix by transforming the image points of the multiple images into projective space. After this transformation is performed, nonlinear optimization is carried out with a single parameterization of the fundamental matrix, rather than with the 36 distinct parameterizations considered in previous methods. The image points and the estimated fundamental matrix are then inverse transformed back to the original space, yielding the final estimate of the fundamental matrix.
Specifically, one method of the present invention involves first identifying the element of the fundamental matrix with the largest absolute value. Second, two appropriate permutation matrices (projective transformations) of the image points are constructed such that this largest element is moved to location (0,0) of the matrix. Third, the coordinates of each image point are permuted accordingly. Fourth, the fundamental matrix in the permuted space is estimated using a non-linear estimation technique. Last, an inverse permutation is applied to the estimated fundamental matrix to recover the fundamental matrix in the original space. In addition, the present invention includes a method for preserving the characteristics of a data noise model from the original image space.
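The permutation steps described above can be sketched in a few lines of Python. This is a minimal illustration only, assuming homogeneous point coordinates, the convention x′ᵀ F x = 0, and an (N, 3) point layout; the function names and the use of simple transpositions as the permutation matrices are assumptions of this sketch rather than details taken from the patent, and the nonlinear refinement step itself is omitted:

```python
import numpy as np

def swap_with_first(index):
    """3x3 permutation (transposition) that swaps coordinate `index` with coordinate 0."""
    P = np.eye(3)
    P[[0, index]] = P[[index, 0]]
    return P

def permute_for_estimation(F0, pts1, pts2):
    """Permute homogeneous image points so the largest-magnitude entry of the
    initial estimate F0 lands at position (0, 0) of the permuted matrix.

    pts1, pts2 : (N, 3) arrays of homogeneous points with pts2[i] @ F0 @ pts1[i] ~ 0.
    Returns the permuted matrix, permuted points, and the permutations used.
    """
    i0, j0 = np.unravel_index(np.argmax(np.abs(F0)), F0.shape)
    P = swap_with_first(i0)          # permutes coordinates of second-image points
    Q = swap_with_first(j0)          # permutes coordinates of first-image points
    F_perm = P @ F0 @ Q.T            # largest entry of F0 is now F_perm[0, 0]
    return F_perm, pts1 @ Q.T, pts2 @ P.T, P, Q

def undo_permutation(F_refined, P, Q):
    """Map the nonlinearly refined matrix back to the original image coordinates."""
    return P.T @ F_refined @ Q

# Usage sketch (F0 from a standard linear method such as the normalized 8-point algorithm):
# F_perm, pts1_p, pts2_p, P, Q = permute_for_estimation(F0, pts1, pts2)
# ... refine F_perm with a nonlinear method using the single parameterization ...
# F_final = undo_permutation(F_perm_refined, P, Q)
```

Because each permutation here is a transposition and hence orthogonal, applying its transpose restores the original coordinates exactly, and the value of x′ᵀ F x is unchanged for every point pair.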
The present invention as well as a more complete understanding thereof will be made apparent from a study of the following detailed description of the invention in connection with the accompanying drawings and appended claims.


REFERENCES:
patent: 5821943 (1998-10-01), Shashua
patent: 6516099 (2003-02-01), Davison et al.
patent: 6614429 (2003-09-01), Zhang et al.
Adelson, E.H. Perceptual organization and the judgment of brightness. Science, 262:2042-2044, Dec. 24, 1993.
Adelson, E.H. and P. Anandan. Ordinal characteristics of transparency. In AAAI-90 Work. Qualitative Vision, pp. 77-81, 1990.
Baker, S., R. Szeliski, and P. Anandan. A layered approach to stereo reconstruction. In CVPR '98, pp. 434-441, Jun. 1998.
Bergen, J.R., P. Anandan, K.J. Hanna, and R. Hingorani. Hierarchical model-based motion estimation. In ECCV'92, pp. 237-252, Italy, May 1992.
Bergen, J.R., P.J. Burt, R. Hingorani, and S. Peleg. A three-frame algorithm for estimating two-component image motion. IEEE Trans. Patt. Anal. Mach. Intel., 14(9):886-896, Sep. 1992.
Black, M.J., and A. Rangarajan. On the unification of line processes, outlier rejection, and robust statistics with applications in early vision. Intl. J. Comp. Vision, 19(1):57-91, 1996.
Blinn, J.F. Jim Blinn's corner: Compositing, part 1: Theory. IEEE Computer Graphics and Applications, 14(5):83-87, Sep. 1994.
Darrell, T., and E. Simoncelli. “Nulling” filters and the separation of transparent motion. In CVPR '93, pp. 738-739, 1993.
Debevec, P.E., and J. Malik. Recovering high dynamic range radiance maps from photographs. SIGGRAPH '97, pp. 359-378, Aug. 1997.
Hsu, P.R., P. Anandan, and S. Peleg. Accurate computation of optical flow by using layered motion representation. In ICPR '94, pp. 743-746, Oct. 1994.
Irani, M., B. Rousso, and S. Peleg. Computing occluding and transparent motion. Int. J. Comp. Vis., 12(1):5-16, Jan. 1994.
Ju, S.X., M.J. Black, and A.D. Jepson. Skin and bones: Multi-layer, locally affine, optical flow and regularization with transparency. In CVPR'96, pp. 307-314, Jun. 1996.
Nayar, S.K., S.X. Fang, and T. Boult. Separation of reflection components using color and polarization. Int. J. Comp. Vis., 21:163-186, 1997.
Sawhney, H.S., and S. Ayer. Compact representations for videos through dominant and multiple motion estimation. PAMI, 18(8):814-830, 1996.
Shizawa, M. and K. Mase. A unified computational theory of motion transparency and motion boundaries based on eigenenergy analysis. In CVPR'91, pp. 289-295, Jun. 1991.
Shum, H.-Y., and R. Szeliski. Construction and refinement of panoramic mosaics with global and local alignment. In Sixth International Conference on Computer Vision (ICCV98), pp. 953-958, Bombay, Jan. 1998.
Wang, J.Y.A., and E.H. Adelson. Representing moving images with layers. IEEE Trans. Im. Proc., 3(5):625-638, Sep. 1994.
Weiss, Y. Smoothness in layers: Motion segmentation using nonparametric mixture estimation. In CVPR'97, pp. 520-526, San Juan, Puerto Rico, Jun. 1997.
