Apparatus and method for determining stereo disparity based on two-path dynamic programming and GGCP

Image analysis – Applications – 3-D or stereo imaging analysis


Details

U.S. Classification: 345/419; 345/420; 345/421; 345/422; 345/423; 345/424; 345/426; 345/427; 356/12; 356/13; 356/14; 356/15; 356/16; 356/17; 356/18; 356/19; 356/20; 356/21; 356/22

Type: Reexamination Certificate
Status: active
Patent number: 07570804

ABSTRACT:
Provided are an apparatus and a method for determining stereo disparity based on two-path dynamic programming and generalized ground control points (GGCP). The apparatus includes: a pre-processing unit for analyzing the texture distribution of an input image with a Laplacian of Gaussian (LoG) filter and dividing the image into a homogeneous region and a non-homogeneous region; a local matching unit for determining the candidate disparities of each pixel; a local post-processing unit for improving the reliability of the candidate disparities by performing a visibility test between the candidate disparities of each pixel and removing candidates from pixels of low reliability; and a global optimizing unit for determining a final disparity among the candidate disparities of each pixel by dynamic programming.
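
The abstract describes a four-stage pipeline: LoG-based texture classification, local candidate matching, a visibility test, and dynamic-programming optimization. The Python sketch below illustrates only the first and last of those stages under stated assumptions; it is not the patented method. The filter scale (sigma), texture threshold, absolute-difference data cost, and linear smoothness penalty are hypothetical choices made for readability, and the visibility test and two-path/GGCP machinery are omitted entirely.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

def classify_texture(image, sigma=1.0, threshold=4.0):
    """Mark pixels whose |LoG| response exceeds a threshold as
    non-homogeneous (textured); the rest count as homogeneous."""
    response = gaussian_laplace(image.astype(np.float64), sigma=sigma)
    return np.abs(response) > threshold

def scanline_disparity(left_row, right_row, max_disp=16, smooth=2.0):
    """Assign one disparity per pixel along a scanline by dynamic
    programming: absolute-difference data cost plus a penalty that
    grows linearly with disparity jumps between neighboring pixels."""
    n, D = len(left_row), max_disp + 1
    # Data cost: |I_left(x) - I_right(x - d)|; disparities that would
    # index outside the right image get infinite cost.
    data = np.full((n, D), np.inf)
    for d in range(D):
        data[d:, d] = np.abs(left_row[d:].astype(float)
                             - right_row[:n - d].astype(float))
    penalty = smooth * np.abs(np.arange(D)[:, None] - np.arange(D)[None, :])
    cost = data.copy()
    back = np.zeros((n, D), dtype=np.intp)
    for x in range(1, n):
        total = cost[x - 1][:, None] + penalty   # total[prev_d, cur_d]
        back[x] = np.argmin(total, axis=0)       # best predecessor per d
        cost[x] += total[back[x], np.arange(D)]  # accumulate minimum cost
    # Backtrack from the cheapest final state to recover the disparities.
    disp = np.empty(n, dtype=int)
    disp[-1] = int(np.argmin(cost[-1]))
    for x in range(n - 2, -1, -1):
        disp[x] = back[x + 1, disp[x + 1]]
    return disp

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    right = rng.integers(0, 256, size=80).astype(np.float64)
    left = np.empty_like(right)
    left[5:] = right[:-5]   # synthetic scanline shifted by 5 pixels
    left[:5] = right[:5]    # arbitrary border fill
    print(scanline_disparity(left, right, max_disp=16))
    img = rng.random((32, 32)) * 255.0
    print("textured fraction:", classify_texture(img).mean())
```

On the synthetically shifted scanline the example should recover a disparity of 5 for most pixels; the smoothness term trades border accuracy for a piecewise-constant result, which is the usual behavior of scanline dynamic programming.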

REFERENCES:
patent: 4197583 (1980-04-01), Westell et al.
patent: 5617459 (1997-04-01), Makram-Ebeid et al.
patent: 5691773 (1997-11-01), Wang et al.
patent: 5917936 (1999-06-01), Katto
patent: 6009190 (1999-12-01), Szeliski et al.
patent: 6076010 (2000-06-01), Boas et al.
patent: 6457032 (2002-09-01), Silver
patent: 6571024 (2003-05-01), Sawhney et al.
patent: 6591004 (2003-07-01), VanEssen et al.
patent: 6701005 (2004-03-01), Nichani
patent: 6744923 (2004-06-01), Zabih et al.
patent: 6954544 (2005-10-01), Jepson et al.
patent: 6999620 (2006-02-01), Harville
patent: 7004904 (2006-02-01), Chalana et al.
patent: 7085409 (2006-08-01), Sawhney et al.
patent: 7292735 (2007-11-01), Blake et al.
patent: 7330584 (2008-02-01), Weiguo et al.
patent: 7365731 (2008-04-01), Chiu et al.
patent: 2002/0025075 (2002-02-01), Jeong et al.
patent: 2002/0097459 (2002-07-01), Hart
patent: 2004/0264763 (2004-12-01), Mas et al.
patent: 2005/0089199 (2005-04-01), Marschner et al.
patent: 2006/0056029 (2006-03-01), Ye
patent: 2006/0061583 (2006-03-01), Spooner et al.
patent: 2006/0104542 (2006-05-01), Blake et al.
patent: 2006/0120594 (2006-06-01), Kim et al.
D. Scharstein and R. Szeliski. A Taxonomy and evaluation of dense two-frame stereo correspondence algorithms. Technical Report MSR-TR-2001-81, Microsoft Research, 2001.
M.F. Tappen and W.T. Freeman, "Comparison of graph cuts with belief propagation for stereo, using identical MRF parameters", In Proc. Ninth IEEE International Conference on Computer Vision, Oct. 13-16, 2003, vol. 2, pp. 900-906.
R. Szeliski and D. Scharstein, "Sampling the disparity space image", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 25, no. 3, Mar. 2004.
Y. Boykov, O. Veksler and R. Zabih, “Fast approximate energy minimization via graph cuts”, IEEE Trans. On Pattern Analysis and Machine Intelligence, vol. 23, No. 11, pp. 1222-1239, 2001.
P. Fua, “A parallel stereo algorithm that produces dense depth maps and preserves image features”, Machine Vision and Applications, pp. 35-49, 1993.
M.L. Gong and Y.H. Yang, “Fast stereo matching using reliability-based dynamic programming and consistency constraints”, In Proc. Int. Conf. on Computer Vision, pp. 610-617, 2003.
S.B. Kang and R. Szeliski, "Extracting view-dependent depth maps from a collection of images", International Journal of Computer Vision, 58(2), pp. 139-163, Jul. 2004.
M. Kass, "Computing visual correspondence", In Proc. Image Understanding Workshop, pp. 54-60, Jun. 1983.
M. Okutomi, Y. Katayama and S. Oka, “A simple stereo algorithm to recover precise object boundaries and smooth surfaces”, International Journal of Computer Vision, 47(1/2/3), pp. 261-273, 2002.
Y. Wei and L. Quan, “Region-based progressive stereo matching”, In Proc. Conf. on Computer Vision and Pattern Recognition, vol. 1, pp. 106-113, 2004.
T.E. Zickler, J. Ho, D.J. Kriegman, J. Ponce, and P.N. Belhumeur, “Binocular Helmholtz stereopsis”, In Proc. Int. Conf. on Computer Vision, vol. 2, pp. 1411-1417, 2003.
Jae Chul Kim et al., "A dense stereo matching using two-pass dynamic programming with generalized ground control points", In Proc. 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05), vol. 2, pp. 1075-1082.
Edgar Arce et al., "High-precision stereo disparity estimation using HMMF models", Image and Vision Computing, vol. 25, issue 5, pp. 623-636, May 2007.
Y. Ohta and T. Kanade, "Stereo by intra- and inter-scanline search using dynamic programming", IEEE Trans. Pattern Analysis and Machine Intelligence, vol. PAMI-7, no. 2, pp. 139-154, Mar. 1985.
D.G. Jones et al., "A computational framework for determining stereo correspondence from a set of linear spatial filters", In Proc. European Conference on Computer Vision, pp. 395-410, 1992.
S.S. Intille et al., "Large occlusion stereo", In Vismod, 1999.
