Classification: Image analysis – Applications – Target tracking or detecting
Type: Reexamination Certificate
Date: 2008-04-08
Examiner: Johns, Andrew W. (Department: 2624)
U.S. Classes: C382S294000, C345S520000
Status: active
Patent number: 07356164
ABSTRACT:
Techniques for computing a globally consistent set of image feature correspondences across a wide range of viewpoints, suitable for interactive walkthroughs and visualizations. The inventive approach takes advantage of the redundancy inherent in a dense set of images captured in a plane (or in higher dimensions, e.g., images captured in a volume or over time). The technique detects features in a set of source images and tracks the features to neighboring images. When features track to the same position in the same image, they are flagged as potential correspondences. Among the potential correspondences, the technique selects a maximal set using a greedy graph-labeling algorithm (e.g., in best-first order); only correspondences that produce a globally consistent labeling are selected. After this globalization is done, a set of features common to a group of images can be quickly found and used to warp and combine the images, producing an interpolated novel view of the environment.
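The greedy best-first selection described in the abstract can be sketched as a union-find merge over candidate matches, accepting a correspondence only when the merged group stays globally consistent (at most one feature per image). This is an illustrative sketch of that idea, not the patented implementation; the function name, data layout, and scoring scheme are all assumptions.

```python
# Sketch of greedy best-first correspondence labeling: candidate
# matches are processed from highest score to lowest, and a match is
# accepted only if merging its two feature groups does not place two
# different features from the same image under one label.
import heapq


def label_correspondences(features, candidates):
    """
    features:   iterable of (image_id, feature_id) nodes.
    candidates: iterable of (score, node_a, node_b) potential
                correspondences; higher score = better match.
    Returns a dict mapping each node to a group label; nodes that
    share a label are treated as the same physical feature.
    """
    parent = {f: f for f in features}       # union-find forest
    members = {f: {f} for f in features}    # nodes in each group
    images = {f: {f[0]} for f in features}  # images covered by each group

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    # Best-first order: largest score is popped first (negated for heapq).
    heap = [(-score, a, b) for score, a, b in candidates]
    heapq.heapify(heap)

    while heap:
        _, a, b = heapq.heappop(heap)
        ra, rb = find(a), find(b)
        if ra == rb:
            continue
        # Global consistency check: a merged group may contain at most
        # one feature per image, otherwise the labeling contradicts itself.
        if images[ra] & images[rb]:
            continue
        # Merge the smaller group into the larger one.
        if len(members[ra]) < len(members[rb]):
            ra, rb = rb, ra
        parent[rb] = ra
        members[ra] |= members.pop(rb)
        images[ra] |= images.pop(rb)

    return {f: find(f) for f in features}
```

For example, if features `a`, `b`, `c` (from images 0, 1, 2) track to one another, they end up under one label, while a lower-scoring candidate linking `a` to a second feature `d` in image 1 is rejected, because image 1 is already represented in the group.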
REFERENCES:
patent: 5818959 (1998-10-01), Webb et al.
patent: 6078701 (2000-06-01), Hsu et al.
patent: 6084592 (2000-07-01), Shum et al.
patent: 6097394 (2000-08-01), Levoy et al.
patent: 6118474 (2000-09-01), Nayar
patent: 6192156 (2001-02-01), Moorby
patent: 6400830 (2002-06-01), Christian et al.
patent: 6633317 (2003-10-01), Li et al.
patent: 6724915 (2004-04-01), Toklu et al.
patent: 6795090 (2004-09-01), Cahill et al.
patent: 6975755 (2005-12-01), Baumberg
patent: 7146022 (2006-12-01), Masukura et al.
patent: 2002/0164067 (2002-11-01), Askey et al.
patent: 2002/0172413 (2002-11-01), Chen
patent: 1063614 (2000-06-01), None
Lhuillier et al., “Image Interpolation by Joint View Triangulation,” IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), vol. 2, Jun. 23-25, 1999.
Shi et al., “Good Features to Track,” Technical Report TR93-1399, 1993.
Kang, “A Survey of Image-based Rendering Techniques,” http://www.hpl.hp.com/techreports/Compaq-DEC/CRL-97-4.pdf, 1997.
Chen, “QuickTime VR: An Image-Based Approach,” Proceedings of the 22nd Annual Conference on Computer Graphics and Interactive Techniques, pp. 29-38, 1995.
Debevec, “Modeling and Rendering Architecture from Photographs: A Hybrid Geometry- and Image-Based Approach,” Proceedings of the 23rd Annual Conference on Computer Graphics and Interactive Techniques, pp. 11-20, 1996.
U.S. Appl. No. 10/452,314, filed May 30, 2003, “Method and Apparatus for Compressing and Decompressing Images Captured from Viewpoints Throughout N-Dimensional Space”.
U.S. Appl. No. 10/452,020, filed May 30, 2003, “Method and System for Creating Interactive Walkthroughs of Real-World Environment from Set of Densely Captured Images”.
U.S. Appl. No. 10/449,929, filed May 30, 2003, “Method and Apparatus for Computing Error-Bounded Position and Orientation of Panoramic Cameras in Real-World Environments”.
U.S. Appl. No. 10/156,189, filed Jun. 29, 2002, “Camera Model and Calibration Procedure for Omnidirectional Paraboloidal Catadioptric Cameras”.
U.S. Appl. No. 10/122,337, filed Apr. 16, 2002, “Method and System for Reconstructing 3D Interactive Walkthroughs of Real-World Environments”.
Shree K. Nayar, “Catadioptric Omnidirectional Camera,” IEEE Computer Vision and Pattern Recognition (CVPR '97), 7 pages, 1997.
Henry Fuchs, “On Visible Surface Generation by a Priori Tree Structures,” Computer Graphics, Proceedings of SIGGRAPH '80, pp. 124-133, 1980.
B. Triggs et al., “Bundle Adjustment—A Modern Synthesis,” Vision Algorithms: Theory and Practice, Springer-Verlag, pp. 1-71, 2000.
J. Shi et al., “Good Features to Track,” Proceedings of IEEE on Computer Vision and Pattern Recognition (CVPR94), Seattle, 8 pages, Jun. 1994.
E.H. Adelson et al., “The Plenoptic Function and the Elements of Early Vision,” Computational Models of Visual Processing, MIT Press, pp. 1-20, 1991.
M. Levoy et al., “Light Field Rendering,” Proceedings of ACM SIGGRAPH 96, pp. 1-12, 1996.
S. Teller et al., “Calibrated Registered Images of an Extended Urban Area,” IEEE Computer Vision and Pattern Recognition (CVPR), pp. 1-20, 2001.
R. Koch et al., “Calibration of Hand-held Camera Sequences for Plenoptic Modeling,” IEEE International Conference on Computer Vision (ICCV), 7 pages, 1999.
Camillo J. Taylor, “VideoPlus,” IEEE Workshop on Omnidirectional Vision, 8 pages, Jun. 2000.
L.S. Nyland et al., “Capturing, Processing and Rendering Real-World Scenes,” Videometrics and Optical Methods for 3D Shape Measurement Techniques, Electronic Imaging Photonics West, vol. 2309, 9 pages, 2001.
Aliaga Daniel G.
Carlbom Ingrid Birgitta
Funkhouser Thomas A.
Yanovsky Dimah V.
Allison Andrae
Johns Andrew W.
Lucent Technologies Inc.
Method and apparatus for finding feature correspondences...